CADDIS Volume 4. Data Analysis: Biological and Environmental Data Requirements
Overview of the PECBO module: using scripts to infer environmental conditions from biological observations, statistically estimating species-environment relationships, and applying the module's statistical scripts and methods for inferring environmental conditions.
Conditional statistics in a turbulent premixed flame derived from direct numerical simulation
NASA Technical Reports Server (NTRS)
Mantel, Thierry; Bilger, Robert W.
1994-01-01
The objective of this paper is to briefly introduce conditional moment closure (CMC) methods for premixed systems and to derive the transport equation for the conditional species mass fraction, conditioned on a progress variable based on the enthalpy. Our statistical analysis is based on the 3-D DNS database of Trouve and Poinsot available at the Center for Turbulence Research. The initial conditions and characteristics (turbulence, thermo-diffusive properties) as well as the numerical method utilized in the DNS of Trouve and Poinsot are presented, and some details concerning our statistical analysis are also given. From the analysis of the DNS results, the effects of position in the flame brush and of the Damkoehler and Lewis numbers on the conditional mean scalar dissipation and conditional mean velocity are presented and discussed. Information concerning unconditional turbulent fluxes is also presented. The anomaly of counter-gradient diffusion for the turbulent flux of the progress variable, found in previous studies, is investigated.
CADDIS Volume 4. Data Analysis: PECBO Appendix - R Scripts for Non-Parametric Regressions
Script for computing nonparametric regressions. Overview of using scripts to infer environmental conditions from biological observations and statistically estimate species-environment relationships.
EVALUATION OF A NEW MEAN SCALED AND MOMENT ADJUSTED TEST STATISTIC FOR SEM.
Tong, Xiaoxiao; Bentler, Peter M
2013-01-01
Recently a new mean scaled and skewness adjusted test statistic was developed for evaluating structural equation models in small samples and with potentially nonnormal data, but this statistic has received only limited evaluation. The performance of this statistic is compared to normal theory maximum likelihood and two well-known robust test statistics. A modification to the Satorra-Bentler scaled statistic is developed for the condition that sample size is smaller than degrees of freedom. The behavior of the four test statistics is evaluated with a Monte Carlo confirmatory factor analysis study that varies seven sample sizes and three distributional conditions obtained using Headrick's fifth-order transformation to nonnormality. The new statistic performs badly in most conditions except under the normal distribution. The goodness-of-fit χ² test based on maximum-likelihood estimation performed well under normal distributions as well as under a condition of asymptotic robustness. The Satorra-Bentler scaled test statistic performed best overall, while the mean scaled and variance adjusted test statistic outperformed the others at small and moderate sample sizes under certain distributional conditions.
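For readers who want to reproduce this style of comparison, here is a minimal R sketch assuming the lavaan package and its bundled HolzingerSwineford1939 dataset (neither is from the study above); estimator = "MLM" requests the Satorra-Bentler scaled statistic alongside the normal-theory ML chi-square.

```r
# A minimal sketch, assuming the lavaan package: contrast the normal-theory
# ML chi-square with the Satorra-Bentler scaled statistic for the same CFA.
library(lavaan)

model <- '
  visual  =~ x1 + x2 + x3
  textual =~ x4 + x5 + x6
'

fit_ml  <- cfa(model, data = HolzingerSwineford1939, estimator = "ML")
fit_mlm <- cfa(model, data = HolzingerSwineford1939, estimator = "MLM")

fitMeasures(fit_ml,  c("chisq", "df", "pvalue"))
fitMeasures(fit_mlm, c("chisq.scaled", "df.scaled", "pvalue.scaled"))
```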
Advanced microwave soil moisture studies [Big Sioux River Basin, Iowa]
NASA Technical Reports Server (NTRS)
Dalsted, K. J.; Harlan, J. C.
1983-01-01
Comparisons of low-level L-band brightness temperature (TB) and thermal infrared (TIR) data, as well as the following data sets: soil map and land cover data; direct soil moisture measurement; and a computer-generated contour map, were statistically evaluated using regression analysis and linear discriminant analysis. Regression analysis of footprint data shows that statistical groupings of ground variables (soil features and land cover) hold promise for qualitative assessment of soil moisture and for reducing variance within the sampling space. Dry conditions appear to be more conducive to producing meaningful statistics than wet conditions. Regression analysis using field-averaged TB and TIR data did not approach the higher R² values obtained using within-field variations. The linear discriminant analysis indicates some capacity to distinguish categories, with the results being somewhat better on a field basis than a footprint basis.
10 CFR 431.173 - Requirements applicable to all manufacturers.
Code of Federal Regulations, 2011 CFR
2011-01-01
... COMMERCIAL AND INDUSTRIAL EQUIPMENT Provisions for Commercial Heating, Ventilating, Air-Conditioning and... is based on engineering or statistical analysis, computer simulation or modeling, or other analytic... method or methods used; (B) The mathematical model, the engineering or statistical analysis, computer...
A study of the feasibility of statistical analysis of airport performance simulation
NASA Technical Reports Server (NTRS)
Myers, R. H.
1982-01-01
The feasibility of conducting a statistical analysis of simulation experiments to study airport capacity is investigated. First, the form of the distribution of airport capacity is studied. Since the distribution is non-Gaussian, it is important to determine the effect of this distribution on standard analysis-of-variance techniques and power calculations. Next, power computations are made in order to determine how economical simulation experiments would be if they were designed to detect capacity changes from condition to condition. Many of the conclusions drawn are results of Monte Carlo techniques.
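A hedged sketch of this kind of Monte Carlo power calculation in R, with invented gamma-distributed (skewed, non-Gaussian) capacities and arbitrary effect sizes:

```r
# Capacity is drawn from a skewed distribution; we estimate how often a
# two-sample t-test detects a shift between two conditions. The gamma
# parameters and shift sizes are arbitrary assumptions.
set.seed(42)

power_mc <- function(shift, n = 30, reps = 2000) {
  rejections <- replicate(reps, {
    cond_a <- rgamma(n, shape = 4, rate = 0.1)          # baseline capacity
    cond_b <- rgamma(n, shape = 4, rate = 0.1) + shift  # shifted condition
    t.test(cond_a, cond_b)$p.value < 0.05
  })
  mean(rejections)  # estimated power
}

sapply(c(2, 5, 10), power_mc)
```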
Conditional Probability Analysis: A Statistical Tool for Environmental Analysis.
The use and application of environmental conditional probability analysis (CPA) is relatively recent. The first presentation using CPA was made in 2002 at the New England Association of Environmental Biologists Annual Meeting in Newport, Rhode Island. CPA has been used since the...
Biotic indices have been used to assess biological condition by dividing index scores into condition categories. Historically the number of categories has been based on professional judgement. Alternatively, statistical methods such as power analysis can be used to determine the ...
Almalik, Osama; Nijhuis, Michiel B; van den Heuvel, Edwin R
2014-01-01
Shelf-life estimation usually requires that at least three registration batches are tested for stability at multiple storage conditions. The shelf-life estimates are often obtained by linear regression analysis per storage condition, an approach implicitly suggested by ICH guideline Q1E. A linear regression analysis combining all data from multiple storage conditions was recently proposed in the literature when variances are homogeneous across storage conditions. The combined analysis is expected to perform better than the separate analysis per storage condition, since pooling data would lead to an improved estimate of the variation and higher numbers of degrees of freedom, but this is not evident for shelf-life estimation. Indeed, the two approaches treat the observed initial batch results, the intercepts in the model, and poolability of batches differently, which may eliminate or reduce the expected advantage of the combined approach with respect to the separate approach. Therefore, a simulation study was performed to compare the distribution of simulated shelf-life estimates on several characteristics between the two approaches and to quantify the difference in shelf-life estimates. In general, the combined statistical analysis does estimate the true shelf life more consistently and precisely than the analysis per storage condition, but it does not outperform the separate analysis in all circumstances.
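As an illustration of the regression-based approach suggested by ICH Q1E, here is a hedged R sketch with invented stability data: the shelf life is taken as the time at which the one-sided 95% lower confidence bound of the regression crosses the specification limit.

```r
# Hedged sketch of a shelf-life estimate in the spirit of ICH Q1E, with
# invented stability data: regress assay on time and find where the
# one-sided 95% lower confidence bound (two-sided 90% interval) crosses
# the specification limit. Batch poolability testing is omitted.
set.seed(2)
months <- rep(c(0, 3, 6, 9, 12, 18), times = 3)      # three batches, pooled
assay  <- 100 - 0.25 * months + rnorm(18, sd = 0.5)  # % of label claim
spec   <- 95

fit  <- lm(assay ~ months)
grid <- data.frame(months = seq(0, 60, by = 0.1))
lcl  <- predict(fit, grid, interval = "confidence", level = 0.90)[, "lwr"]
shelf_life <- grid$months[min(which(lcl < spec))]
shelf_life  # estimated shelf life in months
```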
Developing Sampling Frame for Case Study: Challenges and Conditions
ERIC Educational Resources Information Center
Ishak, Noriah Mohd; Abu Bakar, Abu Yazid
2014-01-01
Because quantitative studies rely on inferential statistical analysis, the issue of random sampling is pertinent to any quantitative study. By eliminating inferential statistical analysis, qualitative researchers can be more creative in dealing with sampling issues. Since results from a qualitative study cannot be generalized to the bigger population,…
NASA Technical Reports Server (NTRS)
Silsby, Norman S
1955-01-01
Statistical measurements of contact conditions were obtained, by means of a special photographic technique, for 478 landings of present-day transport airplanes made during routine daylight operations in clear air at the Washington National Airport. From the measurements, sinking speeds, rolling velocities, bank angles, and horizontal speeds at the instant before contact have been evaluated, and a limited statistical analysis of the results is reported.
Analysis of Multiple Contingency Tables by Exact Conditional Tests for Zero Partial Association.
ERIC Educational Resources Information Center
Kreiner, Svend
The tests for zero partial association in a multiple contingency table have gained new importance with the introduction of graphical models. It is shown how these may be performed as exact conditional tests, using as test criteria either the ordinary likelihood ratio, the standard χ² statistic, or any other appropriate statistics. A…
Lamart, Stephanie; Griffiths, Nina M; Tchitchek, Nicolas; Angulo, Jaime F; Van der Meeren, Anne
2017-03-01
The aim of this work was to develop a computational tool that integrates several statistical analysis features for biodistribution data from internal contamination experiments. These data represent actinide levels in biological compartments as a function of time and are derived from activity measurements in tissues and excreta. These experiments aim at assessing the influence of different contamination conditions (e.g. intake route or radioelement) on the biological behavior of the contaminant. The ever-increasing number of datasets and diversity of experimental conditions make the handling and analysis of biodistribution data difficult. This work sought to facilitate the statistical analysis of a large number of datasets and the comparison of results from diverse experimental conditions. Functional modules were developed using the open-source programming language R to facilitate specific operations: descriptive statistics, visual comparison, curve fitting, and implementation of biokinetic models. In addition, the structure of the datasets was harmonized using the same table format. Analysis outputs can be written to text files and updated data can be written in the consistent table format. Hence, a data repository is built progressively, which is essential for the optimal use of animal data. Graphical representations can be automatically generated and saved as image files. The resulting computational tool was applied using data derived from wound contamination experiments conducted under different conditions. In facilitating biodistribution data handling and statistical analyses, this computational tool ensures faster analyses and better reproducibility compared with the use of multiple office software applications. Furthermore, re-analysis of archival data and comparison of data from different sources is made much easier. Hence this tool will help to better understand the influence of contamination characteristics on actinide biokinetics. Our approach can aid the optimization of treatment protocols and therefore contribute to the improvement of the medical response after internal contamination with actinides.
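The tool itself is not published in the text above, so the following R sketch only illustrates the kind of descriptive-statistics module described; the function name, column names, and data are all invented.

```r
# A minimal sketch (invented function and column names) of the kind of
# module described above: descriptive statistics per compartment and
# time point from a harmonized biodistribution table.
describe_biodistribution <- function(df) {
  aggregate(activity ~ compartment + time,
            data = df,
            FUN  = function(x) c(n = length(x), mean = mean(x), sd = sd(x)))
}

# Example with an invented dataset in the harmonized format:
set.seed(6)
df <- data.frame(
  compartment = rep(c("liver", "bone"), each = 6),
  time        = rep(c(1, 7, 28), times = 4),
  activity    = runif(12, 0, 100)
)
describe_biodistribution(df)
```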
Group Influences on Young Adult Warfighters’ Risk Taking
2016-12-01
Statistical Analysis: Latent linear growth models were fitted using the maximum likelihood estimation method in Mplus (version 7.0; Muthen & Muthen)…condition had a higher net score than those in the alone condition (b = 20.53, SE = 6.29, p < .001). Results of the relevant statistical analyses are…[model fit statistics (BIC and chi-square by model) appear in a table not reproduced here]
Dadaser-Celik, Filiz; Azgin, Sukru Taner; Yildiz, Yalcin Sevki
2016-12-01
Biogas production from food waste has been used as an efficient waste treatment option for years. The methane yields from decomposition of waste are, however, highly variable under different operating conditions. In this study, a statistical experimental design method (Taguchi OA₉) was implemented to investigate the effects of simultaneous variations of three parameters on methane production. The parameters investigated were solid content (SC), carbon/nitrogen ratio (C/N) and food/inoculum ratio (F/I). Two sets of experiments were conducted with nine anaerobic reactors operating under different conditions. Optimum conditions were determined using statistical analysis, such as analysis of variance (ANOVA). A confirmation experiment was carried out at optimum conditions to investigate the validity of the results. Statistical analysis showed that SC was the most important parameter for methane production with a 45% contribution, followed by the F/I ratio with a 35% contribution. The optimum methane yield of 151 L kg⁻¹ volatile solids (VS) was achieved after 24 days of digestion when SC was 4%, C/N was 28 and F/I was 0.3. The confirmation experiment provided a methane yield of 167 L kg⁻¹ VS after 24 days. The analysis showed biogas production from food waste may be increased by optimization of operating conditions. © The Author(s) 2016.
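To make the percent-contribution calculation concrete, here is a hedged R sketch with an invented L9-style design and placeholder yields; only the computation pattern mirrors the Taguchi ANOVA described above.

```r
# Sketch of the percent-contribution calculation used in Taguchi-style
# ANOVA. The design and yields below are invented placeholders, not the
# paper's data; the third factor is assigned as a Latin square so the
# three factors stay orthogonal.
design <- expand.grid(SC = c(2, 4, 6), CN = c(20, 28, 36))
design$FI    <- c(0.3, 0.5, 0.8, 0.5, 0.8, 0.3, 0.8, 0.3, 0.5)
design$yield <- c(120, 151, 130, 110, 140, 125, 100, 115, 105)

fit <- aov(yield ~ factor(SC) + factor(CN) + factor(FI), data = design)
tab <- summary(fit)[[1]]
contribution <- 100 * tab[["Sum Sq"]] / sum(tab[["Sum Sq"]])
names(contribution) <- rownames(tab)
round(contribution, 1)  # percent contribution of each factor
```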
Transportation statistics annual report 2001
DOT National Transportation Integrated Search
2001-01-01
This eighth Transportation Statistics Annual Report (TSAR), like those before it, provides data and analysis on the U.S. transportation system: its extent and condition, relationship to the nation's security and economic growth, safety aspects,...
Supaporn, Pansuwan; Yeom, Sung Ho
2018-04-30
This study investigated the biological conversion of crude glycerol, generated as a by-product of a commercial biodiesel production plant, to 1,3-propanediol (1,3-PD). Statistical analysis was employed to derive a statistical model for the individual and interactive effects of glycerol, (NH₄)₂SO₄, trace elements, pH, and cultivation time on four objectives: 1,3-PD concentration, yield, selectivity, and productivity. Optimum conditions for each objective with its maximum value were predicted by statistical optimization, and experiments under the optimum conditions verified the predictions. In addition, by systematic analysis of the values of the four objectives, the optimum conditions for 1,3-PD concentration (49.8 g/L initial glycerol, 4.0 g/L (NH₄)₂SO₄, 2.0 mL/L trace elements, pH 7.5, and 11.2 h cultivation time) were determined to be the global optimum culture conditions for 1,3-PD production. Under these conditions, we achieved a high 1,3-PD yield (47.4%), selectivity (88.8%), and productivity (2.1 g/L/h), as well as a high 1,3-PD concentration (23.6 g/L).
Collagen morphology and texture analysis: from statistics to classification
Mostaço-Guidolin, Leila B.; Ko, Alex C.-T.; Wang, Fei; Xiang, Bo; Hewko, Mark; Tian, Ganghong; Major, Arkady; Shiomi, Masashi; Sowa, Michael G.
2013-01-01
In this study we present an image analysis methodology capable of quantifying morphological changes in tissue collagen fibril organization caused by pathological conditions. Texture analysis based on first-order statistics (FOS) and second-order statistics such as the gray level co-occurrence matrix (GLCM) was explored to extract second-harmonic generation (SHG) image features that are associated with the structural and biochemical changes of tissue collagen networks. Based on these extracted quantitative parameters, multi-group classification of SHG images was performed. With combined FOS and GLCM texture values, we achieved reliable classification of SHG collagen images acquired from atherosclerotic arteries with >90% accuracy, sensitivity and specificity. The proposed methodology can be applied to a wide range of conditions involving collagen re-modeling, such as in skin disorders, different types of fibrosis and musculoskeletal diseases affecting ligaments and cartilage. PMID:23846580
Kalegowda, Yogesh; Harmer, Sarah L
2012-03-20
Time-of-flight secondary ion mass spectrometry (TOF-SIMS) spectra of mineral samples are complex, comprising large mass ranges and many peaks. Consequently, characterization and classification analysis of these systems is challenging. In this study, different chemometric and statistical data evaluation methods, based on monolayer-sensitive TOF-SIMS data, have been tested for the characterization and classification of copper-iron sulfide minerals (chalcopyrite, chalcocite, bornite, and pyrite) at different flotation pulp conditions (feed, conditioned feed, and Eh modified). The complex mass spectral data sets were analyzed using the following chemometric and statistical techniques: principal component analysis (PCA); principal component-discriminant functional analysis (PC-DFA); soft independent modeling of class analogy (SIMCA); and k-Nearest Neighbor (k-NN) classification. PCA was found to be an important first step in multivariate analysis, providing insight into both the relative grouping of samples and the elemental/molecular basis for those groupings. For samples exposed to oxidative conditions (at Eh ~430 mV), each technique (PCA, PC-DFA, SIMCA, and k-NN) was found to produce excellent classification. For samples at reductive conditions (at Eh ~ -200 mV SHE), k-NN and SIMCA produced the most accurate classification. Phase identification of particles that contain the same elements but a different crystal structure in a mixed multimetal mineral system has been achieved.
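A generic R sketch of the PCA-then-k-NN pipeline described above, using the iris data as a stand-in for TOF-SIMS spectra (rows = samples, columns = peak intensities); this is not the authors' code or data.

```r
library(class)  # for knn()

x <- scale(iris[, 1:4])          # "spectra": mean-centred, unit variance
labels <- iris$Species           # "mineral phase" labels

pca    <- prcomp(x)
scores <- pca$x[, 1:2]           # keep the first two principal components

train <- seq(1, nrow(scores), by = 2)  # odd rows train, even rows test
test  <- seq(2, nrow(scores), by = 2)

pred <- knn(scores[train, ], scores[test, ], labels[train], k = 5)
mean(pred == labels[test])       # classification accuracy
```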
Chou, C P; Bentler, P M; Satorra, A
1991-11-01
Research studying robustness of maximum likelihood (ML) statistics in covariance structure analysis has concluded that test statistics and standard errors are biased under severe non-normality. An estimation procedure known as asymptotic distribution free (ADF), making no distributional assumption, has been suggested to avoid these biases. Corrections to the normal theory statistics to yield more adequate performance have also been proposed. This study compares the performance of a scaled test statistic and robust standard errors for two models under several non-normal conditions and also compares these with the results from ML and ADF methods. Both ML and ADF test statistics performed rather well in one model and considerably worse in the other. In general, the scaled test statistic seemed to behave better than the ML test statistic and the ADF statistic performed the worst. The robust and ADF standard errors yielded more appropriate estimates of sampling variability than the ML standard errors, which were usually downward biased, in both models under most of the non-normal conditions. ML test statistics and standard errors were found to be quite robust to the violation of the normality assumption when data had either symmetric and platykurtic distributions, or non-symmetric and zero kurtotic distributions.
A new statistical methodology predicting chip failure probability considering electromigration
NASA Astrophysics Data System (ADS)
Sun, Ted
In this research thesis, we present a new approach to analyze chip reliability subject to electromigration (EM); the fundamental causes of EM and the EM phenomena occurring in different materials are presented in this thesis. This new approach utilizes the statistical nature of EM failure in order to assess overall EM risk. It includes within-die temperature variations from the chip's temperature map, extracted by an Electronic Design Automation (EDA) tool, to estimate the failure probability of a design. Both the power estimation and thermal analysis are performed in the EDA flow. We first used the traditional EM approach to analyze the design, which involves 6 metal and 5 via layers, with a single temperature across the entire chip. Next, we used the same traditional approach but with a realistic temperature map. The traditional EM analysis approach, the same approach coupled with a temperature map, and the comparison between the results with and without the temperature map are presented in this research. A comparison between these two results confirms that using a temperature map yields a less pessimistic estimation of the chip's EM risk. Finally, we employed the statistical methodology we developed, considering a temperature map and different use-condition voltages and frequencies, to estimate the overall failure probability of the chip. The statistical model established considers the scaling work with the usage of the traditional Black equation and four major conditions. The statistical result comparisons are within our expectations. The results of this statistical analysis confirm that the chip-level failure probability is higher i) at higher use-condition frequencies for all use-condition voltages, and ii) when a single temperature instead of a temperature map across the chip is considered. In this thesis, I start with an overall review of current design types, common flows, and the necessary verification and reliability-checking steps used in the IC design industry. Furthermore, the important concepts of "Scripting Automation", used to integrate the diversified EDA tools in this research work, are also described in detail with several examples, and my completed code is included in the appendix for reference. Hopefully, this construction of my thesis will give readers a thorough understanding of my research work, from the automation of EDA tools to the statistical data generation, from the nature of EM to the statistical model construction, and the comparisons among the traditional EM analysis and the statistical EM analysis approaches.
NASA Astrophysics Data System (ADS)
Bierstedt, Svenja E.; Hünicke, Birgit; Zorita, Eduardo; Ludwig, Juliane
2017-07-01
We statistically analyse the relationship between the structure of migrating dunes in the southern Baltic and the driving wind conditions over the past 26 years, with the long-term aim of using migrating dunes as a proxy for past wind conditions at an interannual resolution. The present analysis is based on the dune record derived from geo-radar measurements by Ludwig et al. (2017). The dune system is located at the Baltic Sea coast of Poland and is migrating from west to east along the coast. The dunes present layers with different thicknesses that can be assigned to absolute dates at interannual timescales and put in relation to seasonal wind conditions. To statistically analyse this record and calibrate it as a wind proxy, we used a gridded regional meteorological reanalysis data set (coastDat2) covering recent decades. The identified link between the dune annual layers and wind conditions was additionally supported by the co-variability between dune layers and observed sea level variations in the southern Baltic Sea. We include precipitation and temperature in our analysis, in addition to wind, to learn more about the dependency between these three atmospheric factors and their common influence on the dune system. We set up a statistical linear model based on the correlation between the frequency of days with specific wind conditions in a given season and dune migration velocities derived for that season. To some extent, the dune records can be seen as analogous to tree-ring width records, and hence we use a proxy validation method usually applied in dendrochronology, cross-validation with the leave-one-out method, when the observational record is short. The revealed correlations between the wind record from the reanalysis and the wind record derived from the dune structure are in the range between 0.28 and 0.63, yielding statistical validation skill similar to dendroclimatological records.
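The leave-one-out validation mentioned above can be sketched in a few lines of R; the data below are invented, and only the cross-validation pattern is illustrative.

```r
# Sketch of leave-one-out cross-validation for a simple linear proxy
# calibration like the dune-wind model described above (invented data).
set.seed(4)
wind_days <- rnorm(26, 10, 3)                       # reanalysis predictor
dune_rate <- 2 + 0.5 * wind_days + rnorm(26, 0, 2)  # dune migration proxy

loo_pred <- sapply(seq_along(dune_rate), function(i) {
  fit <- lm(dune_rate[-i] ~ wind_days[-i])          # refit without point i
  coef(fit)[1] + coef(fit)[2] * wind_days[i]        # predict held-out point
})
cor(loo_pred, dune_rate)  # validation skill, cf. 0.28-0.63 in the study
```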
Modified Distribution-Free Goodness-of-Fit Test Statistic.
Chun, So Yeon; Browne, Michael W; Shapiro, Alexander
2018-03-01
Covariance structure analysis and its structural equation modeling extensions have become one of the most widely used methodologies in social sciences such as psychology, education, and economics. An important issue in such analysis is to assess the goodness of fit of a model under analysis. One of the most popular test statistics used in covariance structure analysis is the asymptotically distribution-free (ADF) test statistic introduced by Browne (Br J Math Stat Psychol 37:62-83, 1984). The ADF statistic can be used to test models without any specific distribution assumption (e.g., multivariate normal distribution) of the observed data. Despite its advantage, it has been shown in various empirical studies that unless sample sizes are extremely large, this ADF statistic could perform very poorly in practice. In this paper, we provide a theoretical explanation for this phenomenon and further propose a modified test statistic that improves the performance in samples of realistic size. The proposed statistic deals with the possible ill-conditioning of the involved large-scale covariance matrices.
KaDonna C. Randolph
2006-01-01
The U.S. Department of Agriculture Forest Service, Forest Inventory and Analysis Program (FIA) utilizes visual assessments of tree crown condition to monitor changes and trends in forest health. This report describes and discusses distributions of three FIA crown condition indicators (crown density, crown dieback, and foliage transparency) for trees in the Southern...
Compositional Solution Space Quantification for Probabilistic Software Analysis
NASA Technical Reports Server (NTRS)
Borges, Mateus; Pasareanu, Corina S.; Filieri, Antonio; d'Amorim, Marcelo; Visser, Willem
2014-01-01
Probabilistic software analysis aims at quantifying how likely a target event is to occur during program execution. Current approaches rely on symbolic execution to identify the conditions to reach the target event and try to quantify the fraction of the input domain satisfying these conditions. Precise quantification is usually limited to linear constraints, while only approximate solutions can be provided in general through statistical approaches. However, statistical approaches may fail to converge to an acceptable accuracy within a reasonable time. We present a compositional statistical approach for the efficient quantification of solution spaces for arbitrarily complex constraints over bounded floating-point domains. The approach leverages interval constraint propagation to improve the accuracy of the estimation by focusing the sampling on the regions of the input domain containing the sought solutions. Preliminary experiments show significant improvement on previous approaches both in results accuracy and analysis time.
CADDIS Volume 4. Data Analysis: Selecting an Analysis Approach
An approach for selecting statistical analyses to inform causal analysis. Describes methods for determining whether test site conditions differ from reference expectations. Describes an approach for estimating stressor-response relationships.
Validating Future Force Performance Measures (Army Class): Concluding Analyses
2016-06-01
Table 3.10. Descriptive Statistics and Intercorrelations for LV Final Predictor Factor Scores…Table 4.7. Descriptive Statistics for Analysis Criteria…Soldier attrition and performance: Dependability (Non-Delinquency), Adjustment, Physical Conditioning, Leadership, Work Orientation, and Agreeableness
On an Additive Semigraphoid Model for Statistical Networks With Application to Pathway Analysis.
Li, Bing; Chun, Hyonho; Zhao, Hongyu
2014-09-01
We introduce a nonparametric method for estimating non-Gaussian graphical models based on a new statistical relation called additive conditional independence, which is a three-way relation among random vectors that resembles the logical structure of conditional independence. Additive conditional independence allows us to use a one-dimensional kernel regardless of the dimension of the graph, which not only avoids the curse of dimensionality but also simplifies computation. It also gives rise to a parallel structure to the Gaussian graphical model that replaces the precision matrix by an additive precision operator. The estimators derived from additive conditional independence cover the recently introduced nonparanormal graphical model as a special case, but outperform it when the Gaussian copula assumption is violated. We compare the new method with existing ones by simulations and in genetic pathway analysis.
Detailed Analysis of the Interoccurrence Time Statistics in Seismic Activity
NASA Astrophysics Data System (ADS)
Tanaka, Hiroki; Aizawa, Yoji
2017-02-01
The interoccurrence time statistics of seismicity is studied theoretically as well as numerically by taking into account the conditional probability and the correlations among many earthquakes at different magnitude levels. It is known so far that the interoccurrence time statistics is well approximated by the Weibull distribution, but more detailed information about the interoccurrence times can be obtained from the analysis of the conditional probability. Firstly, we propose the Embedding Equation Theory (EET), where the conditional probability is described by two kinds of correlation coefficients; one is the magnitude correlation and the other is the inter-event time correlation. Furthermore, the scaling law of each correlation coefficient is clearly determined from the numerical data analysis carried out with the Preliminary Determination of Epicenter (PDE) Catalog and the Japan Meteorological Agency (JMA) Catalog. Secondly, the EET is examined to derive the magnitude dependence of the interoccurrence time statistics, and the multi-fractal relation is successfully formulated. Theoretically we cannot prove the universality of the multi-fractal relation in seismic activity; nevertheless, the theoretical results reproduce well all the numerical data in our analysis, where several common features or invariant aspects are clearly observed. Especially in the case of stationary ensembles the multi-fractal relation seems to obey an invariant curve, and in the case of non-stationary (moving time) ensembles for the aftershock regime the multi-fractal relation seems to satisfy a certain invariant curve at any moving time. It is emphasized that the multi-fractal relation plays an important role in unifying the statistical laws of seismicity: actually the Gutenberg-Richter law and the Weibull distribution are unified in the multi-fractal relation, and some universality conjectures regarding seismicity are briefly discussed.
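As a small illustration of the Weibull description of interoccurrence times, the following R sketch fits shape and scale to synthetic intervals with MASS::fitdistr; no catalogue data are used.

```r
# Synthetic event times stand in for an earthquake catalogue;
# MASS::fitdistr estimates the Weibull shape and scale of the intervals.
library(MASS)

set.seed(1)
event_times <- cumsum(rweibull(500, shape = 0.9, scale = 10))
intervals   <- diff(event_times)

fit <- fitdistr(intervals, "weibull")
fit$estimate  # shape < 1 indicates clustering relative to a Poisson process
```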
NASA Astrophysics Data System (ADS)
Nguyen, A.; Mueller, C.; Brooks, A. N.; Kislik, E. A.; Baney, O. N.; Ramirez, C.; Schmidt, C.; Torres-Perez, J. L.
2014-12-01
The Sierra Nevada is experiencing changes in hydrologic regimes, such as decreases in snowmelt and peak runoff, which affect forest health and the availability of water resources. Currently, the USDA Forest Service Region 5 is undergoing Forest Plan revisions to include climate change impacts into mitigation and adaptation strategies. However, there are few processes in place to conduct quantitative assessments of forest conditions in relation to mountain hydrology, while easily and effectively delivering that information to forest managers. To assist the USDA Forest Service, this study is the final phase of a three-term project to create a Decision Support System (DSS) to allow ease of access to historical and forecasted hydrologic, climatic, and terrestrial conditions for the entire Sierra Nevada. This data is featured within three components of the DSS: the Mapping Viewer, Statistical Analysis Portal, and Geospatial Data Gateway. Utilizing ArcGIS Online, the Sierra DSS Mapping Viewer enables users to visually analyze and locate areas of interest. Once the areas of interest are targeted, the Statistical Analysis Portal provides subbasin level statistics for each variable over time by utilizing a recently developed web-based data analysis and visualization tool called Plotly. This tool allows users to generate graphs and conduct statistical analyses for the Sierra Nevada without the need to download the dataset of interest. For more comprehensive analysis, users are also able to download datasets via the Geospatial Data Gateway. The third phase of this project focused on Python-based data processing, the adaptation of the multiple capabilities of ArcGIS Online and Plotly, and the integration of the three Sierra DSS components within a website designed specifically for the USDA Forest Service.
Comparing the Fit of Item Response Theory and Factor Analysis Models
ERIC Educational Resources Information Center
Maydeu-Olivares, Alberto; Cai, Li; Hernandez, Adolfo
2011-01-01
Linear factor analysis (FA) models can be reliably tested using test statistics based on residual covariances. We show that the same statistics can be used to reliably test the fit of item response theory (IRT) models for ordinal data (under some conditions). Hence, the fit of an FA model and of an IRT model to the same data set can now be…
NASA Astrophysics Data System (ADS)
Ndehedehe, Christopher E.; Agutu, Nathan O.; Okwuashi, Onuwa; Ferreira, Vagner G.
2016-09-01
Lake Chad has recently been perceived to be completely desiccated and almost extinct due to insufficient published ground observations. Given the high spatial variability of rainfall in the region, and the fact that extreme climatic conditions (for example, droughts) could be intensifying in the Lake Chad basin (LCB) due to human activities, a spatio-temporal approach to drought analysis becomes essential. This study employed independent component analysis (ICA), a fourth-order cumulant statistics, to decompose standardised precipitation index (SPI), standardised soil moisture index (SSI), and terrestrial water storage (TWS) derived from Gravity Recovery and Climate Experiment (GRACE) into spatial and temporal patterns over the LCB. In addition, this study uses satellite altimetry data to estimate variations in the Lake Chad water levels, and further employs relevant climate teleconnection indices (El-Niño Southern Oscillation-ENSO, Atlantic Multi-decadal Oscillation-AMO, and Atlantic Meridional Mode-AMM) to examine their links to the observed drought temporal patterns over the basin. From the spatio-temporal drought analysis, temporal evolutions of SPI at 12 month aggregation show relatively wet conditions in the last two decades (although with marked alterations) with the 2012-2014 period being the wettest. In addition to the improved rainfall conditions during this period, there was a statistically significant increase of 0.04 m/yr in altimetry water levels observed over Lake Chad between 2008 and 2014, which confirms a shift in the hydrological conditions of the basin. Observed trend in TWS changes during the 2002-2014 period shows a statistically insignificant increase of 3.0 mm/yr at the centre of the basin, coinciding with soil moisture deficit indicated by the temporal evolutions of SSI at all monthly accumulations during the 2002-2003 and 2009-2012 periods. Further, SPI at 3 and 6 month scales indicated fluctuating drought conditions at the extreme south of the basin, coinciding with a statistically insignificant decline in TWS of about 4.5 mm/yr at the southern catchment of the basin. Finally, correlation analyses indicate that ENSO, AMO, and AMM are associated with extreme rainfall conditions in the basin, with AMO showing the strongest association (statistically significant correlation of 0.55) with SPI 12 month aggregation. Therefore, this study provides a framework that will support drought monitoring in the LCB.
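As a rough illustration of the SPI construction used above, here is a hedged R sketch: fit a gamma distribution to aggregated precipitation and map its CDF through the standard-normal quantile function. Real SPI implementations fit each calendar month separately and handle zero-rainfall cases; this toy version, with invented data, skips both.

```r
library(MASS)

set.seed(7)
precip_12mo <- rgamma(40, shape = 3, rate = 0.2)  # fake aggregated totals

fit <- fitdistr(precip_12mo, "gamma")
spi <- qnorm(pgamma(precip_12mo,
                    shape = fit$estimate["shape"],
                    rate  = fit$estimate["rate"]))
summary(spi)  # values below about -1.5 would flag severe drought
```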
USDA-ARS?s Scientific Manuscript database
Soil properties and weather conditions are known to affect soil nitrogen (N) availability and plant N uptake. However, studies examining N response as affected by soil and weather sometimes give conflicting results. Meta-analysis is a statistical method for estimating treatment effects in a series o...
NASA Astrophysics Data System (ADS)
Jin, Seung-Seop; Jung, Hyung-Jo
2014-03-01
It is well known that the dynamic properties of a structure, such as natural frequencies, depend not only on damage but also on environmental conditions (e.g., temperature). The variation in dynamic characteristics of a structure due to environmental conditions may mask damage of the structure. Without taking the change of environmental conditions into account, false-positive or false-negative damage diagnoses may occur, making structural health monitoring unreliable. In order to address this problem, an approach that constructs a regression model based on structural responses considering environmental factors has been widely used. The key to the success of this approach is formulating the input and output variables of the regression model to take the environmental variations into account. However, it is quite challenging to determine proper environmental variables and measurement locations in advance to fully represent the relationship between the structural responses and the environmental variations. One alternative (i.e., novelty detection) is to remove the variations caused by environmental factors from the structural responses by using multivariate statistical analysis (e.g., principal component analysis (PCA), factor analysis, etc.). The success of this method depends deeply on the accuracy of the description of the normal condition. Generally, there is no prior information on the normal condition during data acquisition, so the normal condition is determined subjectively with human intervention. The proposed method is a novel adaptive multivariate statistical analysis for monitoring structural damage under environmental change. One advantage of this method is the ability of generative learning to capture the intrinsic characteristics of the normal condition. The proposed method is tested on numerically simulated data for a range of measurement noise under environmental variation. A comparative study with conventional methods (i.e., the fixed reference scheme) demonstrates the superior performance of the proposed method for structural damage detection.
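A hedged R sketch of generic PCA-based novelty detection (not the authors' adaptive method): train on baseline features, then flag observations whose reconstruction error exceeds a baseline-derived threshold. The simulated "natural frequencies" and the threshold rule are assumptions.

```r
set.seed(3)
baseline <- matrix(rnorm(200 * 4, mean = c(5, 12, 21, 33), sd = 0.1),
                   ncol = 4, byrow = TRUE)   # four modal frequencies

pca <- prcomp(baseline, center = TRUE, scale. = TRUE)
k   <- 2  # retained components (assumed to carry environmental variation)

recon_error <- function(x, pca, k) {
  z <- scale(x, pca$center, pca$scale) %*% pca$rotation[, 1:k]
  x_hat <- z %*% t(pca$rotation[, 1:k])
  rowSums((scale(x, pca$center, pca$scale) - x_hat)^2)
}

threshold <- quantile(recon_error(baseline, pca, k), 0.99)
damaged   <- baseline[1:5, ] * 0.97          # simulated 3% frequency drop
recon_error(damaged, pca, k) > threshold     # novelty flags
```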
Properties of some statistics for AR-ARCH model with application to technical analysis
NASA Astrophysics Data System (ADS)
Huang, Xudong; Liu, Wei
2009-03-01
In this paper, we investigate some popular technical analysis indexes for the AR-ARCH model as a proxy for a real stock market. Under the given conditions, we show that the corresponding statistics are asymptotically stationary and that the law of large numbers holds for the frequencies of stock prices falling outside the normal scope of these technical analysis indexes under AR-ARCH, and we give the rate of convergence in the case of nonstationary initial values, which provides a mathematical rationale for these methods of technical analysis in supervising security trends.
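An illustrative R simulation in the spirit of the paper: generate an AR(1)-ARCH(1) series and estimate how often it falls outside a Bollinger-style band (moving average ± 2 moving standard deviations); all parameter values are arbitrary assumptions.

```r
set.seed(11)
n <- 5000
phi <- 0.3; a0 <- 0.1; a1 <- 0.4
x <- numeric(n); e <- numeric(n)
for (t in 2:n) {
  sigma_t <- sqrt(a0 + a1 * e[t - 1]^2)  # ARCH(1) volatility
  e[t] <- sigma_t * rnorm(1)
  x[t] <- phi * x[t - 1] + e[t]          # AR(1) mean equation
}

w <- 20  # moving window length
ma <- stats::filter(x, rep(1 / w, w), sides = 1)          # moving mean
ms <- sqrt(stats::filter(x^2, rep(1 / w, w), sides = 1) - ma^2)
outside <- abs(x - ma) > 2 * ms
mean(outside, na.rm = TRUE)  # frequency outside the band
```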
The Design and Analysis of Transposon-Insertion Sequencing Experiments
Chao, Michael C.; Abel, Sören; Davis, Brigid M.; Waldor, Matthew K.
2016-01-01
Transposon-insertion sequencing (TIS) is a powerful approach that can be widely applied to genome-wide definition of loci that are required for growth in diverse conditions. However, experimental design choices and stochastic biological processes can heavily influence the results of TIS experiments and affect downstream statistical analysis. Here, we discuss TIS experimental parameters and how these factors relate to the benefits and limitations of the various statistical frameworks that can be applied to computational analysis of TIS data. PMID:26775926
Piotrowski, T; Rodrigues, G; Bajon, T; Yartsev, S
2014-03-01
Multi-institutional collaborations allow more information to be analyzed, but the data from different sources may vary in subgroup sizes and/or measurement conditions. Rigorous statistical analysis is required for pooling the data into a larger set. Careful comparison of all the components of the data acquisition is indispensable: identical conditions allow for enlargement of the database with improved statistical analysis, while clearly defined differences provide an opportunity for establishing a better practice. The optimal sequence of required normality, asymptotic normality, and independence tests is proposed. An example is presented analyzing six subgroups of position corrections in three directions obtained during image guidance procedures for 216 prostate cancer patients from two institutions. Copyright © 2013 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
The Statistical Analysis Techniques to Support the NGNP Fuel Performance Experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bihn T. Pham; Jeffrey J. Einerson
2010-06-01
This paper describes the development and application of statistical analysis techniques to support the AGR experimental program on NGNP fuel performance. The experiments conducted in the Idaho National Laboratory’s Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks and the target quantity (fuel/graphite temperature) is regulated by the He-Ne gas mixture that fills the gap volume. Three techniques for statistical analysis, namely control charting, correlation analysis, and regression analysis, are implemented in the SAS-based NGNP Data Management and Analysis System (NDMAS) for automated processing and qualification of the AGR measured data. The NDMAS also stores daily neutronic (power) and thermal (heat transfer) code simulation results along with the measurement data, allowing for their combined use and comparative scrutiny. The ultimate objective of this work includes (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show the three statistical analysis techniques providing a complementary capability to warn of thermocouple failures. It also suggests that the regression analysis models relating calculated fuel temperatures and thermocouple readings can enable online regulation of experimental parameters (i.e. gas mixture content), to effectively maintain the target quantity (fuel temperature) within a given range.
Statistics 101 for Radiologists.
Anvari, Arash; Halpern, Elkan F; Samir, Anthony E
2015-10-01
Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced. © RSNA, 2015.
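As a worked complement to the concepts reviewed above, the following R snippet computes sensitivity, specificity, accuracy, and likelihood ratios from an invented 2×2 table; the counts are illustrative only.

```r
# Rows: test result; columns: disease status. Invented counts.
tp <- 90; fn <- 10   # diseased patients
fp <- 30; tn <- 170  # healthy patients

sensitivity <- tp / (tp + fn)              # 0.90
specificity <- tn / (tn + fp)              # 0.85
accuracy    <- (tp + tn) / (tp + fn + fp + tn)
lr_pos <- sensitivity / (1 - specificity)  # positive likelihood ratio
lr_neg <- (1 - sensitivity) / specificity  # negative likelihood ratio

c(sens = sensitivity, spec = specificity, acc = accuracy,
  LRp = lr_pos, LRn = lr_neg)
```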
Davis, J.C.
2000-01-01
Geologists may feel that geological data are not amenable to statistical analysis, or at best require specialized approaches such as nonparametric statistics and geostatistics. However, there are many circumstances, particularly in systematic studies conducted for environmental or regulatory purposes, where traditional parametric statistical procedures can be beneficial. An example is the application of analysis of variance to data collected in an annual program of measuring groundwater levels in Kansas. Influences such as well conditions, operator effects, and use of the water can be assessed and wells that yield less reliable measurements can be identified. Such statistical studies have resulted in yearly improvements in the quality and reliability of the collected hydrologic data. Similar benefits may be achieved in other geological studies by the appropriate use of classical statistical tools.
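A minimal sketch of this kind of analysis of variance in R, with simulated water-level data standing in for the Kansas measurements; the well and operator effects are invented for illustration.

```r
set.seed(5)
wells     <- factor(rep(1:10, each = 6))
operators <- factor(rep(1:3, times = 20))
level <- 50 + as.numeric(wells) * 0.5 +   # well-to-well differences
         rnorm(60, sd = 0.3) +            # measurement noise
         ifelse(operators == "2", 0.4, 0) # one biased operator

summary(aov(level ~ wells + operators))   # partition the variance
```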
Han, Seong Kyu; Lee, Dongyeop; Lee, Heetak; Kim, Donghyo; Son, Heehwa G; Yang, Jae-Seong; Lee, Seung-Jae V; Kim, Sanguk
2016-08-30
Online application for survival analysis (OASIS) has served as a popular and convenient platform for the statistical analysis of various survival data, particularly in the field of aging research. With the recent advances in the fields of aging research that deal with complex survival data, we noticed a need for updates to the current version of OASIS. Here, we report OASIS 2 (http://sbi.postech.ac.kr/oasis2), which provides extended statistical tools for survival data and an enhanced user interface. In particular, OASIS 2 enables the statistical comparison of maximal lifespans, which is potentially useful for determining key factors that limit the lifespan of a population. Furthermore, OASIS 2 provides statistical and graphical tools that compare values in different conditions and times. That feature is useful for comparing age-associated changes in physiological activities, which can be used as indicators of "healthspan." We believe that OASIS 2 will serve as a standard platform for survival analysis with advanced and user-friendly statistical tools for experimental biologists in the field of aging research.
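OASIS itself is a web application, but the comparisons it automates are standard. Below is a hedged sketch using R's survival package and its built-in lung dataset as a stand-in for lifespan data; this is not OASIS code.

```r
library(survival)

fit <- survfit(Surv(time, status) ~ sex, data = lung)  # Kaplan-Meier curves
summary(fit)$table[, "median"]                         # median survival per group

survdiff(Surv(time, status) ~ sex, data = lung)        # log-rank test
```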
Descriptive statistics of tree crown condition in the Northeastern United States
KaDonna C. Randolph; Randall S. Morin; Jim Steinman
2010-01-01
The U.S. Forest Service Forest Inventory and Analysis (FIA) Program uses visual assessments of tree crown condition to monitor changes and trends in forest health. This report describes four crown condition indicators (crown dieback, crown density, foliage transparency, and sapling crown vigor) measured in Connecticut, Delaware, Maine, Maryland, Massachusetts, New...
Chida, Koji; Morohashi, Gembu; Fuji, Hitoshi; Magata, Fumihiko; Fujimura, Akiko; Hamada, Koki; Ikarashi, Dai; Yamamoto, Ryuichi
2014-01-01
Background and objective: While the secondary use of medical data has gained attention, its adoption has been constrained due to protection of patient privacy. Making medical data secure by de-identification can be problematic, especially when the data concerns rare diseases. We require rigorous security management measures. Materials and methods: Using secure computation, an approach from cryptography, our system can compute various statistics over encrypted medical records without decrypting them. An issue of secure computation is that the amount of processing time required is immense. We implemented a system that securely computes healthcare statistics from the statistical computing software ‘R’ by effectively combining secret-sharing-based secure computation with original computation. Results: Testing confirmed that our system could correctly complete computation of the average and unbiased variance of approximately 50,000 records of dummy insurance claim data in a little over a second. Computation including conditional expressions and/or comparison of values, for example, t test and median, could also be correctly completed in several tens of seconds to a few minutes. Discussion: If medical records are simply encrypted, the risk of leaks exists because decryption is usually required during statistical analysis. Our system possesses high-level security because medical records remain in an encrypted state even during statistical analysis. Also, our system can securely compute some basic statistics with conditional expressions using ‘R’, which works interactively, while secure computation protocols generally require a significant amount of processing time. Conclusions: We propose a secure statistical analysis system using ‘R’ for medical data that effectively integrates secret-sharing-based secure computation and original computation. PMID:24763677
NASA Astrophysics Data System (ADS)
Giner-Sanz, J. J.; Ortega, E. M.; Pérez-Herranz, V.
2018-03-01
The internal resistance of a PEM fuel cell depends on the operation conditions and on the current delivered by the cell. This work's goal is to obtain a semiempirical model able to reproduce the effect of the operation current on the internal resistance of an individual cell of a commercial PEM fuel cell stack; and to perform a statistical analysis in order to study the effect of the operation temperature and the inlet humidities on the parameters of the model. First, the internal resistance of the individual fuel cell operating in different operation conditions was experimentally measured for different DC currents, using the high frequency intercept of the impedance spectra. Then, a semiempirical model based on Springer and co-workers' model was proposed. This model is able to successfully reproduce the experimental trends. Subsequently, the curves of resistance versus DC current obtained for different operation conditions were fitted to the semiempirical model, and an analysis of variance (ANOVA) was performed in order to determine which factors have a statistically significant effect on each model parameter. Finally, a response surface method was applied in order to obtain a regression model.
Krypotos, Angelos-Miltiadis; Klugkist, Irene; Engelhard, Iris M.
2017-01-01
Threat conditioning procedures have allowed the experimental investigation of the pathogenesis of Post-Traumatic Stress Disorder. The findings of these procedures have also provided stable foundations for the development of relevant intervention programs (e.g. exposure therapy). Statistical inference of threat conditioning procedures is commonly based on p-values and Null Hypothesis Significance Testing (NHST). Nowadays, however, there is a growing concern about this statistical approach, as many scientists point to the various limitations of p-values and NHST. As an alternative, the use of Bayes factors and Bayesian hypothesis testing has been suggested. In this article, we apply this statistical approach to threat conditioning data. In order to enable the easy computation of Bayes factors for threat conditioning data we present a new R package named condir, which can be used either via the R console or via a Shiny application. This article provides both a non-technical introduction to Bayesian analysis for researchers using the threat conditioning paradigm, and the necessary tools for computing Bayes factors easily. PMID:29038683
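condir's exact interface is not reproduced here; as a hedged illustration of the Bayes-factor approach it implements, the sketch below uses the BayesFactor package on simulated CS+/CS- responses.

```r
library(BayesFactor)

set.seed(9)
cs_plus  <- rnorm(30, mean = 0.6, sd = 0.4)  # simulated CS+ responses
cs_minus <- rnorm(30, mean = 0.3, sd = 0.4)  # simulated CS- responses

ttestBF(x = cs_plus, y = cs_minus)  # BF10: evidence for a group difference
```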
Disutility analysis of oil spills: graphs and trends.
Ventikos, Nikolaos P; Sotiropoulos, Foivos S
2014-04-15
This paper reports the results of an analysis of oil spill cost data assembled from a worldwide pollution database that mainly includes data from the International Oil Pollution Compensation Fund. The purpose of the study is to analyze the conditions of marine pollution accidents and the factors that impact the costs of oil spills worldwide. The accidents are classified into categories based on their characteristics, and the cases are compared using charts to show how the costs are affected under all conditions. This study can be used as a helpful reference for developing a detailed statistical model that is capable of reliably and realistically estimating the total costs of oil spills. To illustrate the differences identified by this statistical analysis, the results are compared with the results of previous studies, and the findings are discussed. Copyright © 2014 Elsevier Ltd. All rights reserved.
Descriptive statistics of tree crown condition in California, Oregon, and Washington
KaDonna C. Randolph; Sally J. Campbell; Glenn Christensen
2010-01-01
The U.S. Forest Service Forest Inventory and Analysis (FIA) Program uses visual assessments of tree crown condition to monitor changes and trends in forest health. This report describes four tree crown condition indicators (crown dieback, crown density, foliage transparency, and sapling crown vigor) measured in California, Oregon, and Washington between 1996 and 1999....
Descriptive statistics of tree crown condition in the North Central United States
KaDonna C. Randolph; Randall S. Morin; Jim Steinman
2010-01-01
The U.S. Forest Service Forest Inventory and Analysis (FIA) Program uses visual assessments of tree crown condition to monitor changes and trends in forest health. This report describes four crown condition indicators (crown dieback, crown density, foliage transparency, and sapling crown vigor) measured in Illinois, Indiana, Michigan, Minnesota, Missouri, and Wisconsin...
Descriptive statistics of tree crown condition in the United States Interior West
KaDonna C. Randolph; Mike T. Thompson
2010-01-01
The U.S. Forest Service Forest Inventory and Analysis (FIA) Program uses visual assessments of tree crown condition to monitor changes and trends in forest health. This report describes four crown condition indicators (crown dieback, crown density, foliage transparency, and sapling crown vigor) measured in Colorado, Idaho, Nevada, Utah, and Wyoming between 1996 and...
Kang, Guangliang; Du, Li; Zhang, Hong
2016-06-22
The growing complexity of biological experiment design based on high-throughput RNA sequencing (RNA-seq) is calling for more accommodative statistical tools. We focus on differential expression (DE) analysis using RNA-seq data in the presence of multiple treatment conditions. We propose a novel method, multiDE, for facilitating DE analysis using RNA-seq read count data with multiple treatment conditions. The read count is assumed to follow a log-linear model incorporating two factors (i.e., condition and gene), where an interaction term is used to quantify the association between gene and condition. The number of degrees of freedom is reduced to one through the first-order decomposition of the interaction, leading to a dramatic improvement in power for testing DE genes when the number of conditions is greater than two. In our simulations, multiDE outperformed the benchmark methods (i.e., edgeR and DESeq2) even when the underlying model was severely misspecified, and the power gain increased with the number of conditions. In the application to two real datasets, multiDE identified more biologically meaningful DE genes than the benchmark methods. An R package implementing multiDE is available publicly at http://homepage.fudan.edu.cn/zhangh/softwares/multiDE . When the number of conditions is two, multiDE performs comparably with the benchmark methods. When the number of conditions is greater than two, multiDE outperforms the benchmark methods.
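multiDE's own API is not shown here; the following generic R sketch only illustrates the underlying idea of log-linear DE testing for a single gene across multiple conditions via a likelihood-ratio test, with invented counts.

```r
# Compare a condition-dependent and a condition-free Poisson log-linear
# model for one gene's read counts across three conditions.
counts    <- c(12, 15, 9, 45, 50, 40, 30, 28, 33)   # invented read counts
condition <- factor(rep(c("A", "B", "C"), each = 3))

full    <- glm(counts ~ condition, family = poisson)
reduced <- glm(counts ~ 1, family = poisson)
anova(reduced, full, test = "LRT")  # small p-value -> DE across conditions
```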
NASA Astrophysics Data System (ADS)
Irvine, John M.; Ghadar, Nastaran; Duncan, Steve; Floyd, David; O'Dowd, David; Lin, Kristie; Chang, Tom
2017-03-01
Quantitative biomarkers for assessing the presence, severity, and progression of age-related macular degeneration (AMD) would benefit research, diagnosis, and treatment. This paper explores the development of quantitative biomarkers derived from OCT imagery of the retina. OCT images for approximately 75 patients with Wet AMD, Dry AMD, and no AMD (healthy eyes) were analyzed to identify image features indicative of the patients' conditions. OCT image features provide a statistical characterization of the retina: healthy eyes exhibit a layered structure, whereas chaotic patterns indicate the deterioration associated with AMD. Our approach uses wavelet and Frangi filtering, combined with statistical features that do not rely on image segmentation, to assess patient conditions. Classification analysis indicates clear separability of Wet AMD from other conditions, including Dry AMD and healthy retinas. The probability of correct classification was 95.7%, as determined from cross validation. Similar classification analysis predicts the response of Wet AMD patients to treatment, as measured by the Best Corrected Visual Acuity (BCVA). A statistical model predicts BCVA from the imagery features with R2 = 0.846. Initial analysis thus indicates that imagery-derived features can provide useful biomarkers for the characterization and quantification of AMD: accurate assessment of Wet AMD compared with other conditions, image-based prediction of outcomes for Wet AMD treatment, and accurate prediction of BCVA. Unlike many methods in the literature, our techniques do not rely on segmentation of the OCT image. Next steps include larger scale testing and validation.
A new approach for remediation of As-contaminated soil: ball mill-based technique.
Shin, Yeon-Jun; Park, Sang-Min; Yoo, Jong-Chan; Jeon, Chil-Sung; Lee, Seung-Woo; Baek, Kitae
2016-02-01
In this study, a physical ball mill process, instead of chemical extraction using toxic chemical agents, was applied to remove arsenic (As) from contaminated soil. A statistical analysis was carried out to establish the optimal conditions for ball mill processing. As a result of the statistical analysis, approximately 70% of As was removed from the soil under the following conditions: an operating time of 5 min, a media size of 1.0 cm, a rotational velocity of 10 rpm, and a soil loading of 5%. A significant amount of As remained in the ground fine soil after ball mill processing, while more than 90% of the soil retained its original properties and could be reused or recycled. The ball mill process can thus remove metals bound strongly to the soil surface by surface grinding, and could be applied as a pretreatment before chemical extraction to reduce the load.
NASA Astrophysics Data System (ADS)
Jokhio, Gul A.; Syed Mohsin, Sharifah M.; Gul, Yasmeen
2018-04-01
It has been established that Adobe provides, in addition to being sustainable and economical, better indoor air quality without the extensive energy expenditure of modern synthetic materials. The material, however, exhibits weak structural behaviour when subjected to adverse loading conditions, and a wide range of mechanical properties has been reported in the literature owing to a lack of research and standardization. The present paper presents the statistical analysis of results obtained through compressive and flexural tests on Adobe samples. Adobe specimens with and without wire mesh reinforcement were tested and the results reported. It was found that the compressive strength of Adobe increases by about 43% after adding a single layer of wire mesh reinforcement, an increase that is statistically significant. The flexural response of Adobe also improved with the addition of wire mesh reinforcement; however, the statistical significance of that improvement could not be established.
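A comparison like the compressive-strength result above can be illustrated with a standard two-sample test; the strength values below are invented placeholders, not the study's measurements.

```python
import numpy as np
from scipy import stats

# Hypothetical compressive strengths (MPa) for plain and mesh-reinforced Adobe
plain = np.array([1.10, 1.25, 0.98, 1.30, 1.15, 1.05])
meshed = np.array([1.55, 1.70, 1.48, 1.82, 1.60, 1.66])

t, p = stats.ttest_ind(meshed, plain, equal_var=False)  # Welch's two-sample t-test
gain = (meshed.mean() / plain.mean() - 1) * 100
print(f"mean gain = {gain:.1f}%, t = {t:.2f}, p = {p:.4f}")
```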
Statistical analysis of flight times for space shuttle ferry flights
NASA Technical Reports Server (NTRS)
Graves, M. E.; Perlmutter, M.
1974-01-01
Markov chain and Monte Carlo analysis techniques are applied to the simulated Space Shuttle Orbiter Ferry flights to obtain statistical distributions of flight time duration between Edwards Air Force Base and Kennedy Space Center. The two methods are compared, and are found to be in excellent agreement. The flights are subjected to certain operational and meteorological requirements, or constraints, which cause eastbound and westbound trips to yield different results. Persistence of events theory is applied to the occurrence of inclement conditions to find their effect upon the statistical flight time distribution. In a sensitivity test, some of the constraints are varied to observe the corresponding changes in the results.
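A minimal Monte Carlo version of such a flight-time study might look like the following sketch, in which the ferry legs and the two-state (flyable/inclement weather) Markov chain standing in for persistence of events are invented placeholders.

```python
import numpy as np

rng = np.random.default_rng(1)
# Two-state weather chain: rows are today's state (0 = flyable, 1 = inclement)
P = np.array([[0.8, 0.2],
              [0.4, 0.6]])
n_legs = 3  # hypothetical number of ferry legs, one flown per flyable day

def one_trip():
    """Return the number of days needed to complete all legs."""
    state, days, legs_done = 0, 0, 0
    while legs_done < n_legs:
        days += 1
        if state == 0:                       # flyable day: complete the next leg
            legs_done += 1
        state = rng.choice(2, p=P[state])    # weather persistence (Markov step)
    return days

samples = np.array([one_trip() for _ in range(10_000)])
print(f"mean = {samples.mean():.2f} days, 95th percentile = {np.percentile(samples, 95):.0f} days")
```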
Entropy in statistical energy analysis.
Le Bot, Alain
2009-03-01
In this paper, the second principle of thermodynamics is discussed in the framework of statistical energy analysis (SEA). It is shown that the "vibrational entropy" and the "vibrational temperature" of sub-systems depend only on the vibrational energy and the number of resonant modes. A SEA system can be described as a thermodynamic system slightly out of equilibrium. In steady-state conditions, the entropy exchanged with the exterior by sources and dissipation exactly balances the entropy produced by irreversible processes at the interfaces between SEA sub-systems.
NASA Technical Reports Server (NTRS)
Berrios, Daniel C.; Thompson, Terri G.
2015-01-01
NASA GeneLab is expected to capture and distribute omics data, together with the experimental and process conditions most relevant to the research community for its statistical and theoretical analysis of NASA's omics data.
Chahal, Gurparkash Singh; Chhina, Kamalpreet; Chhabra, Vipin; Bhatnagar, Rakhi; Chahal, Amna
2014-01-01
Background: A surface smear layer consisting of organic and inorganic material is formed on the root surface following mechanical instrumentation and may inhibit the formation of new connective tissue attachment to the root surface. Modification of the tooth surface by root conditioning has resulted in improved connective tissue attachment and has advanced the goal of reconstructive periodontal treatment. Aim: The aim of this study was to compare the effects of citric acid, tetracycline, and doxycycline on instrumented, periodontally involved root surfaces in vitro using a scanning electron microscope. Settings and Design: A total of 45 dentin samples obtained from 15 extracted, scaled, and root-planed teeth were divided into three groups. Materials and Methods: The root conditioning agents were applied with cotton pellets using the passive burnishing technique for 5 minutes. The samples were then examined by scanning electron microscopy. Statistical Analysis Used: The statistical analysis was carried out using the Statistical Package for the Social Sciences (SPSS Inc., Chicago, IL, version 15.0 for Windows). For all quantitative variables, means and standard deviations were calculated and compared. For comparisons among more than two groups, ANOVA was applied; for multiple comparisons, post hoc tests with Bonferroni correction were used. Results: Upon statistical analysis, the root conditioning agents used in this study were found to be effective in removing the smear layer, uncovering and widening the dentinal tubules, and unmasking the dentinal collagen matrix. Conclusion: Tetracycline HCl was found to be the best root conditioner among the three agents used. PMID:24744541
Li, Ke; Zhang, Qiuju; Wang, Kun; Chen, Peng; Wang, Huaqing
2016-01-01
A new fault diagnosis method for rotating machinery based on an adaptive statistic test filter (ASTF) and a Diagnostic Bayesian Network (DBN) is presented in this paper. ASTF is proposed to extract weak fault features under background noise; it is based on statistical hypothesis testing in the frequency domain, evaluating the similarity between a reference (noise) signal and the original signal and removing components of high similarity. The optimal level of significance α is obtained using particle swarm optimization (PSO). To evaluate the performance of the ASTF, an evaluation factor Ipq is also defined, and a simulation experiment is designed to verify the effectiveness and robustness of ASTF. A sensitivity evaluation method using principal component analysis (PCA) is proposed to evaluate the sensitivity of symptom parameters (SPs) for condition diagnosis; in this way, SPs with high sensitivity for condition diagnosis can be selected. A three-layer DBN is developed to identify the condition of rotating machinery based on Bayesian Belief Network (BBN) theory. A condition diagnosis experiment on rolling element bearings demonstrates the effectiveness of the proposed method. PMID:26761006
Statistical functions and relevant correlation coefficients of clearness index
NASA Astrophysics Data System (ADS)
Pavanello, Diego; Zaaiman, Willem; Colli, Alessandra; Heiser, John; Smith, Scott
2015-08-01
This article presents a statistical analysis of sky conditions during the years 2010 to 2012 for three different locations: the Joint Research Centre site in Ispra (Italy, European Solar Test Installation - ESTI laboratories), the National Renewable Energy Laboratory site in Golden (Colorado, USA) and the Brookhaven National Laboratory site in Upton (New York, USA). The key parameter is the clearness index kT, a dimensionless expression of the global irradiance impinging upon a horizontal surface at a given instant of time. In the first part, the sky conditions are characterized using daily averages, giving a general overview of the three sites. In the second part, the analysis is performed using data sets with a short-term resolution of 1 sample per minute, demonstrating remarkable properties of the statistical distributions of the clearness index, reinforced by a proof using fuzzy logic methods. Subsequently, time-dependent correlations between different meteorological variables are presented in terms of Pearson and Spearman correlation coefficients, and a new coefficient is introduced.
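A minimal sketch of a clearness-index correlation analysis is shown below; the irradiance series, the correlated variable and the constants are synthetic placeholders, with Pearson and Spearman coefficients computed via scipy.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Synthetic minute-resolution global horizontal irradiance and the matching
# extraterrestrial irradiance on a horizontal plane (both W/m^2)
ghi = rng.uniform(50.0, 900.0, size=1440)
g_ext = np.clip(ghi * rng.normal(1.4, 0.2, size=1440), 100.0, None)

kt = ghi / g_ext                                          # clearness index, dimensionless
temperature = 15 + 10 * kt + rng.normal(0, 2, size=1440)  # a correlated variable

r_p, _ = stats.pearsonr(kt, temperature)        # linear correlation
r_s, _ = stats.spearmanr(kt, temperature)       # rank (monotonic) correlation
print(f"Pearson r = {r_p:.3f}, Spearman rho = {r_s:.3f}")
```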
Dynamical interpretation of conditional patterns
NASA Technical Reports Server (NTRS)
Adrian, R. J.; Moser, R. D.; Moin, P.
1988-01-01
While great progress is being made in characterizing the 3-D structure of organized turbulent motions using conditional averaging analysis, there is a lack of theoretical guidance regarding the interpretation and utilization of such information. Questions concerning the significance of the structures, their contributions to various transport properties, and their dynamics cannot be answered without recourse to appropriate dynamical governing equations. One approach which addresses some of these questions uses the conditional fields as initial conditions and calculates their evolution from the Navier-Stokes equations, yielding valuable information about stability, growth, and longevity of the mean structure. To interpret statistical aspects of the structures, a different type of theory which deals with the structures in the context of their contributions to the statistics of the flow is needed. As a first step toward this end, an effort was made to integrate the structural information from the study of organized structures with a suitable statistical theory. This is done by stochastically estimating the two-point conditional averages that appear in the equation for the one-point probability density function, and relating the structures to the conditional stresses. Salient features of the estimates are identified, and the structure of the one-point estimates in channel flow is defined.
Friedman, David B
2012-01-01
All quantitative proteomics experiments measure variation between samples. When performing large-scale experiments that involve multiple conditions or treatments, the experimental design should include the appropriate number of individual biological replicates from each condition to enable the distinction between a relevant biological signal from technical noise. Multivariate statistical analyses, such as principal component analysis (PCA), provide a global perspective on experimental variation, thereby enabling the assessment of whether the variation describes the expected biological signal or the unanticipated technical/biological noise inherent in the system. Examples will be shown from high-resolution multivariable DIGE experiments where PCA was instrumental in demonstrating biologically significant variation as well as sample outliers, fouled samples, and overriding technical variation that would not be readily observed using standard univariate tests.
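The kind of PCA overview described above can be sketched as follows, assuming a hypothetical replicate-by-protein intensity matrix; when a real biological signal is present, group structure should dominate the first components.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
# Hypothetical spot-intensity matrix: 12 biological replicates x 500 proteins,
# two groups of 6 with a shifted subset of proteins as the "biological signal"
X = rng.normal(0.0, 1.0, size=(12, 500))
X[6:, :40] += 1.5
groups = ["control"] * 6 + ["treated"] * 6

scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
for g, (pc1, pc2) in zip(groups, scores):
    print(f"{g:8s} PC1 = {pc1:6.2f}  PC2 = {pc2:6.2f}")
# Replicates of a group should cluster together; isolated points flag outliers
```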
NASA Astrophysics Data System (ADS)
Barré, Anthony; Suard, Frédéric; Gérard, Mathias; Montaru, Maxime; Riu, Delphine
2014-01-01
This paper describes the statistical analysis of data parameters recorded during electric vehicle use to characterize battery ageing. These data permit traditional battery ageing investigation based on the evolution of capacity fade and resistance rise. The measured variables are examined in order to explain the correlation between battery ageing and operating conditions during the experiments; such a study enables the main ageing factors to be identified. Detailed statistical dependency analyses then identify the factors responsible for battery ageing phenomena, and predictive battery ageing models are built from this approach. The results demonstrate and quantify relationships between operating variables and global battery ageing observations, and also enable accurate battery ageing diagnosis through predictive models.
Improved Test Planning and Analysis Through the Use of Advanced Statistical Methods
NASA Technical Reports Server (NTRS)
Green, Lawrence L.; Maxwell, Katherine A.; Glass, David E.; Vaughn, Wallace L.; Barger, Weston; Cook, Mylan
2016-01-01
The goal of this work is, through computational simulations, to provide statistically-based evidence to convince the testing community that a distributed testing approach is superior to a clustered testing approach for most situations. For clustered testing, numerous, repeated test points are acquired at a limited number of test conditions. For distributed testing, only one or a few test points are requested at many different conditions. The statistical techniques of Analysis of Variance (ANOVA), Design of Experiments (DOE) and Response Surface Methods (RSM) are applied to enable distributed test planning, data analysis and test augmentation. The D-Optimal class of DOE is used to plan an optimally efficient single- and multi-factor test. The resulting simulated test data are analyzed via ANOVA and a parametric model is constructed using RSM. Finally, ANOVA can be used to plan a second round of testing to augment the existing data set with new data points. The use of these techniques is demonstrated through several illustrative examples. To date, many thousands of comparisons have been performed and the results strongly support the conclusion that the distributed testing approach outperforms the clustered testing approach.
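A compressed sketch of the distributed-testing analysis chain (one observation per condition, a quadratic response-surface fit, then ANOVA on the fitted terms) is given below, with synthetic data and an assumed two-factor design.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

rng = np.random.default_rng(4)
# Distributed plan: a single observation at each of 30 different conditions
x1 = rng.uniform(-1, 1, 30)
x2 = rng.uniform(-1, 1, 30)
y = 2 + 1.5 * x1 - 0.8 * x2 + 1.2 * x1 * x2 + rng.normal(0, 0.3, 30)
df = pd.DataFrame({"x1": x1, "x2": x2, "y": y})

# Quadratic response-surface model, then ANOVA on the fitted terms
model = ols("y ~ x1 + x2 + x1:x2 + I(x1**2) + I(x2**2)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
print(model.params)
```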
Statistical analysis of landing contact conditions for three lifting body research vehicles
NASA Technical Reports Server (NTRS)
Larson, R. R.
1972-01-01
The landing contact conditions for the HL-10, M2-F2/F3, and the X-24A lifting body vehicles are analyzed statistically for 81 landings. The landing contact parameters analyzed are true airspeed, peak normal acceleration at the center of gravity, roll angle, and roll velocity. Ground measurement parameters analyzed are lateral and longitudinal distance from intended touchdown, lateral distance from touchdown to full stop, and rollout distance. The results are presented in the form of histograms for frequency distributions and cumulative frequency distribution probability curves with a Pearson Type 3 curve fit for extrapolation purposes.
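A Pearson Type III fit for extrapolation, as used above, can be sketched with scipy's pearson3 distribution; the touchdown airspeeds below are synthetic stand-ins for the 81 landings.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
# Synthetic touchdown true airspeeds (knots) standing in for the 81 landings
airspeed = 120.0 + rng.gamma(shape=9.0, scale=20.0, size=81)

# Fit a Pearson Type III distribution and extrapolate an exceedance level
skew, loc, scale = stats.pearson3.fit(airspeed)
p99 = stats.pearson3.ppf(0.99, skew, loc=loc, scale=scale)
print(f"fitted skew = {skew:.2f}, 99th-percentile speed ~ {p99:.1f} kt")

# Empirical cumulative frequencies, for plotting against the fitted curve
x = np.sort(airspeed)
ecdf = np.arange(1, x.size + 1) / x.size
```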
ERIC Educational Resources Information Center
Fouladi, Rachel T.
2000-01-01
Provides an overview of standard and modified normal theory and asymptotically distribution-free covariance and correlation structure analysis techniques and details Monte Carlo simulation results on Type I and Type II error control. Demonstrates through the simulation that robustness and nonrobustness of structure analysis techniques vary as a…
The slip resistance of common footwear materials measured with two slipmeters.
Chang, W R; Matz, S
2001-12-01
The slip resistance of 16 commonly used footwear materials was measured with the Brungraber Mark II and the English XL on 3 floor surfaces under dry, wet, oily and oily-wet surface conditions. Three samples were used for each material combination and surface condition. The results of a one-way ANOVA indicated that the differences among samples were statistically significant for a large number of material combinations and surface conditions. The results indicated that the ranking of materials based on their slip resistance values depends strongly on the slipmeter, floor surface and surface condition. For contaminated surfaces, including wet, oily and oily-wet surfaces, the slip resistance obtained with the English XL was usually higher than that measured with the Brungraber Mark II. The correlation coefficients between the slip resistance values obtained with the two slipmeters, calculated for different surface conditions, indicated a strong correlation with statistical significance.
NASA Technical Reports Server (NTRS)
1989-01-01
An assessment is made of quantitative methods and measures for trending launch commit criteria (LCC) performance. A statistical performance trending analysis pilot study was conducted and compared to STS-26 mission data. This study used four selected shuttle measurement types (solid rocket booster, external tank, space shuttle main engine, and range safety switch safe and arm device) from the five missions prior to mission 51-L. After obtaining raw data coordinates, each set of measurements was processed to obtain statistical confidence bounds and mean data profiles for each of the selected measurement types. STS-26 measurements were compared to the statistical database profiles to verify the statistical capability of detecting data trend anomalies and abnormal time-varying operational conditions associated with data amplitude and phase shifts.
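The mean-profile-plus-confidence-bounds idea can be sketched as below, where the five-mission history, the 3-sigma envelope, and the measurement values are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(6)
# Hypothetical history: 5 prior missions x 200 time samples of one measurement
history = 10.0 + 0.05 * rng.normal(0, 1, size=(5, 200)).cumsum(axis=1)
new_flight = history.mean(axis=0) + rng.normal(0, 0.4, size=200)

mean_profile = history.mean(axis=0)
sd_profile = history.std(axis=0, ddof=1)
upper = mean_profile + 3 * sd_profile      # simple 3-sigma confidence bounds
lower = mean_profile - 3 * sd_profile

anomalies = np.flatnonzero((new_flight > upper) | (new_flight < lower))
print(f"{anomalies.size} samples fall outside the historical 3-sigma envelope")
```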
Video pulse rate variability analysis in stationary and motion conditions.
Melchor Rodríguez, Angel; Ramos-Castro, J
2018-01-29
In the last few years, some studies have measured heart rate (HR) or heart rate variability (HRV) parameters using a video camera. This technique focuses on the measurement of the small changes in skin colour caused by blood perfusion. To date, most of these works have obtained HRV parameters in stationary conditions; practically no studies obtain these parameters in motion scenarios with an in-depth statistical analysis. In this study, a video pulse rate variability (PRV) analysis is conducted by measuring the pulse-to-pulse (PP) intervals in stationary and motion conditions. Firstly, given the importance of the sampling rate in a PRV analysis and the low frame rate of commercial cameras, we analysed two camera models to evaluate their performance in these measurements. We then propose a selective tracking method using the Viola-Jones and KLT algorithms, with the aim of carrying out a robust video PRV analysis in stationary and motion conditions. Data and results of the proposed method are contrasted with those reported in the state of the art. The webcam achieved the better results in the camera performance analysis. In stationary conditions, high correlation values, above 0.9, were obtained in PRV parameters. The PP time series achieved an RMSE (mean ± standard deviation) of 19.45 ± 5.52 ms (1.70 ± 0.75 bpm). In the motion analysis, most of the PRV parameters also achieved good correlation results, though with lower values than in stationary conditions; the PP time series presented an RMSE of 21.56 ± 6.41 ms (1.79 ± 0.63 bpm). The statistical analysis showed good agreement between the reference system and the proposed method. In stationary conditions, our method improved the results for PRV parameters in comparison with data reported in related works. An overall comparative analysis of PRV parameters in motion conditions was more limited owing to the scarcity of studies and of sufficient data analysis in those available. Based on the results, the proposed method could provide a low-cost, contactless and reliable alternative for measuring HR or PRV parameters in non-clinical environments.
Summary of Hydrologic Conditions in Georgia, 2008
Knaak, Andrew E.; Joiner, John K.; Peck, Michael F.
2009-01-01
The United States Geological Survey (USGS) Georgia Water Science Center (WSC) maintains a long-term hydrologic monitoring network of more than 290 real-time streamgages, more than 170 groundwater wells, and 10 lake and reservoir monitoring stations. One of the many benefits of data collected from this monitoring network is that analysis of the data provides an overview of the hydrologic conditions of rivers, creeks, reservoirs, and aquifers in Georgia. Hydrologic conditions are determined by statistical analysis of data collected during the current water year (WY) and comparison of the results to historical data collected at long-term stations. During the drought that persisted through 2008, the USGS succeeded in verifying and documenting numerous historic low-flow statistics at many streamgages and current water levels in aquifers, lakes, and reservoirs in Georgia. Streamflow data from the 2008 WY indicate that this drought is one of the most severe on record when compared to drought periods of 1950-1957, 1985-1989, and 1999-2002.
Nogueira, Leandro Alberto Calazans; Santos, Luciano Teixeira Dos; Sabino, Pollyane Galinari; Alvarenga, Regina Maria Papais; Thuler, Luiz Claudio Santos
2013-08-01
We analysed the cognitive influence on walking in multiple sclerosis (MS) patients in the absence of clinical disability. A case-control study was conducted with 12 MS patients with no disability and 12 matched healthy controls. Subjects completed a timed 10-m walk test and a 3D kinematic analysis. Participants were instructed to walk at a comfortable speed in a dual-task (arithmetic task) condition, and motor planning was measured by mental chronometry. Walking speed and cadence showed no statistically significant differences between the groups in the three conditions. The dual-task condition produced an increase in double-support duration in both groups. Motor imagery analysis showed statistically significant differences between real and imagined walking in patients. MS patients with no disability showed no influence of divided attention on walking execution; however, motor planning was overestimated compared with real walking.
NASA Astrophysics Data System (ADS)
Guo, Zhan; Yan, Xuefeng
2018-04-01
Different operating conditions of p-xylene oxidation have different influences on the product, purified terephthalic acid. It is necessary to obtain the optimal combination of reaction conditions to ensure the quality of the products, cut down on consumption and increase revenues. A multi-objective differential evolution (MODE) algorithm co-evolved with the population-based incremental learning (PBIL) algorithm, called PBMODE, is proposed. The PBMODE algorithm was designed as a co-evolutionary system. Each individual has its own parameter individual, which is co-evolved by PBIL. PBIL uses statistical analysis to build a model based on the corresponding symbiotic individuals of the superior original individuals during the main evolutionary process. The results of simulations and statistical analysis indicate that the overall performance of the PBMODE algorithm is better than that of the compared algorithms and it can be used to optimize the operating conditions of the p-xylene oxidation process effectively and efficiently.
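The PBIL half of the co-evolution can be illustrated on a toy objective as follows; the bit-string encoding, learning rate and fitness function are placeholders, and the differential-evolution half of PBMODE is omitted.

```python
import numpy as np

rng = np.random.default_rng(7)
n_bits, pop_size, lr = 20, 30, 0.1
p = np.full(n_bits, 0.5)                    # PBIL probability vector

def fitness(bits):
    return bits.sum()                       # toy objective: maximise the ones count

for generation in range(60):
    pop = (rng.random((pop_size, n_bits)) < p).astype(int)   # sample population
    best = pop[np.argmax([fitness(ind) for ind in pop])]     # pick the best sample
    p = (1 - lr) * p + lr * best            # shift the statistical model toward it

print("learned probabilities:", np.round(p, 2))
```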
Effective Thermal Inactivation of the Spores of Bacillus cereus Biofilms Using Microwave.
Park, Hyong Seok; Yang, Jungwoo; Choi, Hee Jung; Kim, Kyoung Heon
2017-07-28
Microwave sterilization was performed to inactivate the spores of biofilms of Bacillus cereus involved in foodborne illness. The sterilization conditions, such as the amount of water and the operating temperature and treatment time, were optimized using statistical analysis based on 15 runs of experimental results designed by the Box-Behnken method. Statistical analysis showed that the optimal conditions for the inactivation of B. cereus biofilms were 14 ml of water, 108°C of temperature, and 15 min of treatment time. Interestingly, response surface plots showed that the amount of water is the most important factor for microwave sterilization under the present conditions. Complete inactivation by microwaves was achieved in 5 min, and the inactivation efficiency by microwave was obviously higher than that by conventional steam autoclave. Finally, confocal laser scanning microscopy images showed that the principal effect of microwave treatment was cell membrane disruption. Thus, this study can contribute to the development of a process to control food-associated pathogens.
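A coded Box-Behnken design of the size used here (15 runs for 3 factors) can be generated directly; the factor names and level ranges in the sketch are hypothetical, not the study's actual settings.

```python
from itertools import combinations
import numpy as np

def box_behnken(n_factors=3, n_center=3):
    """Coded Box-Behnken design: +/-1 squares for each factor pair, others at 0."""
    runs = []
    for i, j in combinations(range(n_factors), 2):
        for a in (-1, 1):
            for b in (-1, 1):
                row = [0] * n_factors
                row[i], row[j] = a, b
                runs.append(row)
    runs += [[0] * n_factors] * n_center
    return np.array(runs)

design = box_behnken()   # 15 runs for 3 factors, matching a 15-run experiment
# Hypothetical mapping of coded levels -1/0/+1 to water (mL), temp (C), time (min)
levels = {"water": (10, 14, 18), "temp": (100, 108, 116), "time": (5, 10, 15)}
print(design)
```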
Effective Analysis of Reaction Time Data
ERIC Educational Resources Information Center
Whelan, Robert
2008-01-01
Most analyses of reaction time (RT) data are conducted by using the statistical techniques with which psychologists are most familiar, such as analysis of variance on the sample mean. Unfortunately, these methods are usually inappropriate for RT data, because they have little power to detect genuine differences in RT between conditions. In…
Samuel A. Cushman; Kevin S. McKelvey
2006-01-01
The primary weakness in our current ability to evaluate future landscapes in terms of wildlife lies in the lack of quantitative models linking wildlife to forest stand conditions, including fuels treatments. This project focuses on 1) developing statistical wildlife habitat relationships models (WHR) utilizing Forest Inventory and Analysis (FIA) and National Vegetation...
Fatima, Nikhat; Khan, Aleem A.; Vishwakarma, Sandeep K.
2017-01-01
Background: Growing evidence shows that dental pulp (DP) tissues could be a potential source of adult stem cells for the treatment of devastating neurological diseases and several other conditions. Aims: Exploration of the expression profile of several key molecular markers to evaluate the molecular dynamics in undifferentiated and differentiated DP-derived stem cells (DPSCs) in vitro. Settings and Design: The characteristics and multilineage differentiation ability of DPSCs were determined by cellular and molecular kinetics. DPSCs were further induced to form adherent (ADH) and non-ADH (NADH) neurospheres under serum-free conditions, which were then induced into neurogenic lineage cells and characterized for their molecular and cellular diversity at each stage. Statistical Analysis Used: One-way analysis of variance, Student's t-test, the Livak method for relative quantification, and R programming. Results: Immunophenotypic analysis of DPSCs revealed >80% of cells positive for the mesenchymal markers CD90 and CD105, >70% positive for the transferrin receptor (CD71), and >30% for the chemotactic factor (CXCR3). These cells also showed mesodermal differentiation, confirmed by specific staining and molecular analysis. Activation of neuronal lineage markers and neurogenic growth factors was observed during lineage differentiation of cells derived from NADH and ADH spheroids. Greater than 80% of cells were found to express β-tubulin III in both differentiation conditions. Conclusions: The present study reports a cascade of immunophenotypic and molecular markers to characterize neurogenic differentiation of DPSCs under serum-free conditions. These findings motivate future analyses of the clinical applicability of DP-derived cells in regenerative applications. PMID:28566856
NASA Technical Reports Server (NTRS)
Welker, J.
1981-01-01
A histogram analysis of average monthly precipitation over 30- and 84-year periods for both Maryland and Kansas was made and the results compared. A second analysis, a statistical assessment of the effect of average monthly precipitation on Kansas winter wheat yield, was also made. The data sets covered the three periods 1941-1970, 1887-1970, and 1887-1921. Analyses of the limited data sets used (only average monthly precipitation and temperature were correlated against yield) indicated that fall precipitation values, especially those of September and October, were more important to winter wheat yield than spring values, particularly for the period 1941-1970.
System Analysis for the Huntsville Operation Support Center, Distributed Computer System
NASA Technical Reports Server (NTRS)
Ingels, F. M.; Massey, D.
1985-01-01
HOSC, as a distributed computing system, is responsible for data acquisition and analysis during Space Shuttle operations. HOSC also provides computing services for Marshall Space Flight Center's nonmission activities. As mission and nonmission activities change, so do the support functions of HOSC, demonstrating the need for a method of simulating activity at HOSC in various configurations. The simulation developed in this work primarily models the HYPERchannel network. The model simulates the activity of a steady-state network, reporting statistics such as transmitted bits, collision statistics, frame sequences transmitted, and average message delay. These statistics are used to evaluate performance indicators such as throughput, utilization, and delay. Thus the overall performance of the network is evaluated, and possible overload conditions are predicted.
Automated Box-Cox Transformations for Improved Visual Encoding.
Maciejewski, Ross; Pattath, Avin; Ko, Sungahn; Hafen, Ryan; Cleveland, William S; Ebert, David S
2013-01-01
The concept of preconditioning data (utilizing a power transformation as an initial step) for analysis and visualization is well established within the statistical community and is employed as part of statistical modeling and analysis. Such transformations condition the data to various inherent assumptions of statistical inference procedures, as well as making the data more symmetric and easier to visualize and interpret. In this paper, we explore the use of the Box-Cox family of power transformations to semiautomatically adjust visual parameters. We focus on time-series scaling, axis transformations, and color binning for choropleth maps. We illustrate the usage of this transformation through various examples, and discuss the value and some issues in semiautomatically using these transformations for more effective data visualization.
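A minimal example of the maximum-likelihood Box-Cox step that such preconditioning relies on, using scipy and synthetic right-skewed data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
skewed = rng.lognormal(mean=0.0, sigma=0.8, size=1000)   # right-skewed data

# Maximum-likelihood Box-Cox: returns the transformed data and fitted lambda
transformed, lmbda = stats.boxcox(skewed)
print(f"lambda = {lmbda:.3f}")
print(f"skewness before = {stats.skew(skewed):.2f}, after = {stats.skew(transformed):.2f}")
# lambda near 0 corresponds to a log transform; lambda = 1 leaves the shape alone
```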
ERIC Educational Resources Information Center
Serebryakova, Tat'yana A.; Morozova, Lyudmila B.; Kochneva, Elena M.; Zharova, Darya V.; Kostyleva, Elena A.; Kolarkova, Oxana G.
2016-01-01
Background/Objectives: The objective of the paper is analysis and description of findings of an empiric study on the issue of social and psychological adaptation of first year students to studying in a higher educational institution. Methods/Statistical analysis: Using the methods of theoretical analysis the paper's authors plan and carry out an…
Dale D. Gormanson; Scott A. Pugh; Charles J. Barnett; Patrick D. Miles; Randall S. Morin; Paul A. Sowers; Jim Westfall
2017-01-01
The U.S. Forest Service Forest Inventory and Analysis (FIA) program collects sample plot data on all forest ownerships across the United States. FIA's primary objective is to determine the extent, condition, volume, growth, and use of trees on the Nation's forest land through a comprehensive inventory and analysis of the Nation's forest resources. The...
Modelling short time series in metabolomics: a functional data analysis approach.
Montana, Giovanni; Berk, Maurice; Ebbels, Tim
2011-01-01
Metabolomics is the study of the complement of small molecule metabolites in cells, biofluids and tissues. Many metabolomic experiments are designed to compare changes observed over time under two or more experimental conditions (e.g. a control and drug-treated group), thus producing time course data. Models from traditional time series analysis are often unsuitable because, by design, only very few time points are available and there are a high number of missing values. We propose a functional data analysis approach for modelling short time series arising in metabolomic studies which overcomes these obstacles. Our model assumes that each observed time series is a smooth random curve, and we propose a statistical approach for inferring this curve from repeated measurements taken on the experimental units. A test statistic for detecting differences between temporal profiles associated with two experimental conditions is then presented. The methodology has been applied to NMR spectroscopy data collected in a pre-clinical toxicology study.
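A bare-bones version of this functional approach, with smoothing splines standing in for the paper's random-curve model and an integrated squared difference as the test statistic, might look like this (all data synthetic):

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(9)
t = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0])   # few, unevenly spaced time points

def smooth_mean_curve(replicates):
    """Fit one smoothing spline to the pooled replicate measurements."""
    tt = np.tile(t, len(replicates))
    yy = np.concatenate(replicates)
    order = np.argsort(tt, kind="stable")       # spline needs non-decreasing x
    return UnivariateSpline(tt[order], yy[order], k=3, s=float(yy.size))

control = [np.sin(t) + rng.normal(0, 0.2, t.size) for _ in range(5)]
treated = [np.sin(t) + 0.6 * t / 8 + rng.normal(0, 0.2, t.size) for _ in range(5)]

grid = np.linspace(0, 8, 200)
gap = smooth_mean_curve(treated)(grid) - smooth_mean_curve(control)(grid)
stat = np.trapz(gap ** 2, grid)   # integrated squared difference between curves
print(f"test statistic (L2 distance) = {stat:.3f}")
```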
Specialized data analysis of SSME and advanced propulsion system vibration measurements
NASA Technical Reports Server (NTRS)
Coffin, Thomas; Swanson, Wayne L.; Jong, Yen-Yi
1993-01-01
The basic objectives of this contract were to perform detailed analysis and evaluation of dynamic data obtained during Space Shuttle Main Engine (SSME) test and flight operations, including analytical/statistical assessment of component dynamic performance, and to continue the development and implementation of analytical/statistical models to effectively define nominal component dynamic characteristics, detect anomalous behavior, and assess machinery operational conditions. The study was intended to provide timely assessment of engine component operational status, identify probable causes of malfunction, and define feasible engineering solutions. The work was performed under three broad tasks: (1) Analysis, Evaluation, and Documentation of SSME Dynamic Test Results; (2) Database and Analytical Model Development and Application; and (3) Development and Application of Vibration Signature Analysis Techniques.
Richard. D. Wood-Smith; John M. Buffington
1996-01-01
Multivariate statistical analyses of geomorphic variables from 23 forest stream reaches in southeast Alaska result in successful discrimination between pristine streams and those disturbed by land management, specifically timber harvesting and associated road building. Results of discriminant function analysis indicate that a three-variable model discriminates 10...
Reproducibility-optimized test statistic for ranking genes in microarray studies.
Elo, Laura L; Filén, Sanna; Lahesmaa, Riitta; Aittokallio, Tero
2008-01-01
A principal goal of microarray studies is to identify the genes showing differential expression under distinct conditions. In such studies, the selection of an optimal test statistic is a crucial challenge, which depends on the type and amount of data under analysis. While previous studies on simulated or spike-in datasets do not provide practical guidance on how to choose the best method for a given real dataset, we introduce an enhanced reproducibility-optimization procedure, which enables the selection of a suitable gene-ranking statistic directly from the data. In comparison with existing ranking methods, the reproducibility-optimized statistic shows consistently good performance under various simulated conditions and on the Affymetrix spike-in dataset. Further, the feasibility of the novel statistic is confirmed in a practical research setting using data from an in-house cDNA microarray study of asthma-related gene expression changes. These results suggest that the procedure facilitates the selection of an appropriate test statistic for a given dataset without relying on a priori assumptions, which may bias the findings and their interpretation. Moreover, the general reproducibility-optimization procedure is not limited to detecting differential expression and could be extended to a wide range of other applications.
NASA Technical Reports Server (NTRS)
Perry, Boyd, III; Pototzky, Anthony S.; Woods, Jessica A.
1989-01-01
The results of a NASA investigation of a claimed overlap between two gust response analysis methods, the Statistical Discrete Gust (SDG) method and the Power Spectral Density (PSD) method, are presented. The claim is that the ratio of an SDG response to the corresponding PSD response is 10.4. Analytical results presented for several different airplanes at several different flight conditions indicate that such an overlap does appear to exist. However, the claim was not met precisely: a scatter of up to about 10 percent about the 10.4 factor can be expected.
Yan, Binjun; Fang, Zhonghua; Shen, Lijuan; Qu, Haibin
2015-01-01
The batch-to-batch quality consistency of herbal drugs has always been an important issue. This work proposes a methodology for batch-to-batch quality control based on HPLC-MS fingerprints and a process knowledge base, taking the extraction process of Compound E-jiao Oral Liquid as a case study. After establishing the HPLC-MS fingerprint analysis method, the fingerprints of extract solutions produced under normal and abnormal operating conditions were obtained. Multivariate statistical models were built for fault detection, and a discriminant analysis model was built using the probabilistic discriminant partial-least-squares method for fault diagnosis. Based on multivariate statistical analysis, process knowledge was acquired and the cause-effect relationship between process deviations and quality defects was revealed. The quality defects were detected successfully by multivariate statistical control charts, and the types of process deviations were diagnosed correctly by discriminant analysis. This work demonstrates the benefits of combining HPLC-MS fingerprints, process knowledge and multivariate analysis for the quality control of herbal drugs. Copyright © 2015 John Wiley & Sons, Ltd.
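A common form of such a multivariate control chart is a PCA-based Hotelling T² statistic; the sketch below uses hypothetical fingerprint peak areas and is not the paper's specific model.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(10)
# Hypothetical fingerprint peak areas: 30 normal batches x 12 marker peaks
normal = rng.normal(100.0, 5.0, size=(30, 12))
mu, sd = normal.mean(axis=0), normal.std(axis=0, ddof=1)

pca = PCA(n_components=3).fit((normal - mu) / sd)

def hotelling_t2(batch):
    """Hotelling T^2 of one batch in the PCA score space of normal batches."""
    scores = pca.transform(((batch - mu) / sd).reshape(1, -1))[0]
    return float(np.sum(scores ** 2 / pca.explained_variance_))

suspect = normal[0].copy()
suspect[3] *= 1.4                  # simulate an abnormal extraction on one peak
print(f"normal batch T2 = {hotelling_t2(normal[1]):.1f}, "
      f"faulty batch T2 = {hotelling_t2(suspect):.1f}")
```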
NASA Technical Reports Server (NTRS)
Parrish, R. V.; Mckissick, B. T.; Steinmetz, G. G.
1979-01-01
A recent modification of profile analysis methodology, which allows testing for differences between two functions as a whole with a single test rather than point by point with multiple tests, is discussed. The modification is applied to the motion/no-motion question as reflected in the lateral deviation curve, expressed as a function of engine-cut speed, of a piloted 737-100 simulator. The results of this application are presented along with those of more conventional statistical test procedures on the same simulator data.
Dall'Asta, Andrea; Schievano, Silvia; Bruse, Jan L; Paramasivam, Gowrishankar; Kaihura, Christine Tita; Dunaway, David; Lees, Christoph C
2017-07-01
The antenatal detection of facial dysmorphism using 3-dimensional ultrasound may raise the suspicion of an underlying genetic condition but infrequently leads to a definitive antenatal diagnosis. Despite advances in array and noninvasive prenatal testing, not all genetic conditions can be ascertained from such testing. The aim of this study was to investigate the feasibility of quantitative assessment of fetal face features using prenatal 3-dimensional ultrasound volumes and statistical shape modeling. STUDY DESIGN: Thirteen normal and 7 abnormal stored 3-dimensional ultrasound fetal face volumes were analyzed, at a median gestation of 29+4 weeks (range 25+0 to 36+1). The 20 3-dimensional surface meshes generated were aligned and served as input for a statistical shape model, which computed the mean 3-dimensional face shape and 3-dimensional shape variations using principal component analysis. Ten shape modes explained more than 90% of the total shape variability in the population. While the first mode accounted for overall size differences, the second highlighted shape feature changes from an overall proportionate toward a more asymmetric face shape with a wide prominent forehead and an undersized, posteriorly positioned chin. Analysis of the Mahalanobis distance in principal component analysis shape space suggested differences between normal and abnormal fetuses (median ± interquartile range distance values, 7.31 ± 5.54 for the normal group vs 13.27 ± 9.82 for the abnormal group) (P = .056). This feasibility study demonstrates that objective characterization and quantification of fetal facial morphology is possible from 3-dimensional ultrasound. This technique has the potential to assist in utero diagnosis, particularly of rare conditions in which facial dysmorphology is a feature. Copyright © 2017 Elsevier Inc. All rights reserved.
Pantyley, Viktoriya
2014-01-01
Under the new conditions of socio-economic development in Ukraine, children's health is considered one of the most reliable indicators of the country's socio-economic development. The primary goal of the study was analysis of the effect of contemporary socio-economic transformations, their scope, and their strength of effect on the demographic and social situation of children in various regions of Ukraine. The methodological objectives of the study were as follows: development of a synthetic measure of the state of health of the child population, based on Hellwig's method, and classification of districts in Ukraine according to the present health-demographic situation of children. The study was based on statistical data from the State Statistics Service of Ukraine, the Centre of Medical Statistics in Kiev, the Ukrainian Ministry of Defence, and the Ministry of Education and Science, Youth and Sports of Ukraine. The following research methods were used: analysis of literature and Internet sources, selection and analysis of statistical materials, and cartographic and statistical methods. Basic indices of the demographic and health situation of the child population were analyzed, as well as socio-economic factors that affect this situation. A set of variables was developed for the synthetic evaluation of the state of health of the child population, and a typology of the Ukrainian districts was constructed according to the state of child health, based on Hellwig's taxonomic method. Deterioration of selected quality parameters was observed, as well as changes in the strength and direction of the effects of organizational-institutional, socio-economic, historical, and cultural factors on the child population's potential.
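Hellwig's synthetic measure referred to above has a compact standard construction (standardize indicators, form an ideal "pattern" unit, score by distance); the sketch below applies it to hypothetical district-level data.

```python
import numpy as np

rng = np.random.default_rng(11)
# Hypothetical district-level health indicators (rows = districts), already
# oriented so that higher values are better ("stimulants")
X = rng.normal(0.0, 1.0, size=(25, 6))

Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)   # standardise the indicators
pattern = Z.max(axis=0)                            # Hellwig's ideal "pattern" unit
d = np.linalg.norm(Z - pattern, axis=1)            # distance of each district
d0 = d.mean() + 2 * d.std(ddof=1)                  # normalising constant
s = 1 - d / d0                                     # synthetic measure, roughly [0, 1]

print(f"best district score = {s.max():.3f}, worst = {s.min():.3f}")
```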
Ing, Alex; Schwarzbauer, Christian
2014-01-01
Functional connectivity has become an increasingly important area of research in recent years. At a typical spatial resolution, approximately 300 million connections link each voxel in the brain with every other. This pattern of connectivity is known as the functional connectome. Connectivity is often compared between experimental groups and conditions. Standard methods used to control the type 1 error rate are likely to be insensitive when comparisons are carried out across the whole connectome, due to the huge number of statistical tests involved. To address this problem, two new cluster based methods – the cluster size statistic (CSS) and cluster mass statistic (CMS) – are introduced to control the family wise error rate across all connectivity values. These methods operate within a statistical framework similar to the cluster based methods used in conventional task based fMRI. Both methods are data driven, permutation based and require minimal statistical assumptions. Here, the performance of each procedure is evaluated in a receiver operator characteristic (ROC) analysis, utilising a simulated dataset. The relative sensitivity of each method is also tested on real data: BOLD (blood oxygen level dependent) fMRI scans were carried out on twelve subjects under normal conditions and during the hypercapnic state (induced through the inhalation of 6% CO2 in 21% O2 and 73% N2). Both CSS and CMS detected significant changes in connectivity between normal and hypercapnic states. A family wise error correction carried out at the individual connection level exhibited no significant changes in connectivity. PMID:24906136
Is it possible to shorten examination time in posture control studies?
Faraldo García, Ana; Soto Varela, Andrés; Santos Pérez, Sofía
2015-01-01
The sensory organization test (SOT) is the gold-standard test for the study of postural control with posturographic platforms. Three recordings each of Conditions 3, 4, 5 and 6 are conducted and their arithmetic mean taken, with the time cost that this entails. The aim of this study was to determine whether a single recording for each SOT condition would give the same information as the arithmetic mean of the 3 recordings used until now. The study included 100 healthy individuals who performed a sensory organization test on the NeuroCom Smart Balance Master® platform. For the statistical analysis we used the Wilcoxon test for nonparametric variables and Student's t-test for paired samples for parametric variables (P<.05). When comparing the scores on the first recording with the average of the 3 recordings, we found statistically significant differences for the 4 conditions (P<.05). Comparing the first recording to the second also yielded statistically significant differences in the 4 conditions (P<.05). Upon comparing the second recording with the third, however, we found differences only in Condition 5, with borderline significance (P=.04). Finally, comparing the average of the first and second recordings with the average of the 3 recordings, we again found statistically significant differences for the 4 conditions (P<.05). Using only 1 or 2 recordings of each SOT condition therefore does not give the same information as the arithmetic average of the 3 recordings used until now. Copyright © 2014 Elsevier España, S.L.U. and Sociedad Española de Otorrinolaringología y Patología Cérvico-Facial. All rights reserved.
Steinka-Fry, Katarzyna T; Tanner-Smith, Emily E; Dakof, Gayle A; Henderson, Craig
2017-04-01
This systematic review and meta-analysis synthesized findings from studies examining culturally sensitive substance use treatment for racial/ethnic minority youth. An extensive literature search located eight eligible studies using experimental or quasi-experimental designs. The meta-analysis quantitatively synthesized findings comparing seven culturally sensitive treatment conditions to seven alternative conditions on samples composed of at least 90% racial/ethnic minority youth. The results indicated that culturally sensitive treatments were associated with significantly larger reductions in post-treatment substance use levels relative to their comparison conditions (g = 0.37, 95% CI [0.12, 0.62], k = 7, total N = 723). The average time between pretest and posttest was 21 weeks (SD = 11.79). There was a statistically significant amount of heterogeneity across the seven studies (Q = 26.5, p = 0.00, τ² = 0.08, I² = 77.4%). Differential effects were not statistically significant when the contrasts were active generic counterparts of the treatment conditions (direct "bona fide" comparisons; g = -0.08, 95% CI [-0.51, 0.35]) or 'treatment as usual' conditions (g = 0.39, 95% CI [-0.14, 0.91]). Strong conclusions were hindered by the small number of available studies, variability in comparison conditions across studies, and lack of diversity in the adolescent clients served. Nonetheless, this review suggests that culturally sensitive treatments offer promise as an effective way to address substance use among racial/ethnic minority youth. Copyright © 2017 Elsevier Inc. All rights reserved.
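Random-effects pooling of the kind reported here is commonly done with the DerSimonian-Laird estimator; the sketch below reproduces the mechanics on invented effect sizes, not the review's data.

```python
import numpy as np

# Invented effect sizes g and variances for 7 studies (not the review's data)
g = np.array([0.55, 0.10, 0.42, 0.71, 0.05, 0.60, 0.33])
v = np.array([0.04, 0.03, 0.05, 0.06, 0.02, 0.05, 0.04])

w = 1 / v                                           # fixed-effect weights
g_fixed = np.sum(w * g) / w.sum()
q = np.sum(w * (g - g_fixed) ** 2)                  # Cochran's Q
df = g.size - 1
tau2 = max(0.0, (q - df) / (w.sum() - np.sum(w ** 2) / w.sum()))
i2 = max(0.0, (q - df) / q) * 100

w_re = 1 / (v + tau2)                               # random-effects weights
g_pooled = np.sum(w_re * g) / w_re.sum()
se = np.sqrt(1 / w_re.sum())
print(f"g = {g_pooled:.2f} (95% CI {g_pooled - 1.96 * se:.2f} to "
      f"{g_pooled + 1.96 * se:.2f}), tau^2 = {tau2:.2f}, I^2 = {i2:.0f}%")
```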
Statistical modelling of networked human-automation performance using working memory capacity.
Ahmed, Nisar; de Visser, Ewart; Shaw, Tyler; Mohamed-Ameen, Amira; Campbell, Mark; Parasuraman, Raja
2014-01-01
This study examines the challenging problem of modelling the interaction between individual attentional limitations and decision-making performance in networked human-automation system tasks. Analysis of real experimental data from a task involving networked supervision of multiple unmanned aerial vehicles by human participants shows that both task load and network message quality affect performance, but that these effects are modulated by individual differences in working memory (WM) capacity. These insights were used to assess three statistical approaches for modelling and making predictions with real experimental networked supervisory performance data: classical linear regression, non-parametric Gaussian processes and probabilistic Bayesian networks. It is shown that each of these approaches can help designers of networked human-automated systems cope with various uncertainties in order to accommodate future users by linking expected operating conditions and performance from real experimental data to observable cognitive traits like WM capacity. Practitioner Summary: Working memory (WM) capacity helps account for inter-individual variability in operator performance in networked unmanned aerial vehicle supervisory tasks. This is useful for reliable performance prediction near experimental conditions via linear models; robust statistical prediction beyond experimental conditions via Gaussian process models and probabilistic inference about unknown task conditions/WM capacities via Bayesian network models.
Variable system: An alternative approach for the analysis of mediated moderation.
Kwan, Joyce Lok Yin; Chan, Wai
2018-06-01
Mediated moderation (meMO) occurs when the moderation effect of the moderator (W) on the relationship between the independent variable (X) and the dependent variable (Y) is transmitted through a mediator (M). To examine this process empirically, 2 different model specifications (Type I meMO and Type II meMO) have been proposed in the literature. However, both specifications are found to be problematic, either conceptually or statistically. For example, it can be shown that each type of meMO model is statistically equivalent to a particular form of moderated mediation (moME), another process that examines the condition when the indirect effect from X to Y through M varies as a function of W. Consequently, it is difficult for one to differentiate these 2 processes mathematically. This study therefore has 2 objectives. First, we attempt to differentiate moME and meMO by proposing an alternative specification for meMO. Conceptually, this alternative specification is intuitively meaningful and interpretable, and, statistically, it offers meMO a unique representation that is no longer identical to its moME counterpart. Second, using structural equation modeling, we propose an integrated approach for the analysis of meMO as well as for other general types of conditional path models. VS, a computer software program that implements the proposed approach, has been developed to facilitate the analysis of conditional path models for applied researchers. Real examples are considered to illustrate how the proposed approach works in practice and to compare its performance against the traditional methods. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
Reply to discussion: ground water response to forest harvest: implications for hillslope stability
Amod Dhakal; Roy C. Sidle; A.C. Johnson; R.T. Edwards
2008-01-01
Dhakal and Sidle (this volume) have requested clarification of some of the rationales and approaches used in the analyses described by Johnson et al. (2007). Here we further describe hydrologic conditions typical of southeast Alaska and elaborate on an accepted methodology for conducting analysis of covariance (ANCOVA). We discuss Dhakal and Sidle...
Analysis of high-resolution foreign exchange data of USD-JPY for 13 years
NASA Astrophysics Data System (ADS)
Mizuno, Takayuki; Kurihara, Shoko; Takayasu, Misako; Takayasu, Hideki
2003-06-01
We analyze high-resolution foreign exchange data consisting of 20 million data points of USD-JPY over 13 years to report firm statistical laws in the distributions and correlations of exchange rate fluctuations. A conditional probability density analysis clearly shows the existence of trend-following movements at a time scale of 8 ticks, about 1 min.
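A conditional probability density analysis of this kind can be sketched by binning one coarse-grained move and averaging the next; the tick series below is synthetic and uncorrelated, so no trend-following structure should appear.

```python
import numpy as np

rng = np.random.default_rng(12)
# Synthetic heavy-tailed tick-by-tick price changes standing in for USD-JPY
dx = 0.01 * rng.standard_t(df=4, size=200_000)

step = 8                                            # coarse-grain to 8-tick moves
moves = dx[: dx.size // step * step].reshape(-1, step).sum(axis=1)
prev, nxt = moves[:-1], moves[1:]

# Conditional mean of the next move given the decile of the previous move
edges = np.quantile(prev, np.linspace(0, 1, 11))
idx = np.clip(np.digitize(prev, edges) - 1, 0, 9)
cond_mean = np.array([nxt[idx == b].mean() for b in range(10)])
print(np.round(cond_mean, 5))   # an upward slope would indicate trend following
```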
Ferreira, Camila Lopes; da Rocha, Vinicius Clemente; da Silva Ursi, Weber José; De Marco, Andrea Carvalho; Santamaria, Milton; Santamaria, Mauro Pedrine; Jardini, Maria Aparecida Neves
2018-03-01
Systemic conditions can influence orthodontic tooth movement. This study evaluates histologic periodontal responses to orthodontic tooth movement in diabetes-induced rats with or without periodontal disease. Forty Wistar rats were divided according to their systemic condition (SC) into diabetic (D) and non-diabetic (ND) groups. Each group was subdivided into control (C), orthodontic tooth movement (OM), ligature-induced periodontitis (P) and ligature-induced periodontitis with orthodontic movement (P+OM) groups. Diabetes mellitus (DM) was induced with alloxan monohydrate, and after 30 days, the P group received a cotton ligature around the first lower molar crown. An orthodontic device was placed in the OM and P+OM groups for 7 days, and the animals were then euthanized. Differences in OM between the D and ND groups were not significant (6.87 ± 3.55 mm and 6.81 ± 3.28 mm, respectively), but intragroup analysis revealed statistically significant differences between the P+OM groups for both SCs. Bone loss was greater in the D group (0.16 ± 0.07 mm²) than in the ND group (0.10 ± 0.03 mm²). In intragroup analysis of the D condition, the P+OM group differed statistically from the other groups, while in the ND condition, the P+OM group differed from the C and OM groups. There was a statistically significant difference in bone density between the D and ND conditions (18.03 ± 8.09% and 22.53 ± 7.72%) in the C, P, and P+OM groups. DM has deleterious effects on bone density and bone loss in the furcation region, and these effects are maximized when associated with ligature-induced periodontitis with orthodontic movement. © 2018 American Academy of Periodontology.
The need for conducting forensic analysis of decommissioned bridges.
DOT National Transportation Integrated Search
2014-01-01
A limiting factor in current bridge management programs is a lack of detailed knowledge of bridge deterioration : mechanisms and processes. The current state of the art is to predict future condition using statistical forecasting : models based upon ...
12 & 15 passenger vans tire pressure study : preliminary results
DOT National Transportation Integrated Search
2005-05-01
A study was conducted by the National Highway Traffic Safety Administration's (NHTSA's) National Center for Statistics and Analysis (NCSA) to determine the extent of underinflation and observe the tire condition in 12- and 15-passenger vans. This Res...
Analysis of conditional genetic effects and variance components in developmental genetics.
Zhu, J
1995-12-01
A genetic model with additive-dominance effects and genotype x environment interactions is presented for quantitative traits with time-dependent measures. The genetic model for phenotypic means at time t conditional on phenotypic means measured at previous time (t-1) is defined. Statistical methods are proposed for analyzing conditional genetic effects and conditional genetic variance components. Conditional variances can be estimated by minimum norm quadratic unbiased estimation (MINQUE) method. An adjusted unbiased prediction (AUP) procedure is suggested for predicting conditional genetic effects. A worked example from cotton fruiting data is given for comparison of unconditional and conditional genetic variances and additive effects.
Analysis of Conditional Genetic Effects and Variance Components in Developmental Genetics
Zhu, J.
1995-01-01
A genetic model with additive-dominance effects and genotype X environment interactions is presented for quantitative traits with time-dependent measures. The genetic model for phenotypic means at time t conditional on phenotypic means measured at previous time (t - 1) is defined. Statistical methods are proposed for analyzing conditional genetic effects and conditional genetic variance components. Conditional variances can be estimated by minimum norm quadratic unbiased estimation (MINQUE) method. An adjusted unbiased prediction (AUP) procedure is suggested for predicting conditional genetic effects. A worked example from cotton fruiting data is given for comparison of unconditional and conditional genetic variances and additive effects. PMID:8601500
EEG source analysis of data from paralysed subjects
NASA Astrophysics Data System (ADS)
Carabali, Carmen A.; Willoughby, John O.; Fitzgibbon, Sean P.; Grummett, Tyler; Lewis, Trent; DeLosAngeles, Dylan; Pope, Kenneth J.
2015-12-01
One of the limitations of electroencephalography (EEG) data is its quality, as it is usually contaminated with electrical signal from muscle. This research studies the results of two EEG source analysis methods applied to scalp recordings taken under paralysis and under normal conditions during the performance of a cognitive task. The aim is to determine which types of analysis are appropriate for dealing with EEG data containing myogenic components. The data used are the scalp recordings of six subjects in normal conditions and during paralysis while performing different cognitive tasks, including the oddball task that is the object of this research. The data were pre-processed by filtering and artefact correction; then, one-second epochs for targets and distractors were extracted. Distributed source analysis was performed in BESA Research 6.0; using its results and information from the literature, 9 ideal locations for source dipoles were identified. The nine dipoles were used to perform discrete source analysis, fitting them to the averaged epochs to obtain source waveforms. The results were statistically analysed by comparing the outcomes before and after the subjects were paralysed. Finally, frequency analysis was performed to better explain the results. The findings were that distributed source analysis can produce confounded results for EEG contaminated with myogenic signals; conversely, statistical analysis of the results from discrete source analysis showed that this method can help in dealing with EEG data contaminated with muscle electrical signal.
Testing Genetic Pleiotropy with GWAS Summary Statistics for Marginal and Conditional Analyses.
Deng, Yangqing; Pan, Wei
2017-12-01
There is growing interest in testing genetic pleiotropy, which is when a single genetic variant influences multiple traits. Several methods have been proposed; however, these methods have some limitations. First, all the proposed methods are based on the use of individual-level genotype and phenotype data; in contrast, for logistical, and other, reasons, summary statistics of univariate SNP-trait associations are typically only available based on meta- or mega-analyzed large genome-wide association study (GWAS) data. Second, existing tests are based on marginal pleiotropy, which cannot distinguish between direct and indirect associations of a single genetic variant with multiple traits due to correlations among the traits. Hence, it is useful to consider conditional analysis, in which a subset of traits is adjusted for another subset of traits. For example, in spite of substantial lowering of low-density lipoprotein cholesterol (LDL) with statin therapy, some patients still maintain high residual cardiovascular risk, and, for these patients, it might be helpful to reduce their triglyceride (TG) level. For this purpose, in order to identify new therapeutic targets, it would be useful to identify genetic variants with pleiotropic effects on LDL and TG after adjusting the latter for LDL; otherwise, a pleiotropic effect of a genetic variant detected by a marginal model could simply be due to its association with LDL only, given the well-known correlation between the two types of lipids. Here, we develop a new pleiotropy testing procedure based only on GWAS summary statistics that can be applied for both marginal analysis and conditional analysis. Although the main technical development is based on published union-intersection testing methods, care is needed in specifying conditional models to avoid invalid statistical estimation and inference. In addition to the previously used likelihood ratio test, we also propose using generalized estimating equations under the working independence model for robust inference. We provide numerical examples based on both simulated and real data, including two large lipid GWAS summary association datasets based on ∼100,000 and ∼189,000 samples, respectively, to demonstrate the difference between marginal and conditional analyses, as well as the effectiveness of our new approach. Copyright © 2017 by the Genetics Society of America.
INFORMATION: THEORY, BRAIN, AND BEHAVIOR
Jensen, Greg; Ward, Ryan D.; Balsam, Peter D.
2016-01-01
In the 65 years since its formal specification, information theory has become an established statistical paradigm, providing powerful tools for quantifying probabilistic relationships. Behavior analysis has begun to adopt these tools as a novel means of measuring the interrelations between behavior, stimuli, and contingent outcomes. This approach holds great promise for making more precise determinations about the causes of behavior and the forms in which conditioning may be encoded by organisms. In addition to providing an introduction to the basics of information theory, we review some of the ways that information theory has informed the studies of Pavlovian conditioning, operant conditioning, and behavioral neuroscience. In addition to enriching each of these empirical domains, information theory has the potential to act as a common statistical framework by which results from different domains may be integrated, compared, and ultimately unified. PMID:24122456
Fathiazar, Elham; Anemuller, Jorn; Kretzberg, Jutta
2016-08-01
Voltage-Sensitive Dye (VSD) imaging is an optical imaging method that allows measuring the graded voltage changes of multiple neurons simultaneously. In neuroscience, this method is used to reveal networks of neurons involved in certain tasks. However, the recorded relative dye fluorescence changes are usually low and signals are superimposed by noise and artifacts. Therefore, establishing a reliable method to identify which cells are activated by specific stimulus conditions is the first step to identify functional networks. In this paper, we present a statistical method to identify stimulus-activated network nodes as cells, whose activities during sensory network stimulation differ significantly from the un-stimulated control condition. This method is demonstrated based on voltage-sensitive dye recordings from up to 100 neurons in a ganglion of the medicinal leech responding to tactile skin stimulation. Without relying on any prior physiological knowledge, the network nodes identified by our statistical analysis were found to match well with published cell types involved in tactile stimulus processing and to be consistent across stimulus conditions and preparations.
Using radar imagery for crop discrimination: a statistical and conditional probability study
Haralick, R.M.; Caspall, F.; Simonett, D.S.
1970-01-01
A number of the constraints with which remote sensing must contend in crop studies are outlined. They include sensor, identification accuracy, and congruencing constraints; the nature of the answers demanded of the sensor system; and the complex temporal variances of crops in large areas. Attention is then focused on several methods which may be used in the statistical analysis of multidimensional remote sensing data. Crop discrimination for radar K-band imagery is investigated by three methods. The first one uses a Bayes decision rule, the second a nearest-neighbor spatial conditional probability approach, and the third the standard statistical techniques of cluster analysis and principal axes representation. Results indicate that crop type and percent of cover significantly affect the strength of the radar return signal. Sugar beets, corn, and very bare ground are easily distinguishable; sorghum, alfalfa, and young wheat are harder to distinguish. Distinguishability will be improved if the imagery is examined in time sequence, so that changes between times of planting, maturation, and harvest provide additional discriminant tools. A comparison between radar and photography indicates that radar performed surprisingly well in crop discrimination in western Kansas and warrants further study.
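A hedged sketch of the first of the three methods, a Bayes decision rule, on made-up two-feature radar returns (the crop classes come from the abstract, but all numbers are invented):

```python
# Gaussian Bayes decision rule: assign a pixel's return vector to the class
# maximizing prior * class-conditional likelihood. Training data are synthetic.
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(1)
crops = {"sugar beets": [0.8, 0.6], "corn": [0.5, 0.4], "bare ground": [0.2, 0.1]}

models = {}
for name, mean in crops.items():
    X = rng.multivariate_normal(mean, 0.02 * np.eye(2), size=200)
    models[name] = multivariate_normal(X.mean(axis=0), np.cov(X.T))

def classify(x, priors=None):
    """Bayes rule with equal priors unless stated otherwise."""
    priors = priors or {name: 1.0 / len(models) for name in models}
    return max(models, key=lambda c: priors[c] * models[c].pdf(x))

print(classify([0.75, 0.55]))  # -> 'sugar beets'
```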
Gait patterns for crime fighting: statistical evaluation
NASA Astrophysics Data System (ADS)
Sulovská, Kateřina; Bělašková, Silvie; Adámek, Milan
2013-10-01
Criminality has been omnipresent throughout human history. Modern technology brings novel opportunities for identification of a perpetrator. One of these opportunities is the analysis of video recordings, which may be taken during the crime itself or before/after the crime. Video analysis can be classed as an identification analysis, specifically identification of a person via external features. Bipedal locomotion analysis focuses on human movement on the basis of anatomical-physiological features. Nowadays, human gait is tested by many laboratories to learn whether identification via bipedal locomotion is possible or not. The aim of our study is to use 2D components out of 3D data from the VICON Mocap system for deep statistical analyses. This paper introduces recent results of a fundamental study focused on various gait patterns under different conditions. The study contains data from 12 participants. Curves obtained from these measurements were sorted, averaged, and statistically tested to estimate the stability and distinctiveness of this biometric. Results show satisfactory distinctness of some chosen points, while others do not embody a significant difference. However, the results presented in this paper are from the initial phase of deeper and more exacting analyses of gait patterns under different conditions.
NASA Technical Reports Server (NTRS)
Perry, Boyd, III; Pototzky, Anthony S.; Woods, Jessica A.
1989-01-01
This paper presents the results of a NASA investigation of a claimed 'overlap' between two gust response analysis methods: the Statistical Discrete Gust (SDG) method and the Power Spectral Density (PSD) method. The claim is that the ratio of an SDG response to the corresponding PSD response is 10.4. Analytical results presented in this paper for several different airplanes at several different flight conditions indicate that such an 'overlap' does appear to exist. However, the claim was not met precisely: a scatter of up to about 10 percent about the 10.4 factor can be expected.
The CTS 11.7 GHz angle of arrival experiment
NASA Technical Reports Server (NTRS)
Kwan, B. W.; Hodge, D. B.
1981-01-01
The objective of the experiment was to determine the statistical behavior of attenuation and angle of arrival on an Earth-space propagation path using the CTS 11.7 GHz beacon. Measurements performed from 1976 to 1978 form the data base for analysis. The statistics of the signal attenuation and phase variations due to atmospheric disturbances are presented. Rainfall rate distributions are also included to provide a link between the above effects on wave propagation and meteorological conditions.
Cox, Tony; Popken, Douglas; Ricci, Paolo F
2013-01-01
Exposures to fine particulate matter (PM2.5) in air (C) have been suspected of contributing causally to increased acute (e.g., same-day or next-day) human mortality rates (R). We tested this causal hypothesis in 100 United States cities using the publicly available NMMAPS database. Although a significant, approximately linear, statistical C-R association exists in simple statistical models, closer analysis suggests that it is not causal. Surprisingly, conditioning on other variables that have been extensively considered in previous analyses (usually using splines or other smoothers to approximate their effects), such as month of the year and mean daily temperature, suggests that they create strong, nonlinear confounding that explains the statistical association between PM2.5 and mortality rates in this data set. As this finding disagrees with conventional wisdom, we apply several different techniques to examine it. Conditional independence tests for potential causation, non-parametric classification tree analysis, Bayesian Model Averaging (BMA), and Granger-Sims causality testing, show no evidence that PM2.5 concentrations have any causal impact on increasing mortality rates. This apparent absence of a causal C-R relation, despite their statistical association, has potentially important implications for managing and communicating the uncertain health risks associated with, but not necessarily caused by, PM2.5 exposures. PMID:23983662
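One of the checks named above, Granger-Sims causality testing, can be sketched with statsmodels on synthetic series (the NMMAPS variables and any confounder adjustment are omitted; this only illustrates the test mechanics):

```python
# Granger causality: does lagged PM2.5 improve prediction of mortality beyond
# mortality's own lags? Here the series are independent by construction.
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(2)
n = 1000
pm25 = rng.gamma(2.0, 10.0, size=n)                 # synthetic daily PM2.5
mortality = rng.poisson(55, size=n).astype(float)   # independent of PM2.5

# Column order is [effect, candidate cause]; test up to 3 daily lags.
grangercausalitytests(np.column_stack([mortality, pm25]), maxlag=3)
```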
Predicting the Ability of Marine Mammal Populations to Compensate for Behavioral Disturbances
2015-09-30
approaches, including simple theoretical models as well as statistical analysis of data-rich conditions. Building on models developed for PCoD [2,3], we...conditions is population trajectory most likely to be affected (the central aim of PCoD). For the revised model presented here, we include a population...averaged condition individuals (here used as a proxy for individual health as defined in PCoD), and E is the quality of the environment in which the
Thin layer asphaltic concrete density measuring using nuclear gages.
DOT National Transportation Integrated Search
1989-03-01
A Troxler 4640 thin layer nuclear gage was evaluated under field conditions to determine if it would provide improved accuracy of density measurements on asphalt overlays of 1-3/4 and 2 inches in thickness. Statistical analysis shows slightly improve...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoon Sohn; Charles Farrar; Norman Hunter
2001-01-01
This report summarizes the analysis of fiber-optic strain gauge data obtained from a surface-effect fast patrol boat being studied by the staff at the Norwegian Defense Research Establishment (NDRE) in Norway and the Naval Research Laboratory (NRL) in Washington D.C. Data from two different structural conditions were provided to the staff at Los Alamos National Laboratory. The problem was then approached from a statistical pattern recognition paradigm. This paradigm can be described as a four-part process: (1) operational evaluation, (2) data acquisition & cleansing, (3) feature extraction and data reduction, and (4) statistical model development for feature discrimination. Given that the first two portions of this paradigm were mostly completed by the NDRE and NRL staff, this study focused on data normalization, feature extraction, and statistical modeling for feature discrimination. The feature extraction process began by looking at relatively simple statistics of the signals and progressed to using the residual errors from auto-regressive (AR) models fit to the measured data as the damage-sensitive features. Data normalization proved to be the most challenging portion of this investigation. A novel approach to data normalization, where the residual errors in the AR model are considered to be an unmeasured input and an auto-regressive model with exogenous inputs (ARX) is then fit to portions of the data exhibiting similar waveforms, was successfully applied to this problem. With this normalization procedure, a clear distinction between the two different structural conditions was obtained. A false-positive study was also run, and the procedure developed herein did not yield any false-positive indications of damage. Finally, the results must be qualified by the fact that this procedure has only been applied to very limited data samples. A more complete analysis of additional data taken under various operational and environmental conditions as well as other structural conditions is necessary before one can definitively state that the procedure is robust enough to be used in practice.
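The AR-residual feature idea lends itself to a short sketch (the ARX normalization stage is not shown; the model order, signals, and RMS feature below are assumptions, not the report's exact settings):

```python
# Fit an AR model to a baseline-condition signal, then use one-step-ahead
# residual errors on new records as damage-sensitive features.
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(3)
baseline = rng.standard_normal(2000)                     # structural condition 1
damaged = baseline + 0.5 * np.sin(np.arange(2000) / 5)   # altered dynamics

ar = AutoReg(baseline, lags=10).fit()

def residual_rms(signal, params, lags=10):
    """RMS of one-step-ahead residuals of `signal` under the baseline AR model."""
    X = np.column_stack([signal[lags - k - 1: len(signal) - k - 1]
                         for k in range(lags)])
    pred = params[0] + X @ params[1:]                    # intercept + lag terms
    return np.sqrt(np.mean((signal[lags:] - pred) ** 2))

print(residual_rms(baseline, ar.params), residual_rms(damaged, ar.params))
# A marked increase in residual RMS flags a change in structural condition.
```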
Potentiation Effects of Half-Squats Performed in a Ballistic or Nonballistic Manner.
Suchomel, Timothy J; Sato, Kimitake; DeWeese, Brad H; Ebben, William P; Stone, Michael H
2016-06-01
This study examined and compared the acute effects of ballistic and nonballistic concentric-only half-squats (COHSs) on squat jump performance. Fifteen resistance-trained men performed a squat jump 2 minutes after a control protocol or 2 COHSs at 90% of their 1 repetition maximum (1RM) COHS performed in a ballistic or nonballistic manner. Jump height (JH), peak power (PP), and allometrically scaled peak power (PPa) were compared using three 3 × 2 repeated-measures analyses of variance. Statistically significant condition × time interaction effects existed for JH (p = 0.037), PP (p = 0.041), and PPa (p = 0.031). Post hoc analysis revealed that the ballistic condition produced statistically greater JH (p = 0.017 and p = 0.036), PP (p = 0.031 and p = 0.026), and PPa (p = 0.024 and p = 0.023) than the control and nonballistic conditions, respectively. Small effect sizes for JH, PP, and PPa existed during the ballistic condition (d = 0.28-0.44), whereas trivial effect sizes existed during the control (d = 0.0-0.18) and nonballistic (d = 0.0-0.17) conditions. Large statistically significant relationships existed between the JH potentiation response and the subject's relative back squat 1RM (r = 0.520; p = 0.047) and relative COHS 1RM (r = 0.569; p = 0.027) during the ballistic condition. In addition, a large statistically significant relationship existed between the JH potentiation response and the subject's relative back squat strength (r = 0.633; p = 0.011), whereas the moderate relationship with the subject's relative COHS strength trended toward significance (r = 0.483; p = 0.068). Ballistic COHS produced superior potentiation effects compared with COHS performed in a nonballistic manner. Relative strength may contribute to the elicited potentiation response after ballistic and nonballistic COHS.
Toppi, J; Petti, M; Vecchiato, G; Cincotti, F; Salinari, S; Mattia, D; Babiloni, F; Astolfi, L
2013-01-01
Partial Directed Coherence (PDC) is a spectral multivariate estimator of effective connectivity, relying on the concept of Granger causality. Even though its original definition derived directly from information theory, two modifications were introduced in order to provide better physiological interpretations of the estimated networks: i) normalization of the estimator according to rows, ii) squared transformation. In the present paper we investigated the effect of PDC normalization on the performances achieved by applying the statistical validation process to the investigated connectivity patterns under different conditions of signal-to-noise ratio (SNR) and amount of data available for the analysis. Results of the statistical analysis revealed an effect of PDC normalization only on the percentages of type I and type II errors obtained using the shuffling procedure for the assessment of connectivity patterns. The PDC formulation had no effect on the performances achieved when the validation process was instead executed by means of the asymptotic statistic approach. Moreover, the percentages of both false positives and false negatives committed by the asymptotic statistic approach are always lower than those achieved by the shuffling procedure for each type of normalization.
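The two normalizations under comparison differ only in which axis of the frequency-domain coefficient matrix supplies the denominator; a sketch, given MVAR coefficients (the coefficients below are toy values, not fitted EEG):

```python
# PDC from MVAR coefficients A (shape: order x N x N): column normalization is
# the original definition; row normalization is the modification discussed.
import numpy as np

def pdc(A, f, fs=1.0, by="column"):
    order, N, _ = A.shape
    Af = np.eye(N, dtype=complex)
    for r in range(order):
        Af -= A[r] * np.exp(-2j * np.pi * f / fs * (r + 1))
    mag2 = np.abs(Af) ** 2
    denom = mag2.sum(axis=0) if by == "column" else mag2.sum(axis=1, keepdims=True)
    return np.sqrt(mag2 / denom)

A = np.zeros((2, 3, 3))
A[0, 1, 0], A[1, 2, 1] = 0.7, 0.4          # toy 3-node network: 1 -> 2 -> 3
print(pdc(A, f=0.1, by="column"))
print(pdc(A, f=0.1, by="row"))
```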
Statistical Optimality in Multipartite Ranking and Ordinal Regression.
Uematsu, Kazuki; Lee, Yoonkyung
2015-05-01
Statistical optimality in multipartite ranking is investigated as an extension of bipartite ranking. We consider the optimality of ranking algorithms through minimization of the theoretical risk which combines pairwise ranking errors of ordinal categories with differential ranking costs. The extension shows that for a certain class of convex loss functions including exponential loss, the optimal ranking function can be represented as a ratio of weighted conditional probabilities of upper categories to lower categories, where the weights are given by the misranking costs. This result also bridges traditional ranking methods such as the proportional odds model in statistics with various ranking algorithms in machine learning. Further, the analysis of multipartite ranking with different costs provides a new perspective on non-smooth list-wise ranking measures such as the discounted cumulative gain and preference learning. We illustrate our findings with a simulation study and real data analysis.
Semi-supervised vibration-based classification and condition monitoring of compressors
NASA Astrophysics Data System (ADS)
Potočnik, Primož; Govekar, Edvard
2017-09-01
Semi-supervised vibration-based classification and condition monitoring of the reciprocating compressors installed in refrigeration appliances is proposed in this paper. The method addresses the problem of industrial condition monitoring where prior class definitions are often not available or difficult to obtain from local experts. The proposed method combines feature extraction, principal component analysis, and statistical analysis for the extraction of initial class representatives, and compares the capability of various classification methods, including discriminant analysis (DA), neural networks (NN), support vector machines (SVM), and extreme learning machines (ELM). The use of the method is demonstrated on a case study which was based on industrially acquired vibration measurements of reciprocating compressors during the production of refrigeration appliances. The paper presents a comparative qualitative analysis of the applied classifiers, confirming the good performance of several nonlinear classifiers. If the model parameters are properly selected, then very good classification performance can be obtained from NN trained by Bayesian regularization, SVM and ELM classifiers. The method can be effectively applied for the industrial condition monitoring of compressors.
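A compact sketch of such a pipeline (the features, synthetic signals, and classifier settings are placeholders for the industrial compressor data):

```python
# Feature extraction -> PCA -> comparison of several classifiers by
# cross-validation, mirroring the structure described in the abstract.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(4)

def features(signal):
    spec = np.abs(np.fft.rfft(signal))
    return [signal.std(), spec[:20].sum(), spec[20:].sum()]

# Two synthetic vibration classes (e.g., nominal vs. faulty compressors).
X = np.array([features(rng.standard_normal(512) * (1 + 0.5 * label)
                       + label * np.sin(np.arange(512)))
              for label in (0, 1) for _ in range(50)])
y = np.repeat([0, 1], 50)

for clf in (LinearDiscriminantAnalysis(), SVC(), MLPClassifier(max_iter=2000)):
    pipe = make_pipeline(StandardScaler(), PCA(n_components=2), clf)
    print(type(clf).__name__, cross_val_score(pipe, X, y, cv=5).mean())
```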
Differential principal component analysis of ChIP-seq.
Ji, Hongkai; Li, Xia; Wang, Qian-fei; Ning, Yang
2013-04-23
We propose differential principal component analysis (dPCA) for analyzing multiple ChIP-sequencing datasets to identify differential protein-DNA interactions between two biological conditions. dPCA integrates unsupervised pattern discovery, dimension reduction, and statistical inference into a single framework. It uses a small number of principal components to summarize concisely the major multiprotein synergistic differential patterns between the two conditions. For each pattern, it detects and prioritizes differential genomic loci by comparing the between-condition differences with the within-condition variation among replicate samples. dPCA provides a unique tool for efficiently analyzing large amounts of ChIP-sequencing data to study dynamic changes of gene regulation across different biological conditions. We demonstrate this approach through analyses of differential chromatin patterns at transcription factor binding sites and promoters as well as allele-specific protein-DNA interactions.
Hybrid statistics-simulations based method for atom-counting from ADF STEM images.
De Wael, Annelies; De Backer, Annick; Jones, Lewys; Nellist, Peter D; Van Aert, Sandra
2017-06-01
A hybrid statistics-simulations based method for atom-counting from annular dark field scanning transmission electron microscopy (ADF STEM) images of monotype crystalline nanostructures is presented. Different atom-counting methods already exist for model-like systems. However, the increasing relevance of radiation damage in the study of nanostructures demands a method that allows atom-counting from low dose images with a low signal-to-noise ratio. Therefore, the hybrid method directly includes prior knowledge from image simulations into the existing statistics-based method for atom-counting, and accounts in this manner for possible discrepancies between actual and simulated experimental conditions. It is shown by means of simulations and experiments that this hybrid method outperforms the statistics-based method, especially for low electron doses and small nanoparticles. The analysis of a simulated low dose image of a small nanoparticle suggests that this method allows for far more reliable quantitative analysis of beam-sensitive materials. Copyright © 2017 Elsevier B.V. All rights reserved.
Wang, Mingyu; Han, Lijuan; Liu, Shasha; Zhao, Xuebing; Yang, Jinghua; Loh, Soh Kheang; Sun, Xiaomin; Zhang, Chenxi; Fang, Xu
2015-09-01
Renewable energy from lignocellulosic biomass has been deemed an alternative to depleting fossil fuels. In order to improve this technology, we aim to develop robust mathematical models for the enzymatic lignocellulose degradation process. By analyzing 96 groups of previously published and newly obtained lignocellulose saccharification results and fitting them to the Weibull distribution, we discovered that Weibull statistics can accurately predict lignocellulose saccharification data, regardless of the type of substrates, enzymes and saccharification conditions. A mathematical model for enzymatic lignocellulose degradation was subsequently constructed based on Weibull statistics. Further analysis of the mathematical structure of the model and experimental saccharification data showed the significance of the two parameters in this model. In particular, the λ value, defined as the characteristic time, represents the overall performance of the saccharification system. This suggestion was further supported by statistical analysis of experimental saccharification data and analysis of the glucose production levels when λ and n values change. In conclusion, the constructed Weibull statistics-based model can accurately predict lignocellulose hydrolysis behavior, and the λ parameter can be used to assess the overall performance of enzymatic lignocellulose degradation. Advantages and potential applications of the model and the λ value in saccharification performance assessment were discussed. Copyright © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
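Assuming the cumulative Weibull form implied by the abstract's λ and n parameters, y(t) = y_max(1 − exp(−(t/λ)^n)), the fit can be sketched as follows (the time-course data are invented for illustration):

```python
# Fit a Weibull-type saccharification curve and report the characteristic
# time lambda, the parameter proposed as an overall performance measure.
import numpy as np
from scipy.optimize import curve_fit

def weibull(t, y_max, lam, n):
    return y_max * (1.0 - np.exp(-(t / lam) ** n))

t = np.array([2, 4, 8, 12, 24, 48, 72], dtype=float)     # hours
y = np.array([5.1, 9.8, 17.0, 21.5, 28.0, 31.5, 32.2])   # glucose, g/L (toy)

(y_max, lam, n), _ = curve_fit(weibull, t, y, p0=[30.0, 10.0, 1.0])
print(f"y_max={y_max:.1f} g/L, lambda={lam:.1f} h, n={n:.2f}")
```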
Identifiability of PBPK Models with Applications to ...
Any statistical model should be identifiable in order for estimates and tests using it to be meaningful. We consider statistical analysis of physiologically-based pharmacokinetic (PBPK) models in which parameters cannot be estimated precisely from available data, and discuss different types of identifiability that occur in PBPK models and give reasons why they occur. We particularly focus on how the mathematical structure of a PBPK model and lack of appropriate data can lead to statistical models in which it is impossible to estimate at least some parameters precisely. Methods are reviewed which can determine whether a purely linear PBPK model is globally identifiable. We propose a theorem which determines when identifiability at a set of finite and specific values of the mathematical PBPK model (global discrete identifiability) implies identifiability of the statistical model. However, we are unable to establish conditions that imply global discrete identifiability, and conclude that the only safe approach to analysis of PBPK models involves Bayesian analysis with truncated priors. Finally, computational issues regarding posterior simulations of PBPK models are discussed. The methodology is very general and can be applied to numerous PBPK models which can be expressed as linear time-invariant systems. A real data set of a PBPK model for exposure to dimethyl arsinic acid (DMA(V)) is presented to illustrate the proposed methodology.
Neurosphere and adherent culture conditions are equivalent for malignant glioma stem cell lines.
Rahman, Maryam; Reyner, Karina; Deleyrolle, Loic; Millette, Sebastien; Azari, Hassan; Day, Bryan W; Stringer, Brett W; Boyd, Andrew W; Johns, Terrance G; Blot, Vincent; Duggal, Rohit; Reynolds, Brent A
2015-03-01
Certain limitations of the neurosphere assay (NSA) have resulted in a search for alternative culture techniques for brain tumor-initiating cells (TICs). Recently, reports have described growing glioblastoma (GBM) TICs as a monolayer using laminin. We performed a side-by-side analysis of the NSA and laminin (adherent) culture conditions to compare the growth and expansion of GBM TICs. GBM cells were grown using the NSA and adherent culture conditions. Comparisons were made using growth in culture, apoptosis assays, protein expression, limiting dilution clonal frequency assay, genetic affymetrix analysis, and tumorigenicity in vivo. In vitro expansion curves for the NSA and adherent culture conditions were virtually identical (P=0.24) and the clonogenic frequencies (5.2% for NSA vs. 5.0% for laminin, P=0.9) were similar as well. Likewise, markers of differentiation (glial fibrillary acidic protein and beta tubulin III) and proliferation (Ki67 and MCM2) revealed no statistical difference between the sphere and attachment methods. Several different methods were used to determine the numbers of dead or dying cells (trypan blue, DiIC, caspase-3, and annexin V) with none of the assays noting a meaningful variance between the two methods. In addition, genetic expression analysis with microarrays revealed no significant differences between the two groups. Finally, glioma cells derived from both methods of expansion formed large invasive tumors exhibiting GBM features when implanted in immune-compromised animals. A detailed functional, protein and genetic characterization of human GBM cells cultured in serum-free defined conditions demonstrated no statistically meaningful differences when grown using sphere (NSA) or adherent conditions. Hence, both methods are functionally equivalent and remain suitable options for expanding primary high-grade gliomas in tissue culture.
Neurosphere and adherent culture conditions are equivalent for malignant glioma stem cell lines
Reyner, Karina; Deleyrolle, Loic; Millette, Sebastien; Azari, Hassan; Day, Bryan W.; Stringer, Brett W.; Boyd, Andrew W.; Johns, Terrance G.; Blot, Vincent; Duggal, Rohit; Reynolds, Brent A.
2015-01-01
Certain limitations of the neurosphere assay (NSA) have resulted in a search for alternative culture techniques for brain tumor-initiating cells (TICs). Recently, reports have described growing glioblastoma (GBM) TICs as a monolayer using laminin. We performed a side-by-side analysis of the NSA and laminin (adherent) culture conditions to compare the growth and expansion of GBM TICs. GBM cells were grown using the NSA and adherent culture conditions. Comparisons were made using growth in culture, apoptosis assays, protein expression, limiting dilution clonal frequency assay, genetic affymetrix analysis, and tumorigenicity in vivo. In vitro expansion curves for the NSA and adherent culture conditions were virtually identical (P=0.24) and the clonogenic frequencies (5.2% for NSA vs. 5.0% for laminin, P=0.9) were similar as well. Likewise, markers of differentiation (glial fibrillary acidic protein and beta tubulin III) and proliferation (Ki67 and MCM2) revealed no statistical difference between the sphere and attachment methods. Several different methods were used to determine the numbers of dead or dying cells (trypan blue, DiIC, caspase-3, and annexin V) with none of the assays noting a meaningful variance between the two methods. In addition, genetic expression analysis with microarrays revealed no significant differences between the two groups. Finally, glioma cells derived from both methods of expansion formed large invasive tumors exhibiting GBM features when implanted in immune-compromised animals. A detailed functional, protein and genetic characterization of human GBM cells cultured in serum-free defined conditions demonstrated no statistically meaningful differences when grown using sphere (NSA) or adherent conditions. Hence, both methods are functionally equivalent and remain suitable options for expanding primary high-grade gliomas in tissue culture. PMID:25806119
Evaluation of force degradation characteristics of orthodontic latex elastics in vitro and in vivo.
Wang, Tong; Zhou, Gang; Tan, Xianfeng; Dong, Yaojun
2007-07-01
To evaluate the characteristics of force degradation of latex elastics in clinical applications and in vitro studies. Samples of 3/16-inch latex elastics were investigated, and 12 students between the ages of 12 and 15 years were selected for the intermaxillary and intramaxillary tractions. The elastics in the control groups were set in artificial saliva and dry room conditions and were stretched 20 mm. Repeated-measures two-way analysis of variance and nonlinear regression analysis were used to identify statistical significance. Overall, there were statistically significant differences between the different methods and observation intervals. At 24- and 48-hour time intervals, the force decreased during in vivo testing and in artificial saliva (P < .001), whereas there were no significant differences in dry room conditions (P > .05). In intermaxillary traction the percentage of initial force remaining after 48 hours was 61%. In intramaxillary traction and in artificial saliva the percentage of initial force remaining was 71%, and in room conditions 86% of initial force remained. Force degradation of latex elastics was different according to their environmental conditions. There was significantly more force degradation in intermaxillary traction than in intramaxillary traction. The dry room condition caused the least force loss. There were some differences among groups in the different times to start wearing elastics in intermaxillary traction but no significant differences in intramaxillary traction.
NASA Astrophysics Data System (ADS)
Romanov, V. S.; Goldstein, V. G.
2018-01-01
In organizing the production and operation of submersible electric motors (SEMs), the most essential element of electric submersible plants (ESPs) in the oil industry, specific operating conditions must be taken into account; these are determined by the conditions under which the ESP is operated. For a comprehensive picture of the current state of the SEM fleet in oil production, the results of a statistical analysis are given, together with an assessment of the operational characteristics of the submersible equipment released by the major manufacturers. It is noted that standard serial ESP equipment cannot fully ensure efficient operation, and therefore new technologies and corresponding equipment need to be developed.
McSwain, Kristen Bukowski; Strickland, A.G.
2010-01-01
Groundwater conditions in Brunswick County, North Carolina, have been monitored continuously since 2000 through the operation and maintenance of groundwater-level observation wells in the surficial, Castle Hayne, and Peedee aquifers of the North Atlantic Coastal Plain aquifer system. Groundwater-resource conditions for the Brunswick County area were evaluated by relating the normal range (25th to 75th percentile) monthly mean groundwater-level and precipitation data for water years 2001 to 2008 to median monthly mean groundwater levels and monthly sum of daily precipitation for water year 2008. Summaries of precipitation and groundwater conditions for the Brunswick County area and hydrographs and statistics of continuous groundwater levels collected during the 2008 water year are presented in this report. Groundwater levels varied by aquifer and geographic location within Brunswick County, but were influenced by drought conditions and groundwater withdrawals. Water levels were normal in two of the eight observation wells and below normal in the remaining six wells. Seasonal Kendall trend analysis performed on more than 9 years of monthly mean groundwater-level data collected in an observation well located within the Brunswick County well field indicated there is a strong downward trend, with water levels declining at a rate of about 2.2 feet per year.
Martinet, Simon; Liu, Yao; Louis, Cédric; Tassel, Patrick; Perret, Pascal; Chaumond, Agnès; André, Michel
2017-05-16
This study aims to measure and analyze unregulated compound emissions for two Euro 6 diesel and gasoline vehicles. The vehicles were tested on a chassis dynamometer under various driving cycles: Artemis driving cycles (urban, road, and motorway), the New European Driving Cycle (NEDC) and the World Harmonized Light-Duty Test Cycle (WLTC) for Europe, and world approval cycles. The emissions of unregulated compounds (such as total particle number (PN) (over 5.6 nm); black carbon (BC); NO2; benzene, toluene, ethylbenzene, and xylene (BTEX); carbonyl compounds; and polycyclic aromatic hydrocarbons (PAHs)) were measured with several online devices, and different samples were collected using cartridges and quartz filters. Furthermore, a preliminary statistical analysis was performed on eight Euro 4-6 diesel and gasoline vehicles to study the impacts of driving conditions and after-treatment and engine technologies on emissions of regulated and unregulated pollutants. The results indicate that urban conditions with cold starts induce high emissions of BTEX and carbonyl compounds. Motorway conditions are characterized by high emissions of particle numbers and CO, mainly induced by gasoline vehicles. Compared with gasoline vehicles, diesel vehicles equipped with catalyzed or additive DPF emit fewer particles but more NOx and carbonyl compounds.
NASA Astrophysics Data System (ADS)
Bonetto, P.; Qi, Jinyi; Leahy, R. M.
2000-08-01
Describes a method for computing linear observer statistics for maximum a posteriori (MAP) reconstructions of PET images. The method is based on a theoretical approximation for the mean and covariance of MAP reconstructions. In particular, the authors derive here a closed form for the channelized Hotelling observer (CHO) statistic applied to 2D MAP images. The theoretical analysis models both the Poisson statistics of PET data and the inhomogeneity of tracer uptake. The authors show reasonably good correspondence between these theoretical results and Monte Carlo studies. The accuracy and low computational cost of the approximation allow the authors to analyze the observer performance over a wide range of operating conditions and parameter settings for the MAP reconstruction algorithm.
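A brute-force sketch of a CHO computed from sample images (the paper instead works from theoretical approximations of the MAP mean and covariance; the difference-of-Gaussian channels and Gaussian noise below are assumptions):

```python
# Channelized Hotelling observer: project images onto a few channels, build a
# Hotelling template in channel space, and report detectability (SNR).
import numpy as np

rng = np.random.default_rng(5)
n, npix = 200, 32

def dog_channels(npix, sigmas=(1, 2, 4, 8)):
    x = np.arange(npix) - npix // 2
    xx, yy = np.meshgrid(x, x)
    r2 = xx ** 2 + yy ** 2
    U = [np.exp(-r2 / (2 * s ** 2)) - np.exp(-r2 / (2 * (1.66 * s) ** 2))
         for s in sigmas]
    return np.stack([u.ravel() / np.linalg.norm(u) for u in U], axis=1)

U = dog_channels(npix)                                   # (npix^2, channels)
x = np.arange(npix) - npix // 2
signal = np.exp(-(x[:, None] ** 2 + x[None, :] ** 2) / 8.0).ravel()
g0 = rng.standard_normal((n, npix * npix))               # signal-absent images
g1 = g0 + 0.8 * signal                                   # signal-present images

v0, v1 = g0 @ U, g1 @ U                                  # channel outputs
S = 0.5 * (np.cov(v0.T) + np.cov(v1.T))                  # pooled covariance
w = np.linalg.solve(S, v1.mean(0) - v0.mean(0))          # Hotelling template
t0, t1 = v0 @ w, v1 @ w
print("CHO SNR:", (t1.mean() - t0.mean()) / np.sqrt(0.5 * (t0.var() + t1.var())))
```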
Konukoglu, Ender; Coutu, Jean-Philippe; Salat, David H; Fischl, Bruce
2016-07-01
Diffusion magnetic resonance imaging (dMRI) is a unique technology that allows the noninvasive quantification of microstructural tissue properties of the human brain in healthy subjects as well as the probing of disease-induced variations. Population studies of dMRI data have been essential in identifying pathological structural changes in various conditions, such as Alzheimer's and Huntington's diseases (Salat et al., 2010; Rosas et al., 2006). The most common form of dMRI involves fitting a tensor to the underlying imaging data (known as diffusion tensor imaging, or DTI), then deriving parametric maps, each quantifying a different aspect of the underlying microstructure, e.g. fractional anisotropy and mean diffusivity. To date, the statistical methods utilized in most DTI population studies either analyzed only one such map or analyzed several of them, each in isolation. However, it is most likely that variations in the microstructure due to pathology or normal variability would affect several parameters simultaneously, with differing variations modulating the various parameters to differing degrees. Therefore, joint analysis of the available diffusion maps can be more powerful in characterizing histopathology and distinguishing between conditions than the widely used univariate analysis. In this article, we propose a multivariate approach for statistical analysis of diffusion parameters that uses partial least squares correlation (PLSC) analysis and permutation testing as building blocks in a voxel-wise fashion. Stemming from the common formulation, we present three different multivariate procedures for group analysis, regressing-out nuisance parameters and comparing effects of different conditions. We used the proposed procedures to study the effects of non-demented aging, Alzheimer's disease and mild cognitive impairment on the white matter. Here, we present results demonstrating that the proposed PLSC-based approach can differentiate between effects of different conditions in the same region as well as uncover spatial variations of effects across the white matter. The proposed procedures were able to answer questions on structural variations such as: "are there regions in the white matter where Alzheimer's disease has a different effect than aging or similar effect as aging?" and "are there regions in the white matter that are affected by both mild cognitive impairment and Alzheimer's disease but with differing multivariate effects?" Copyright © 2016 Elsevier Inc. All rights reserved.
Konukoglu, Ender; Coutu, Jean-Philippe; Salat, David H.; Fischl, Bruce
2016-01-01
Diffusion magnetic resonance imaging (dMRI) is a unique technology that allows the noninvasive quantification of microstructural tissue properties of the human brain in healthy subjects as well as the probing of disease-induced variations. Population studies of dMRI data have been essential in identifying pathological structural changes in various conditions, such as Alzheimer’s and Huntington’s diseases [1,2]. The most common form of dMRI involves fitting a tensor to the underlying imaging data (known as Diffusion Tensor Imaging, or DTI), then deriving parametric maps, each quantifying a different aspect of the underlying microstructure, e.g. fractional anisotropy and mean diffusivity. To date, the statistical methods utilized in most DTI population studies either analyzed only one such map or analyzed several of them, each in isolation. However, it is most likely that variations in the microstructure due to pathology or normal variability would affect several parameters simultaneously, with differing variations modulating the various parameters to differing degrees. Therefore, joint analysis of the available diffusion maps can be more powerful in characterizing histopathology and distinguishing between conditions than the widely used univariate analysis. In this article, we propose a multivariate approach for statistical analysis of diffusion parameters that uses partial least squares correlation (PLSC) analysis and permutation testing as building blocks in a voxel-wise fashion. Stemming from the common formulation, we present three different multivariate procedures for group analysis, regressing-out nuisance parameters and comparing effects of different conditions. We used the proposed procedures to study the effects of non-demented aging, Alzheimer’s disease and mild cognitive impairment on the white matter. Here, we present results demonstrating that the proposed PLSC-based approach can differentiate between effects of different conditions in the same region as well as uncover spatial variations of effects across the white matter. The proposed procedures were able to answer questions on structural variations such as: “are there regions in the white matter where Alzheimer’s disease has a different effect than aging or similar effect as aging?” and “are there regions in the white matter that are affected by both mild cognitive impairment and Alzheimer’s disease but with differing multivariate effects?” PMID:27103138
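The PLSC building block these records describe (SVD of a brain-behavior cross-covariance plus permutation testing) can be sketched compactly; the dimensions and data are illustrative, and this is not the authors' full voxel-wise procedure:

```python
# PLSC core: leading singular value of the covariate-by-feature cross-
# covariance, with significance assessed by permuting subjects.
import numpy as np

rng = np.random.default_rng(6)
n, p, q = 80, 200, 3                 # subjects, imaging features, covariates
X = rng.standard_normal((n, p))      # e.g., stacked FA/MD values per voxel
Y = rng.standard_normal((n, q))      # e.g., age, diagnosis, cognition
X[:, :50] += 0.5 * Y[:, [0]]         # inject a real effect into 50 features

def first_sv(X, Y):
    Xc, Yc = X - X.mean(0), Y - Y.mean(0)
    return np.linalg.svd(Yc.T @ Xc / (len(X) - 1), compute_uv=False)[0]

obs = first_sv(X, Y)
null = np.array([first_sv(X, Y[rng.permutation(n)]) for _ in range(500)])
p_val = (1 + (null >= obs).sum()) / 501
print(f"first singular value {obs:.2f}, permutation p = {p_val:.3f}")
```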
Journal of Naval Science. Volume 2, Number 1
1976-01-01
has defined a probability distribution function which fits this type of data and forms the basis for statistical analysis of test results (see...Conditions to Assess the Performance of Fire-Resistant Fluids’. Wear, 28 (1974) 29. J.N.S., Vol. 2, No. 1 APPENDIX A Analysis of Fatigue Test Data...used to produce the impulse response and the equipment required for the analysis is relatively simple. The methods that must be used to produce
Dale D. Gormanson; Scott A. Pugh; Charles J. Barnett; Patrick D. Miles; Randall S. Morin; Paul A. Sowers; James A. Westfall
2018-01-01
The U.S. Forest Service Forest Inventory and Analysis (FIA) program collects sample plot data on all forest ownerships across the United States. FIA's primary objective is to determine the extent, condition, volume, growth, and use of trees on the Nation's forest land through a comprehensive inventory and analysis of the Nation's forest resources. The FIA program...
ERIC Educational Resources Information Center
Rampey, B.D.; Lutkus, Anthony D.; Weiner, Arlene W.; Rahman, Taslima
2006-01-01
The National Indian Education Study is a two-part study designed to describe the condition of education for American Indian/Alaska Native students in the United States. The study was conducted by the National Center for Education Statistics for the U.S. Department of Education, with the support of the Office of Indian Education. This report, Part…
Núñez, Eutimio Gustavo Fernández; Faintuch, Bluma Linkowski; Teodoro, Rodrigo; Wiecek, Danielle Pereira; da Silva, Natanael Gomes; Papadopoulos, Minas; Pelecanou, Maria; Pirmettis, Ioannis; de Oliveira Filho, Renato Santos; Duatti, Adriano; Pasqualini, Roberto
2011-04-01
The objective of this study was the development of a statistical approach for radiolabeling optimization of cysteine-dextran conjugates with Tc-99m tricarbonyl core. This strategy has been applied to the labeling of 2-propylene-S-cysteine-dextran in the attempt to prepare a new class of tracers for sentinel lymph node detection, and can be extended to other radiopharmaceuticals for different targets. The statistical routine was based on three-level factorial design. Best labeling conditions were achieved. The specific activity reached was 5 MBq/μg. Crown Copyright © 2011. Published by Elsevier Ltd. All rights reserved.
Statistical analysis of short-term water stress conditions at Riggs Creek OzFlux tower site
NASA Astrophysics Data System (ADS)
Azmi, Mohammad; Rüdiger, Christoph; Walker, Jeffrey P.
2017-10-01
A large range of indices and proxies is available to describe the water stress conditions of an area for different applications, each with varying capabilities and limitations depending on the prevailing local climatic conditions and land cover. The present study uses a range of spatio-temporally high-resolution (daily and within-daily) data sources to evaluate a number of drought indices (DIs) for the Riggs Creek OzFlux tower site in southeastern Australia. The main aim of this study is therefore to evaluate the statistical characteristics of individual DIs subject to short-term water stress conditions. In order to derive a more general and therefore representative DI, a new criterion is required to specify the statistical similarity between each pair of indices, allowing the dominant drought types to be determined along with their representative DIs. The results show that the monitoring of water stress at this case study area can be achieved by evaluating the individual behaviour of three clusters of (i) vegetation conditions, (ii) water availability and (iii) water consumption. This indicates that it is not necessary to assess all individual DIs one by one to derive a comprehensive and informative data set about the water stress of an area; instead, this can be achieved by analysing one of the DIs from each cluster or deriving a new combinatory index for each cluster, based on established combination methods.
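The clustering idea, grouping DIs by pairwise statistical similarity and keeping one representative per cluster, can be sketched as follows (index names and series are placeholders; the choice of rank correlation and average linkage is an assumption):

```python
# Cluster drought indices by rank-correlation similarity into three groups,
# mirroring the vegetation / availability / consumption clusters reported.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.stats import spearmanr

rng = np.random.default_rng(7)
days = 365
drivers = {"veg": rng.standard_normal(days).cumsum(),
           "avail": rng.standard_normal(days).cumsum(),
           "use": rng.standard_normal(days).cumsum()}

names, series = [], []
for key, s in drivers.items():           # three noisy indices per driver
    for j in range(3):
        names.append(f"{key}_{j}")
        series.append(s + 0.3 * rng.standard_normal(days))

rho, _ = spearmanr(np.array(series), axis=1)     # 9 x 9 similarity matrix
dist = 1 - np.abs(rho)                           # similar -> small distance
labels = fcluster(linkage(dist[np.triu_indices(9, 1)], method="average"),
                  t=3, criterion="maxclust")
print(dict(zip(names, labels)))
```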
Spatio-temporal conditional inference and hypothesis tests for neural ensemble spiking precision
Harrison, Matthew T.; Amarasingham, Asohan; Truccolo, Wilson
2014-01-01
The collective dynamics of neural ensembles create complex spike patterns with many spatial and temporal scales. Understanding the statistical structure of these patterns can help resolve fundamental questions about neural computation and neural dynamics. Spatio-temporal conditional inference (STCI) is introduced here as a semiparametric statistical framework for investigating the nature of precise spiking patterns from collections of neurons that is robust to arbitrarily complex and nonstationary coarse spiking dynamics. The main idea is to focus statistical modeling and inference, not on the full distribution of the data, but rather on families of conditional distributions of precise spiking given different types of coarse spiking. The framework is then used to develop families of hypothesis tests for probing the spatio-temporal precision of spiking patterns. Relationships among different conditional distributions are used to improve multiple hypothesis testing adjustments and to design novel Monte Carlo spike resampling algorithms. Of special note are algorithms that can locally jitter spike times while still preserving the instantaneous peri-stimulus time histogram (PSTH) or the instantaneous total spike count from a group of recorded neurons. The framework can also be used to test whether first-order maximum entropy models with possibly random and time-varying parameters can account for observed patterns of spiking. STCI provides a detailed example of the generic principle of conditional inference, which may be applicable in other areas of neurostatistical analysis. PMID:25380339
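One member of this family of PSTH-preserving resamplings admits a very short sketch: within each small window, spikes keep their exact times but are reassigned across trials, so the pooled (instantaneous) PSTH is untouched while within-trial fine structure is broken. The window size is an assumption, and the paper's algorithms are considerably more refined:

```python
# Surrogate spike trains that exactly preserve the pooled PSTH by shuffling
# trial labels within local time windows.
import numpy as np

rng = np.random.default_rng(8)
n_trials, T = 20, 1.0
spikes = [np.sort(rng.uniform(0, T, rng.poisson(30))) for _ in range(n_trials)]

def resample(spikes, window=0.025):
    pooled = np.concatenate(spikes)
    trial = np.concatenate([np.full(len(s), i) for i, s in enumerate(spikes)])
    for lo in np.arange(0.0, T, window):
        m = (pooled >= lo) & (pooled < lo + window)
        trial[m] = rng.permutation(trial[m])     # shuffle labels, keep times
    return [np.sort(pooled[trial == i]) for i in range(len(spikes))]

surrogate = resample(spikes)
# Pooled spike times, and hence the PSTH at any resolution, are identical:
assert np.allclose(np.sort(np.concatenate(spikes)),
                   np.sort(np.concatenate(surrogate)))
```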
Statewide analysis of the drainage-area ratio method for 34 streamflow percentile ranges in Texas
Asquith, William H.; Roussel, Meghan C.; Vrabel, Joseph
2006-01-01
The drainage-area ratio method commonly is used to estimate streamflow for sites where no streamflow data are available using data from one or more nearby streamflow-gaging stations. The method is intuitive and straightforward to implement and is in widespread use by analysts and managers of surface-water resources. The method equates the ratio of streamflow at two stream locations to the ratio of the respective drainage areas. In practice, unity often is assumed as the exponent on the drainage-area ratio, and unity also is assumed as a multiplicative bias correction. These two assumptions are evaluated in this investigation through statewide analysis of daily mean streamflow in Texas. The investigation was made by the U.S. Geological Survey in cooperation with the Texas Commission on Environmental Quality. More than 7.8 million values of daily mean streamflow for 712 U.S. Geological Survey streamflow-gaging stations in Texas were analyzed. To account for the influence of streamflow probability on the drainage-area ratio method, 34 percentile ranges were considered. The 34 ranges are the 4 quartiles (0-25, 25-50, 50-75, and 75-100 percent), the 5 intervals of the lower tail of the streamflow distribution (0-1, 1-2, 2-3, 3-4, and 4-5 percent), the 20 quintiles of the 4 quartiles (0-5, 5-10, 10-15, 15-20, 20-25, 25-30, 30-35, 35-40, 40-45, 45-50, 50-55, 55-60, 60-65, 65-70, 70-75, 75-80, 80-85, 85-90, 90-95, and 95-100 percent), and the 5 intervals of the upper tail of the streamflow distribution (95-96, 96-97, 97-98, 98-99 and 99-100 percent). For each of the 253,116 (712 × 711/2) unique pairings of stations and for each of the 34 percentile ranges, the concurrent daily mean streamflow values available for the two stations provided for station-pair application of the drainage-area ratio method. For each station pair, specific statistical summarization (median, mean, and standard deviation) of both the exponent and bias-correction components of the drainage-area ratio method were computed. Statewide statistics (median, mean, and standard deviation) of the station-pair specific statistics subsequently were computed and are tabulated herein. A separate analysis considered conditioning station pairs to those stations within 100 miles of each other and with the absolute value of the logarithm (base-10) of the ratio of the drainage areas greater than or equal to 0.25. Statewide statistics of the conditional station-pair specific statistics were computed and are tabulated. The conditional analysis is preferable because of the anticipation that small separation distances reflect similar hydrologic conditions and the observation of large variation in exponent estimates for similar-sized drainage areas. The conditional analysis determined that the exponent is about 0.89 for streamflow percentiles from 0 to about 50 percent, is about 0.92 for percentiles from about 50 to about 65 percent, and is about 0.93 for percentiles from about 65 to about 85 percent. The exponent decreases rapidly to about 0.70 for percentiles nearing 100 percent. The computation of the bias-correction factor is sensitive to the range analysis interval (range of streamflow percentile); however, evidence suggests that in practice the drainage-area method can be considered unbiased. Finally, for general application, suggested values of the exponent are tabulated for 54 percentiles of daily mean streamflow in Texas; when these values are used, the bias correction is unity.
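The method itself reduces to one line, Q_u = bias · Q_g · (A_u/A_g)^φ; a sketch using an exponent in the range reported above for mid-range percentiles (the site numbers are invented):

```python
# Drainage-area ratio transfer of streamflow from a gauged to an ungauged
# site; phi = 0.89 follows the study's result for percentiles below ~50,
# and the bias correction is taken as unity per the study's conclusion.
def daily_flow_estimate(q_gauged, area_ungauged, area_gauged,
                        phi=0.89, bias=1.0):
    return bias * q_gauged * (area_ungauged / area_gauged) ** phi

# e.g., 120 ft3/s at a 310 mi2 gauge, transferred to a 95 mi2 site:
print(daily_flow_estimate(120.0, 95.0, 310.0))
```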
On an additive partial correlation operator and nonparametric estimation of graphical models.
Lee, Kuang-Yao; Li, Bing; Zhao, Hongyu
2016-09-01
We introduce an additive partial correlation operator as an extension of partial correlation to the nonlinear setting, and use it to develop a new estimator for nonparametric graphical models. Our graphical models are based on additive conditional independence, a statistical relation that captures the spirit of conditional independence without having to resort to high-dimensional kernels for its estimation. The additive partial correlation operator completely characterizes additive conditional independence, and has the additional advantage of putting marginal variation on appropriate scales when evaluating interdependence, which leads to more accurate statistical inference. We establish the consistency of the proposed estimator. Through simulation experiments and analysis of the DREAM4 Challenge dataset, we demonstrate that our method performs better than existing methods in cases where the Gaussian or copula Gaussian assumption does not hold, and that a more appropriate scaling for our method further enhances its performance.
Results of a joint NOAA/NASA sounder simulation study
NASA Technical Reports Server (NTRS)
Phillips, N.; Susskind, Joel; Mcmillin, L.
1988-01-01
This paper presents the results of a joint NOAA and NASA sounder simulation study in which the accuracies of atmospheric temperature profiles and surface skin temperature measurements retrieved from two sounders were compared: (1) the currently used IR temperature sounder HIRS2 (High-resolution Infrared Radiation Sounder 2); and (2) the recently proposed high-spectral-resolution IR sounder AMTS (Advanced Moisture and Temperature Sounder). Simulations were conducted for both clear and partial cloud conditions. Data were analyzed at NASA using a physical inversion technique and at NOAA using a statistical technique. Results show significant improvement of AMTS compared to HIRS2 for both clear and cloudy conditions. The improvements are indicated by both methods of data analysis, but the physical retrievals outperform the statistical retrievals.
The statistics of identifying differentially expressed genes in Expresso and TM4: a comparison
Sioson, Allan A; Mane, Shrinivasrao P; Li, Pinghua; Sha, Wei; Heath, Lenwood S; Bohnert, Hans J; Grene, Ruth
2006-01-01
Background Analysis of DNA microarray data takes as input spot intensity measurements from scanner software and returns differential expression of genes between two conditions, together with a statistical significance assessment. This process typically consists of two steps: data normalization and identification of differentially expressed genes through statistical analysis. The Expresso microarray experiment management system implements these steps with a two-stage, log-linear ANOVA mixed model technique, tailored to individual experimental designs. The complement of tools in TM4, on the other hand, is based on a number of preset design choices that limit its flexibility. In the TM4 microarray analysis suite, normalization, filter, and analysis methods form an analysis pipeline. TM4 computes integrated intensity values (IIV) from the average intensities and spot pixel counts returned by the scanner software as input to its normalization steps. By contrast, Expresso can use either IIV data or median intensity values (MIV). Here, we compare Expresso and TM4 analysis of two experiments and assess the results against qRT-PCR data. Results The Expresso analysis using MIV data consistently identifies more genes as differentially expressed, when compared to Expresso analysis with IIV data. The typical TM4 normalization and filtering pipeline corrects systematic intensity-specific bias on a per-microarray basis. Subsequent statistical analysis with Expresso or a TM4 t-test can effectively identify differentially expressed genes. The best agreement with qRT-PCR data is obtained through the use of Expresso analysis and MIV data. Conclusion The results of this research are of practical value to biologists who analyze microarray data sets. The TM4 normalization and filtering pipeline corrects microarray-specific systematic bias and complements the normalization stage in Expresso analysis. The results of Expresso using MIV data have the best agreement with qRT-PCR results. In one experiment, MIV is a better choice than IIV as input to data normalization and statistical analysis methods, as it yields a greater number of statistically significant differentially expressed genes; TM4 does not support the choice of MIV input data. Overall, the more flexible and extensive statistical models of Expresso achieve more accurate analytical results, when judged by the yardstick of qRT-PCR data, in the context of an experimental design of modest complexity. PMID:16626497
Bayesian conditional-independence modeling of the AIDS epidemic in England and Wales
NASA Astrophysics Data System (ADS)
Gilks, Walter R.; De Angelis, Daniela; Day, Nicholas E.
We describe the use of conditional-independence modeling, Bayesian inference, and Markov chain Monte Carlo to model and project the HIV-AIDS epidemic in homosexual/bisexual males in England and Wales. Complexity in this analysis arises through selectively missing data, indirectly observed underlying processes, and measurement error. Our emphasis is on presentation and discussion of the concepts, not on the technicalities of this analysis, which can be found elsewhere [D. De Angelis, W.R. Gilks, N.E. Day, Bayesian projection of the acquired immune deficiency syndrome epidemic (with discussion), Applied Statistics, in press].
Variation in reaction norms: Statistical considerations and biological interpretation.
Morrissey, Michael B; Liefting, Maartje
2016-09-01
Analysis of reaction norms, the functions by which the phenotype produced by a given genotype depends on the environment, is critical to studying many aspects of phenotypic evolution. Different techniques are available for quantifying different aspects of reaction norm variation. We examine what biological inferences can be drawn from some of the more readily applicable analyses for studying reaction norms. We adopt a strongly biologically motivated view, but draw on statistical theory to highlight strengths and drawbacks of different techniques. In particular, consideration of some formal statistical theory leads to revision of some recently, and forcefully, advocated opinions on reaction norm analysis. We clarify what simple analysis of the slope between mean phenotype in two environments can tell us about reaction norms, explore the conditions under which polynomial regression can provide robust inferences about reaction norm shape, and explore how different existing approaches may be used to draw inferences about variation in reaction norm shape. We show how mixed model-based approaches can provide more robust inferences than more commonly used multistep statistical approaches, and derive new metrics of the relative importance of variation in reaction norm intercepts, slopes, and curvatures. © 2016 The Author(s). Evolution © 2016 The Society for the Study of Evolution.
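A minimal sketch of the polynomial-regression view of a reaction norm discussed above, on synthetic genotype means: a quadratic fit returns an intercept, slope, and curvature.

```python
import numpy as np

env = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])    # centered environmental gradient
phen = np.array([2.1, 2.4, 3.0, 3.9, 5.2])     # synthetic mean phenotypes

# Quadratic reaction norm: phenotype = b0 + b1*env + b2*env^2
b2, b1, b0 = np.polyfit(env, phen, deg=2)
print(f"intercept={b0:.2f}, slope={b1:.2f}, curvature={b2:.2f}")
```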
NASA Technical Reports Server (NTRS)
Dragonette, Richard A.; Suter, Joseph J.
1992-01-01
An extensive statistical analysis has been undertaken to determine if a correlation exists between changes in an NR atomic hydrogen maser's frequency offset and changes in environmental conditions. Correlation analyses have been performed comparing barometric pressure, humidity, and temperature with maser frequency offset as a function of time for periods ranging from 5.5 to 17 days. Semipartial correlation coefficients as large as -0.9 have been found between barometric pressure and maser frequency offset. Correlation between maser frequency offset and humidity was small compared to barometric pressure and unpredictable. Analysis of temperature data indicates that in the most current design, temperature does not significantly affect maser frequency offset.
Vibroacoustic optimization using a statistical energy analysis model
NASA Astrophysics Data System (ADS)
Culla, Antonio; D`Ambrogio, Walter; Fregolent, Annalisa; Milana, Silvia
2016-08-01
In this paper, an optimization technique for medium-high frequency dynamic problems based on the Statistical Energy Analysis (SEA) method is presented. Using a SEA model, the subsystem energies are controlled by internal loss factors (ILFs) and coupling loss factors (CLFs), which in turn depend on the physical parameters of the subsystems. A preliminary sensitivity analysis of subsystem energy to the CLFs is performed to select the CLFs that most strongly affect the subsystem energies. Since the injected power depends not only on the external loads but on the physical parameters of the subsystems as well, it must be taken into account under certain conditions. This is accomplished in the optimization procedure, where approximate relationships between CLFs, injected power, and physical parameters are derived. The approach is applied to a typical aeronautical structure: the cabin of a helicopter.
Do climate extreme events foster violent civil conflicts? A coincidence analysis
NASA Astrophysics Data System (ADS)
Schleussner, Carl-Friedrich; Donges, Jonathan F.; Donner, Reik V.
2014-05-01
Civil conflicts promoted by adverse environmental conditions represent one of the most important potential feedbacks in the global socio-environmental nexus. While the role of climate extremes as a triggering factor is often discussed, no consensus has yet been reached about the cause-and-effect relation in the observed data record. Here we present results of a rigorous statistical coincidence analysis based on the Munich Re Inc. extreme events database and the Uppsala conflict data program. We report evidence for statistically significant synchronicity between climate extremes with high economic impact and violent conflicts for various regions, although no coherent global signal emerges from our analysis. Our results indicate the importance of regional vulnerability and might aid in identifying hot-spot regions for potential climate-triggered violent social conflicts.
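A minimal sketch of the coincidence-counting idea, with synthetic event series and a one-step tolerance window; significance is assessed here by random shuffling rather than the analytical null models of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def coincidence_rate(a, b, window=1):
    """Fraction of events in a with at least one event in b within +/- window."""
    ta, tb = np.flatnonzero(a), np.flatnonzero(b)
    if len(ta) == 0:
        return 0.0
    return float(np.mean([np.any(np.abs(tb - t) <= window) for t in ta]))

extremes = rng.random(200) < 0.10       # synthetic extreme-event series
conflicts = rng.random(200) < 0.08      # synthetic conflict-onset series

obs = coincidence_rate(extremes, conflicts)
null = [coincidence_rate(rng.permutation(extremes), conflicts)
        for _ in range(2000)]
p = np.mean(np.array(null) >= obs)      # one-sided surrogate p-value
print(obs, p)
```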
In a previously published study, quantitative relationships were developed between landscape metrics and sediment contamination for 25 small estuarine systems within Chesapeake Bay. Nonparametric statistical analysis (rank transformation) was used to develop an empirical relation...
Computer program documentation for the pasture/range condition assessment processor
NASA Technical Reports Server (NTRS)
Mcintyre, K. S.; Miller, T. G. (Principal Investigator)
1982-01-01
The processor that drives the RANGE software allows the user to analyze LANDSAT data containing pasture and rangeland. Analysis includes mapping, generating statistics, calculating vegetative indexes, and plotting vegetative indexes. Routines for using the processor are given. A flow diagram is included.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yoo, Jun Soo
The bubble departure diameter and bubble release frequency were obtained through analysis of TAMU subcooled flow boiling experimental data. Numerous images of bubbles at departure were analyzed for each experimental condition to obtain reliable statistics of the measured bubble parameters. The results are provided in this report with a brief discussion.
Identifying natural flow regimes using fish communities
NASA Astrophysics Data System (ADS)
Chang, Fi-John; Tsai, Wen-Ping; Wu, Tzu-Ching; Chen, Hung-kwai; Herricks, Edwin E.
2011-10-01
Modern water resources management has adopted natural flow regimes as reasonable targets for river restoration and conservation. The characterization of a natural flow regime begins with the development of hydrologic statistics from flow records. However, little guidance exists for defining the period of record needed for regime determination. In Taiwan, the Taiwan Eco-hydrological Indicator System (TEIS), a group of hydrologic statistics selected for fisheries relevance, is being used to evaluate ecological flows. The TEIS consists of a group of hydrologic statistics selected to characterize the relationships between flow and the life history of indigenous species. Using the TEIS and biosurvey data for Taiwan, this paper identifies the length of hydrologic record sufficient for natural flow regime characterization. To define the ecological hydrology of fish communities, this study connected hydrologic statistics to fish communities by using methods to define antecedent conditions that influence existing community composition. A moving average method was applied to TEIS statistics to reflect the effects of antecedent flow condition and a point-biserial correlation method was used to relate fisheries collections with TEIS statistics. The resulting fish species-TEIS (FISH-TEIS) hydrologic statistics matrix takes full advantage of historical flows and fisheries data. The analysis indicates that, in the watersheds analyzed, averaging TEIS statistics for the present year and 3 years prior to the sampling date, termed MA(4), is sufficient to develop a natural flow regime. This result suggests that flow regimes based on hydrologic statistics for the period of record can be replaced by regimes developed for sampled fish communities.
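A minimal sketch of the two steps named above, on synthetic data: a moving average MA(4) of an annual hydrologic statistic (sampling year plus the three prior years), then a point-biserial correlation against species presence/absence.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
years = 20
teis_stat = rng.normal(100, 15, years)          # synthetic annual TEIS statistic

# MA(4): average of the sampling year and the 3 prior years
ma4 = np.convolve(teis_stat, np.ones(4) / 4, mode="valid")   # length years-3

# Synthetic fish presence/absence loosely tied to antecedent flow
presence = (ma4 + rng.normal(0, 10, ma4.size)) > 100

r_pb, p = stats.pointbiserialr(presence.astype(int), ma4)
print(r_pb, p)
```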
Reconnection properties in Kelvin-Helmholtz instabilities
NASA Astrophysics Data System (ADS)
Vernisse, Y.; Lavraud, B.; Eriksson, S.; Gershman, D. J.; Dorelli, J.; Pollock, C. J.; Giles, B. L.; Aunai, N.; Avanov, L. A.; Burch, J.; Chandler, M. O.; Coffey, V. N.; Dargent, J.; Ergun, R.; Farrugia, C. J.; Genot, V. N.; Graham, D.; Hasegawa, H.; Jacquey, C.; Kacem, I.; Khotyaintsev, Y. V.; Li, W.; Magnes, W.; Marchaudon, A.; Moore, T. E.; Paterson, W. R.; Penou, E.; Phan, T.; Retino, A.; Schwartz, S. J.; Saito, Y.; Sauvaud, J. A.; Schiff, C.; Torbert, R. B.; Wilder, F. D.; Yokota, S.
2017-12-01
Kelvin-Helmholtz instabilities provide natural laboratories for studying strong guide-field reconnection processes. In particular, unlike the usual dayside magnetopause, the conditions across the magnetopause in KH vortices are quasi-symmetric, with small differences in beta and magnetic shear angle. We study these properties by means of statistical analysis of the high-resolution data of the Magnetospheric Multiscale mission. Several Kelvin-Helmholtz instability events past the terminator plane and a long-lasting dayside instability event were used to produce this statistical analysis. Early results show consistency between the data and the theory. In addition, the results emphasize the importance of the thickness of the magnetopause as a driver of magnetic reconnection in low magnetic shear events.
NASA Astrophysics Data System (ADS)
Pradeep, Krishna; Poiroux, Thierry; Scheer, Patrick; Juge, André; Gouget, Gilles; Ghibaudo, Gérard
2018-07-01
This work details the analysis of wafer level global process variability in 28 nm FD-SOI using split C-V measurements. The proposed approach initially evaluates the native on wafer process variability using efficient extraction methods on split C-V measurements. The on-wafer threshold voltage (VT) variability is first studied and modeled using a simple analytical model. Then, a statistical model based on the Leti-UTSOI compact model is proposed to describe the total C-V variability in different bias conditions. This statistical model is finally used to study the contribution of each process parameter to the total C-V variability.
Statistical Analysis for Collision-free Boson Sampling.
Huang, He-Liang; Zhong, Han-Sen; Li, Tan; Li, Feng-Guang; Fu, Xiang-Qun; Zhang, Shuo; Wang, Xiang; Bao, Wan-Su
2017-11-10
Boson sampling is strongly believed to be intractable for classical computers but solvable with photons in linear optics, and it has attracted widespread attention as a rapid route to demonstrating quantum supremacy. However, because its solution is mathematically unverifiable, certifying the experimental results is a major difficulty in boson sampling experiments. Here, we develop a statistical analysis scheme to experimentally certify collision-free boson sampling. Numerical simulations are performed to show the feasibility and practicability of our scheme, and the effects of realistic experimental conditions are also considered, demonstrating that our proposed scheme is experimentally friendly. Moreover, our broad approach is expected to be generally applicable to investigating multi-particle coherent dynamics beyond boson sampling.
NASA Technical Reports Server (NTRS)
Stephens, J. B.; Sloan, J. C.
1976-01-01
A method is described for developing a statistical air quality assessment for the launch of an aerospace vehicle from the Kennedy Space Center in terms of existing climatological data sets. The procedure can be refined as developing meteorological conditions are identified for use with the NASA-Marshall Space Flight Center Rocket Exhaust Effluent Diffusion (REED) description. Classical climatological regimes for the long range analysis can be narrowed as the synoptic and mesoscale structure is identified. Only broad synoptic regimes are identified at this stage of analysis. As the statistical data matrix is developed, synoptic regimes will be refined in terms of the resulting eigenvectors as applicable to aerospace air quality predictions.
Gorman, Dennis M; Huber, J Charles
2009-08-01
This study explores the possibility that any drug prevention program might be considered "evidence-based" given the use of data analysis procedures that optimize the chance of producing statistically significant results, by reanalyzing data from a Drug Abuse Resistance Education (DARE) program evaluation. The analysis produced a number of statistically significant differences between the DARE and control conditions on alcohol and marijuana use measures. Many of these differences occurred at cutoff points on the assessment scales for which post hoc meaningful labels were created. Our results are compared to those from evaluations of programs that appear on evidence-based drug prevention lists.
Rowlands, G J; Musoke, A J; Morzaria, S P; Nagda, S M; Ballingall, K T; McKeever, D J
2000-04-01
A statistically derived disease reaction index based on parasitological, clinical, and haematological measurements observed in 309 Boran cattle aged 5 to 8 months following laboratory challenge with Theileria parva is described. Principal component analysis was applied to 13 measures, including first appearance of schizonts, first appearance of piroplasms, and first occurrence of pyrexia, together with the duration and severity of these symptoms, and white blood cell count. The first principal component, which was based on approximately equal contributions of the 13 variables, provided the definition for the disease reaction index, defined on a scale of 0-10. As well as providing a more objective measure of the severity of the reaction, the continuous nature of the index score enables more powerful statistical analysis of the data than was previously possible through clinically derived categories of non-, mild, moderate, and severe reactions.
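A minimal sketch of deriving such an index from a synthetic matrix of standardized measurements: the first principal component score is rescaled to the 0-10 range.

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.standard_normal((309, 13))      # synthetic standardized clinical measures

Xc = X - X.mean(axis=0)
_, _, vt = np.linalg.svd(Xc, full_matrices=False)
pc1 = Xc @ vt[0]                        # first principal component scores

index = 10 * (pc1 - pc1.min()) / (pc1.max() - pc1.min())   # rescale to 0-10
print(index[:5].round(2))
```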
Zhang, Xiaoshuai; Yang, Xiaowei; Yuan, Zhongshang; Liu, Yanxun; Li, Fangyu; Peng, Bin; Zhu, Dianwen; Zhao, Jinghua; Xue, Fuzhong
2013-01-01
For genome-wide association data analysis, two genes in any pathway, two SNPs in the two linked gene regions respectively or in the two linked exons respectively within one gene are often correlated with each other. We therefore proposed the concept of gene-gene co-association, which refers to the effects not only due to the traditional interaction under nearly independent condition but the correlation between two genes. Furthermore, we constructed a novel statistic for detecting gene-gene co-association based on Partial Least Squares Path Modeling (PLSPM). Through simulation, the relationship between traditional interaction and co-association was highlighted under three different types of co-association. Both simulation and real data analysis demonstrated that the proposed PLSPM-based statistic has better performance than single SNP-based logistic model, PCA-based logistic model, and other gene-based methods. PMID:23620809
NASA Astrophysics Data System (ADS)
Guimarães Nobre, Gabriela; Arnbjerg-Nielsen, Karsten; Rosbjerg, Dan; Madsen, Henrik
2016-04-01
Traditionally, flood risk assessment studies have been carried out from a univariate frequency analysis perspective. However, statistical dependence between hydrological variables, such as extreme rainfall and extreme sea surge, is plausible, since both variables are to some extent driven by common meteorological conditions. To overcome this limitation, multivariate statistical techniques have the potential to combine different sources of flooding in the investigation. The aim of this study was to apply a range of statistical methodologies for analyzing combined extreme hydrological variables that can lead to coastal and urban flooding. The study area is the Elwood Catchment, a highly urbanized catchment located in the city of Port Phillip, Melbourne, Australia. The first part of the investigation dealt with the marginal extreme value distributions. Two approaches to extracting extreme value series were applied (Annual Maximum and Partial Duration Series), and different probability distribution functions were fit to the observed samples. Results obtained using the Generalized Pareto distribution demonstrate the ability of the Pareto family to model the extreme events. Advancing into multivariate extreme value analysis, an investigation of the asymptotic properties of extremal dependence was first carried out. As weak positive asymptotic dependence between the bivariate extreme pairs was found, the conditional method proposed by Heffernan and Tawn (2004) was chosen. This approach is suitable for modeling bivariate extreme values that are relatively unlikely to occur together. The results show that the probability of an extreme sea surge occurring during an extreme one-hour precipitation event (or vice versa) can be twice as great as would be estimated assuming independent events. Presuming independence between these two variables would therefore result in severe underestimation of the flooding risk in the study area.
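A minimal sketch of the marginal step, fitting a Generalized Pareto distribution to threshold exceedances of a synthetic rainfall series; the bivariate conditional model of Heffernan and Tawn is not reproduced here.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
rain = rng.gamma(2.0, 5.0, 20_000)      # synthetic hourly rainfall intensities

u = np.quantile(rain, 0.99)             # high threshold
exceed = rain[rain > u] - u             # partial-duration series of exceedances

shape, loc, scale = stats.genpareto.fit(exceed, floc=0)
q_hi = u + stats.genpareto.ppf(0.99, shape, loc=0, scale=scale)  # high tail quantile
print(shape, scale, q_hi)
```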
1988-09-01
Report documentation page (OCR-garbled; only fragments are recoverable): "...and selection of test waves. Measured prototype wave data on which a comprehensive statistical analysis of wave conditions could be based were..."; "Tests, existing conditions. Prior to testing of the various improvement plans, comprehensive tests were conducted for existing conditions (Plate 1..."
Statistical Downscaling of WRF-Chem Model: An Air Quality Analysis over Bogota, Colombia
NASA Astrophysics Data System (ADS)
Kumar, Anikender; Rojas, Nestor
2015-04-01
Statistical downscaling is a technique used to extract high-resolution information from regional-scale variables produced by coarse-resolution models such as Chemical Transport Models (CTMs). The fully coupled WRF-Chem (Weather Research and Forecasting with Chemistry) model is used to simulate air quality over Bogota. Bogota is a tropical Andean megacity located on a high-altitude plateau in the middle of very complex terrain. The WRF-Chem model was adopted for simulating the hourly ozone concentrations. The computational domains comprised 120x120x32, 121x121x32, and 121x121x32 grid points with horizontal resolutions of 27, 9, and 3 km, respectively. The model was initialized with real boundary conditions using NCAR-NCEP's Final Analysis (FNL) at a 1°x1° (~111 km x 111 km) resolution. Boundary conditions were updated every 6 hours using reanalysis data. The emission rates were obtained from global inventories, namely the REanalysis of the TROpospheric (RETRO) chemical composition and the Emission Database for Global Atmospheric Research (EDGAR). Multiple linear regression and artificial neural network techniques are used to downscale the model output at each monitoring station. The results confirm that the statistically downscaled outputs reduce simulated errors by up to 25%. This study provides a general overview of statistical downscaling of chemical transport models and can serve as a reference for future air quality modeling exercises over Bogota and other Colombian cities.
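A minimal sketch of the multiple-linear-regression downscaling step, with synthetic coarse-model predictors and station observations standing in for the WRF-Chem output and monitors.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 500
# Synthetic predictors from the coarse model at the grid cell nearest a station:
model_o3 = rng.normal(30, 8, n)
temp = rng.normal(15, 4, n)
wind = rng.gamma(2.0, 1.5, n)
obs_o3 = 0.7 * model_o3 + 0.9 * temp - 1.2 * wind + rng.normal(0, 3, n)

X = np.column_stack([np.ones(n), model_o3, temp, wind])
beta, *_ = np.linalg.lstsq(X, obs_o3, rcond=None)   # MLR coefficients
downscaled = X @ beta                                # station-level estimate
print(beta.round(2))
```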
ANN based Performance Evaluation of BDI for Condition Monitoring of Induction Motor Bearings
NASA Astrophysics Data System (ADS)
Patel, Raj Kumar; Giri, V. K.
2017-06-01
One of the critical parts in rotating machines is the bearing, and most failures arise from defective bearings. Bearing failure leads to failure of a machine and unpredicted productivity loss. Therefore, bearing fault detection and prognosis is an integral part of preventive maintenance procedures. In this paper, vibration signals for four conditions of a deep-groove ball bearing, normal (N), inner race defect (IRD), ball defect (BD), and outer race defect (ORD), were acquired from a customized bearing test rig under four different operating conditions and three different fault sizes. Two approaches were adopted for statistical feature extraction from the vibration signal. In the first approach, the raw signal is used for statistical feature extraction, and in the second approach the statistical features are extracted based on a bearing damage index (BDI). The proposed BDI technique uses a wavelet packet node energy analysis method. Both sets of features are used as inputs to an ANN classifier to evaluate its performance. A comparison of ANN performance is made based on raw vibration data and data chosen using the BDI. The ANN performance has been found to be noticeably higher when BDI-based signals were used as inputs to the classifier.
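A minimal sketch of wavelet packet node-energy features of the kind a BDI builds on, assuming the PyWavelets package and a synthetic vibration signal; the paper's exact damage-index formula is not reproduced.

```python
import numpy as np
import pywt  # PyWavelets

rng = np.random.default_rng(6)
t = np.linspace(0, 1, 4096)
signal = np.sin(2 * np.pi * 160 * t) + 0.3 * rng.standard_normal(t.size)

wp = pywt.WaveletPacket(data=signal, wavelet="db4", maxlevel=4)
nodes = wp.get_level(4, order="freq")           # terminal nodes, frequency order
energies = np.array([np.sum(node.data ** 2) for node in nodes])
rel_energy = energies / energies.sum()          # candidate features for the ANN
print(rel_energy.round(3))
```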
NASA Astrophysics Data System (ADS)
Yousif, Dilon
The purpose of this study was to improve the Quality Assurance (QA) system at the Nemak Windsor Aluminum Plant (WAP). The project used the Six Sigma method based on Define, Measure, Analyze, Improve, and Control (DMAIC). Analysis of the in-process melt at WAP was based on chemical, thermal, and mechanical testing. The control limits for the W319 Al alloy were statistically recalculated using the composition measured under stable conditions. The "Chemistry Viewer" software was developed for statistical analysis of alloy composition. This software features the Silicon Equivalency (SiBQ) developed by the IRC. The Melt Sampling Device (MSD) was designed and evaluated at WAP to overcome traditional sampling limitations. The Thermal Analysis "Filters" software was developed for cooling curve analysis of the 3XX Al alloy(s) using IRC techniques. The impact of low-melting-point impurities on the start of melting was evaluated using the Universal Metallurgical Simulator and Analyzer (UMSA).
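A minimal sketch of statistically recalculated control limits of the kind described, from synthetic composition measurements taken under stable conditions (classical Shewhart 3-sigma limits).

```python
import numpy as np

rng = np.random.default_rng(7)
si_pct = rng.normal(7.5, 0.12, 120)     # synthetic Si wt% under stable conditions

mean = si_pct.mean()
sigma = si_pct.std(ddof=1)
ucl, lcl = mean + 3 * sigma, mean - 3 * sigma   # Shewhart 3-sigma control limits
print(f"LCL={lcl:.2f}, mean={mean:.2f}, UCL={ucl:.2f}")
```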
Walden-Schreiner, Chelsey; Leung, Yu-Fai
2013-07-01
Ecological impacts associated with nature-based recreation and tourism can compromise park and protected area goals if left unrestricted. Protected area agencies are increasingly incorporating indicator-based management frameworks into their management plans to address visitor impacts. Development of indicators requires empirical evaluation of indicator measures and examining their ecological and social relevance. This study addresses the development of the informal trail indicator in Yosemite National Park by spatially characterizing visitor use in open landscapes and integrating use patterns with informal trail condition data to examine their spatial association. Informal trail and visitor use data were collected concurrently during July and August of 2011 in three, high-use meadows of Yosemite Valley. Visitor use was clustered at statistically significant levels in all three study meadows. Spatial data integration found no statistically significant differences between use patterns and trail condition class. However, statistically significant differences were found between the distance visitors were observed from informal trails and visitor activity type with active activities occurring closer to trail corridors. Gender was also found to be significant with male visitors observed further from trail corridors. Results highlight the utility of integrated spatial analysis in supporting indicator-based monitoring and informing management of open landscapes. Additional variables for future analysis and methodological improvements are discussed.
New Developments in the Embedded Statistical Coupling Method: Atomistic/Continuum Crack Propagation
NASA Technical Reports Server (NTRS)
Saether, E.; Yamakov, V.; Glaessgen, E.
2008-01-01
A concurrent multiscale modeling methodology that embeds a molecular dynamics (MD) region within a finite element (FEM) domain has been enhanced. The concurrent MD-FEM coupling methodology uses statistical averaging of the deformation of the atomistic MD domain to provide interface displacement boundary conditions to the surrounding continuum FEM region, which, in turn, generates interface reaction forces that are applied as piecewise constant traction boundary conditions to the MD domain. The enhancement is based on the addition of molecular dynamics-based cohesive zone model (CZM) elements near the MD-FEM interface. The CZM elements are a continuum interpretation of the traction-displacement relationships taken from MD simulations using Cohesive Zone Volume Elements (CZVE). The addition of CZM elements to the concurrent MD-FEM analysis provides a consistent set of atomistically based cohesive properties within the finite element region near the growing crack. Another set of CZVEs is then used to extract revised CZM relationships from the enhanced embedded statistical coupling method (ESCM) simulation of an edge crack under uniaxial loading.
Non-arbitrage in financial markets: A Bayesian approach for verification
NASA Astrophysics Data System (ADS)
Cerezetti, F. V.; Stern, Julio Michael
2012-10-01
The concept of non-arbitrage plays an essential role in finance theory. Under certain regularity conditions, the Fundamental Theorem of Asset Pricing states that, in non-arbitrage markets, prices of financial instruments are martingale processes. In this theoretical framework, the analysis of the statistical distributions of financial assets can assist in understanding how participants behave in the markets, and may or may not engender arbitrage conditions. Assuming an underlying Variance Gamma statistical model, this study aims to test, using the FBST - Full Bayesian Significance Test, if there is a relevant price difference between essentially the same financial asset traded at two distinct locations. Specifically, we investigate and compare the behavior of call options on the BOVESPA Index traded at (a) the Equities Segment and (b) the Derivatives Segment of BM&FBovespa. Our results seem to point out significant statistical differences. To what extent this evidence is actually the expression of perennial arbitrage opportunities is still an open question.
Uncertainty Quantification and Statistical Convergence Guidelines for PIV Data
NASA Astrophysics Data System (ADS)
Stegmeir, Matthew; Kassen, Dan
2016-11-01
As Particle Image Velocimetry has continued to mature, it has developed into a robust and flexible velocimetry technique used by expert and non-expert users alike. While historical estimates of PIV accuracy have typically relied heavily on "rules of thumb" and analysis of idealized synthetic images, increased emphasis has recently been placed on better quantifying real-world PIV measurement uncertainty. Multiple techniques have been developed to provide per-vector instantaneous uncertainty estimates for PIV measurements. Real-world experimental conditions often complicate the collection of "optimal" data, and the effect of these conditions is important to consider when planning an experimental campaign. The current work uses the results of PIV uncertainty quantification techniques to develop a framework in which PIV users apply estimated confidence intervals to compute reliable data-convergence criteria for optimal sampling of flow statistics. Results are compared using experimental and synthetic data, and recommended guidelines and procedures leveraging estimated PIV confidence intervals for efficient sampling toward converged statistics are provided.
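A minimal sketch of one such convergence criterion, on synthetic samples: the running confidence interval of the mean is compared with a user tolerance to decide when sampling can stop.

```python
import numpy as np

rng = np.random.default_rng(8)
u = rng.normal(5.0, 0.6, 5000)          # synthetic velocity samples (m/s)
tol = 0.02                              # desired 95% CI half-width (m/s)

for n in range(50, u.size, 50):
    half_width = 1.96 * u[:n].std(ddof=1) / np.sqrt(n)   # CI of the running mean
    if half_width < tol:
        print(f"converged at n={n}: mean={u[:n].mean():.3f} +/- {half_width:.3f}")
        break
```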
Chan, Y; Walmsley, R P
1997-12-01
When several treatment methods are available for the same problem, many clinicians are faced with the task of deciding which treatment to use. Many clinicians may have conducted informal "mini-experiments" on their own to determine which treatment is best suited for the problem. These results are usually not documented or reported in a formal manner because many clinicians feel that they are "statistically challenged." Another reason may be because clinicians do not feel they have controlled enough test conditions to warrant analysis. In this update, a statistic is described that does not involve complicated statistical assumptions, making it a simple and easy-to-use statistical method. This update examines the use of two statistics and does not deal with other issues that could affect clinical research such as issues affecting credibility. For readers who want a more in-depth examination of this topic, references have been provided. The Kruskal-Wallis one-way analysis-of-variance-by-ranks test (or H test) is used to determine whether three or more independent groups are the same or different on some variable of interest when an ordinal level of data or an interval or ratio level of data is available. A hypothetical example will be presented to explain when and how to use this statistic, how to interpret results using the statistic, the advantages and disadvantages of the statistic, and what to look for in a written report. This hypothetical example will involve the use of ratio data to demonstrate how to choose between using the nonparametric H test and the more powerful parametric F test.
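A minimal sketch of the H test with SciPy, using hypothetical outcomes for three independent treatment groups.

```python
from scipy import stats

# Hypothetical range-of-motion gains (degrees) under three treatments
treatment_a = [12, 15, 11, 18, 14]
treatment_b = [22, 19, 25, 21, 24]
treatment_c = [13, 16, 12, 17, 15]

h_stat, p_value = stats.kruskal(treatment_a, treatment_b, treatment_c)
print(h_stat, p_value)   # reject "all groups are the same" if p < alpha
```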
Rosen, G D
2006-06-01
Meta-analysis is a vague descriptor used to encompass very diverse methods of data collection analysis, ranging from simple averages to more complex statistical methods. Holo-analysis is a fully comprehensive statistical analysis of all available data and all available variables in a specified topic, with results expressed in a holistic factual empirical model. The objectives and applications of holo-analysis include software production for prediction of responses with confidence limits, translation of research conditions to praxis (field) circumstances, exposure of key missing variables, discovery of theoretically unpredictable variables and interactions, and planning future research. Holo-analyses are cited as examples of the effects on broiler feed intake and live weight gain of exogenous phytases, which account for 70% of variation in responses in terms of 20 highly significant chronological, dietary, environmental, genetic, managemental, and nutrient variables. Even better future accountancy of variation will be facilitated if and when authors of papers routinely provide key data for currently neglected variables, such as temperatures, complete feed formulations, and mortalities.
Flocculation kinetics and aggregate structure of kaolinite mixtures in laminar tube flow.
Vaezi G, Farid; Sanders, R Sean; Masliyah, Jacob H
2011-03-01
Flocculation is commonly used in various solid-liquid separation processes in chemical and mineral industries to separate desired products or to treat waste streams. This paper presents an experimental technique to study flocculation processes in laminar tube flow. This approach allows for more realistic estimation of the shear rate to which an aggregate is exposed, as compared to more complicated shear fields (e.g. stirred tanks). A direct sampling method is used to minimize the effect of sampling on the aggregate structure. A combination of aggregate settling velocity and image analysis was used to quantify the structure of the aggregate. Aggregate size, density, and fractal dimension were found to be the most important aggregate structural parameters. The two methods used to determine aggregate fractal dimension were in good agreement. The effects of advective flow through an aggregate's porous structure and transition-regime drag coefficient on the evaluation of aggregate density were considered. The technique was applied to investigate the flocculation kinetics and the evolution of the aggregate structure of kaolin particles with an anionic flocculant under conditions similar to those of oil sands fine tailings. Aggregates were formed using a well controlled two-stage aggregation process. Detailed statistical analysis was performed to investigate the establishment of dynamic equilibrium condition in terms of aggregate size and density evolution. An equilibrium steady state condition was obtained within 90 s of the start of flocculation; after which no further change in aggregate structure was observed. Although longer flocculation times inside the shear field could conceivably cause aggregate structure conformation, statistical analysis indicated that this did not occur for the studied conditions. The results show that the technique and experimental conditions employed here produce aggregates having a well-defined, reproducible structure. Copyright © 2011. Published by Elsevier Inc.
Mercer, Theresa G; Frostick, Lynne E; Walmsley, Anthony D
2011-10-15
This paper presents a statistical technique that can be applied to environmental chemistry data where missing values and limit-of-detection levels prevent the application of statistics. A working example is taken from an environmental leaching study that was set up to determine if there were significant differences in levels of leached arsenic (As), chromium (Cr), and copper (Cu) between lysimeters containing preservative-treated wood waste and those containing untreated wood. Fourteen lysimeters were set up and left in natural conditions for 21 weeks. The resultant leachate was analysed by ICP-OES to determine the As, Cr, and Cu concentrations. However, due to the variation inherent in each lysimeter combined with the limits of detection offered by ICP-OES, the collected quantitative data were somewhat incomplete. Initial data analysis was hampered by the number of 'missing values' in the data. To recover the dataset, the statistical tool of Statistical Multiple Imputation (SMI) was applied, and the data were re-analysed successfully. It was demonstrated that using SMI did not affect the variance in the data, but facilitated analysis of the complete dataset. Copyright © 2011 Elsevier B.V. All rights reserved.
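A minimal sketch of the multiple-imputation idea using scikit-learn's IterativeImputer (an assumption for illustration; the paper's SMI implementation is not specified here), on a synthetic leachate table with missing values.

```python
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

# Synthetic As/Cr/Cu concentrations (mg/L) with missing entries (np.nan)
data = np.array([
    [0.12, 0.40, np.nan],
    [0.15, np.nan, 0.90],
    [np.nan, 0.35, 0.85],
    [0.11, 0.42, 0.95],
])

imputations = []
for seed in range(5):                       # m = 5 imputed datasets
    imp = IterativeImputer(random_state=seed, sample_posterior=True)
    imputations.append(imp.fit_transform(data))

pooled = np.mean(imputations, axis=0)       # pooled point estimates
print(pooled.round(3))
```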
Moretti, Stefano; van Leeuwen, Danitsja; Gmuender, Hans; Bonassi, Stefano; van Delft, Joost; Kleinjans, Jos; Patrone, Fioravante; Merlo, Domenico Franco
2008-01-01
Background In gene expression analysis, statistical tests for differential gene expression provide lists of candidate genes having, individually, a sufficiently low p-value. However, the interpretation of each single p-value within complex systems involving several interacting genes is problematic. In parallel, over the last sixty years, game theory has been applied to political and social problems to assess the power of interacting agents in forcing a decision and, more recently, to represent the relevance of genes in response to certain conditions. Results In this paper we introduce a bootstrap procedure to test the null hypothesis that each gene has the same relevance between two conditions, where the relevance is represented by the Shapley value of a particular coalitional game defined on a microarray data-set. This method, called Comparative Analysis of Shapley value (CASh for short), is applied to data concerning gene expression in children differentially exposed to air pollution. The results provided by CASh are compared with the results from a parametric statistical test for differential gene expression. Both lists of genes provided by CASh and the t-test are informative enough to discriminate exposed subjects on the basis of their gene expression profiles. While many genes are selected in common by CASh and the parametric test, the biological interpretation of the differences between these two selections is more interesting, suggesting a different interpretation of the main biological pathways in gene expression regulation for exposed individuals. A simulation study suggests that CASh offers more power than the t-test for the detection of differential gene expression variability. Conclusion CASh is successfully applied to gene expression analysis of a data-set where the joint expression behavior of genes may be critical to characterize the expression response to air pollution. We demonstrate a synergistic effect between coalitional games and statistics that resulted in a selection of genes with a potential impact in the regulation of complex pathways. PMID:18764936
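A minimal sketch of Monte Carlo Shapley-value estimation, the quantity CASh bootstraps over; the characteristic function below is a toy stand-in, not the microarray game defined in the paper.

```python
import random

def shapley_mc(players, value, n_perm=2000, seed=0):
    """Monte Carlo Shapley values: average marginal contribution over permutations."""
    rng = random.Random(seed)
    phi = {p: 0.0 for p in players}
    for _ in range(n_perm):
        order = rng.sample(players, len(players))
        coalition, prev = set(), value(frozenset())
        for p in order:
            coalition.add(p)
            cur = value(frozenset(coalition))
            phi[p] += cur - prev
            prev = cur
    return {p: v / n_perm for p, v in phi.items()}

# Toy game: a coalition is worth 1 if it contains both genes g1 and g2
v = lambda s: 1.0 if {"g1", "g2"} <= s else 0.0
print(shapley_mc(["g1", "g2", "g3"], v))   # expect ~0.5, ~0.5, 0.0
```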
Dittmar, John C.; Pierce, Steven; Rothstein, Rodney; Reid, Robert J. D.
2013-01-01
Genome-wide experiments often measure quantitative differences between treated and untreated cells to identify affected strains. For these studies, statistical models are typically used to determine significance cutoffs. We developed a method termed “CLIK” (Cutoff Linked to Interaction Knowledge) that overlays biological knowledge from the interactome on screen results to derive a cutoff. The method takes advantage of the fact that groups of functionally related interacting genes often respond similarly to experimental conditions and, thus, cluster in a ranked list of screen results. We applied CLIK analysis to five screens of the yeast gene disruption library and found that it defined a significance cutoff that differed from traditional statistics. Importantly, verification experiments revealed that the CLIK cutoff correlated with the position in the rank order where the rate of true positives drops off significantly. In addition, the gene sets defined by CLIK analysis often provide further biological perspectives. For example, applying CLIK analysis retrospectively to a screen for cisplatin sensitivity allowed us to identify the importance of the Hrq1 helicase in DNA crosslink repair. Furthermore, we demonstrate the utility of CLIK to determine optimal treatment conditions by analyzing genome-wide screens at multiple rapamycin concentrations. We show that CLIK is an extremely useful tool for evaluating screen quality, determining screen cutoffs, and comparing results between screens. Furthermore, because CLIK uses previously annotated interaction data to determine biologically informed cutoffs, it provides additional insights into screen results, which supplement traditional statistical approaches. PMID:23589890
Quantifying predictability in a model with statistical features of the atmosphere
Kleeman, Richard; Majda, Andrew J.; Timofeyev, Ilya
2002-01-01
The Galerkin truncated inviscid Burgers equation has recently been shown by the authors to be a simple model with many degrees of freedom, with many statistical properties similar to those occurring in dynamical systems relevant to the atmosphere. These properties include long time-correlated, large-scale modes of low frequency variability and short time-correlated “weather modes” at smaller scales. The correlation scaling in the model extends over several decades and may be explained by a simple theory. Here a thorough analysis of the nature of predictability in the idealized system is developed by using a theoretical framework developed by R.K. This analysis is based on a relative entropy functional that has been shown elsewhere by one of the authors to measure the utility of statistical predictions precisely. The analysis is facilitated by the fact that most relevant probability distributions are approximately Gaussian if the initial conditions are assumed to be so. Rather surprisingly this holds for both the equilibrium (climatological) and nonequilibrium (prediction) distributions. We find that in most cases the absolute difference in the first moments of these two distributions (the “signal” component) is the main determinant of predictive utility variations. Contrary to conventional belief in the ensemble prediction area, the dispersion of prediction ensembles is generally of secondary importance in accounting for variations in utility associated with different initial conditions. This conclusion has potentially important implications for practical weather prediction, where traditionally most attention has focused on dispersion and its variability. PMID:12429863
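A minimal sketch of the Gaussian relative entropy underlying this utility measure, split into the "signal" (mean-difference) and dispersion terms discussed above; the numbers are hypothetical.

```python
import numpy as np

def gaussian_relative_entropy(mu_p, var_p, mu_c, var_c):
    """KL divergence of prediction N(mu_p, var_p) from climatology N(mu_c, var_c)."""
    dispersion = 0.5 * (np.log(var_c / var_p) + var_p / var_c - 1.0)
    signal = 0.5 * (mu_p - mu_c) ** 2 / var_c
    return signal + dispersion, signal, dispersion

total, sig, disp = gaussian_relative_entropy(mu_p=0.8, var_p=0.5,
                                             mu_c=0.0, var_c=1.0)
print(total, sig, disp)   # here the signal term dominates the utility
```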
Asymmetric statistical features of the Chinese domestic and international gold price fluctuation
NASA Astrophysics Data System (ADS)
Cao, Guangxi; Zhao, Yingchao; Han, Yan
2015-05-01
Analyzing the statistical features of fluctuation is remarkably significant for financial risk identification and measurement. In this study, the asymmetric detrended fluctuation analysis (A-DFA) method was applied to evaluate asymmetric multifractal scaling behaviors in the Shanghai and New York gold markets. Our findings showed that the multifractal features of the Chinese and international gold spot markets were asymmetric. The gold return series persisted longer in an increasing trend than in a decreasing trend. Moreover, the asymmetric degree of multifractals in the Chinese and international gold markets decreased with the increase in fluctuation range. In addition, the empirical analysis using sliding window technology indicated that multifractal asymmetry in the Chinese and international gold markets was characterized by its time-varying feature. However, the Shanghai and international gold markets basically shared a similar asymmetric degree evolution pattern. The American subprime mortgage crisis (2008) and the European debt crisis (2010) enhanced the asymmetric degree of the multifractal features of the Chinese and international gold markets. Furthermore, we also perform statistical tests for the results of multifractality and asymmetry, and discuss their origin. Finally, results of the empirical analysis using the threshold autoregressive conditional heteroskedasticity (TARCH) and exponential generalized autoregressive conditional heteroskedasticity (EGARCH) models exhibited that good news had a more significant effect on the cyclical fluctuation of the gold market than bad news. Moreover, good news exerted a more significant effect on the Chinese gold market than on the international gold market.
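A minimal sketch of standard, symmetric DFA, the procedure that A-DFA refines by conditioning on local trend direction; synthetic returns stand in for gold returns.

```python
import numpy as np

def dfa(x, scales, order=1):
    """Detrended fluctuation analysis: returns F(s) for each scale s."""
    profile = np.cumsum(x - x.mean())
    F = []
    for s in scales:
        n_seg = profile.size // s
        rms = []
        for i in range(n_seg):
            seg = profile[i * s:(i + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, order), t)
            rms.append(np.mean((seg - trend) ** 2))
        F.append(np.sqrt(np.mean(rms)))
    return np.array(F)

rng = np.random.default_rng(9)
returns = rng.standard_normal(10_000)           # synthetic return series
scales = np.array([16, 32, 64, 128, 256])
F = dfa(returns, scales)
hurst = np.polyfit(np.log(scales), np.log(F), 1)[0]
print(round(hurst, 2))                          # ~0.5 for uncorrelated noise
```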
Women's health and women's work in health services: what statistics tell us.
Hedman, B; Herner, E
1988-01-01
This article draws together statistical information in several broad areas that relate to women's health, women's reproductive activities and women's occupations in Sweden. The statistical analysis reflects the major changes that have occurred in Swedish society and that have had a major impact on the health and well-being, as well as on the social participation rate, of women. Much of the data is drawn from a recent special effort at Statistics Sweden aimed at influencing the classification, collection and presentation of statistical data in all fields in such a way that family, working, education, health and other conditions of women can be more readily and equitably compared with those of men. In addition, social changes have seen the shifting of the responsibility of health care from the unpaid duties of women in the home to health care institutions, where female employees predominate. These trends are also discussed.
Quantitative analysis of repertoire-scale immunoglobulin properties in vaccine-induced B-cell responses
Khavrutskii, Ilja V.; Chaudhury, Sidhartha
2017-05-10
Recoverable fragments (OCR-garbled source): "...repertoire-wide properties. Finally, through the use of appropriate statistical analyses, the repertoire profiles can be quantitatively compared..."; "...cell response to eVLP and quantitatively compare GC B-cell repertoires from immunization conditions. We partitioned the resulting clonotype..."
ERIC Educational Resources Information Center
Hidalgo, Mª Dolores; Gómez-Benito, Juana; Zumbo, Bruno D.
2014-01-01
The authors analyze the effectiveness of the R² and delta log odds ratio effect size measures when using logistic regression analysis to detect differential item functioning (DIF) in dichotomous items. A simulation study was carried out, and the Type I error rate and power estimates under conditions in which only statistical testing…
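A minimal sketch of the logistic-regression DIF comparison on synthetic item data, using statsmodels: nested models with and without group terms are compared by a likelihood-ratio test and a pseudo-R² difference (McFadden's here, as a stand-in for the measures studied).

```python
import numpy as np
import statsmodels.api as sm
from scipy import stats

rng = np.random.default_rng(10)
n = 2000
ability = rng.standard_normal(n)
group = rng.integers(0, 2, n)                    # 0 = reference, 1 = focal
logit = 1.2 * ability - 0.4 * group              # uniform DIF built into the data
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X0 = sm.add_constant(np.column_stack([ability]))
X1 = sm.add_constant(np.column_stack([ability, group, ability * group]))
m0 = sm.Logit(y, X0).fit(disp=0)                 # ability only
m1 = sm.Logit(y, X1).fit(disp=0)                 # + group and interaction

lr = 2 * (m1.llf - m0.llf)                       # likelihood-ratio statistic, df=2
p = stats.chi2.sf(lr, df=2)
print(lr, p, m1.prsquared - m0.prsquared)        # DIF test plus effect size
```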
On damage detection in wind turbine gearboxes using outlier analysis
NASA Astrophysics Data System (ADS)
Antoniadou, Ifigeneia; Manson, Graeme; Dervilis, Nikolaos; Staszewski, Wieslaw J.; Worden, Keith
2012-04-01
The proportion of worldwide installed wind power in power systems increases over the years as a result of the steadily growing interest in renewable energy sources. Still, the advantages offered by the use of wind power are overshadowed by the high operational and maintenance costs, resulting in the low competitiveness of wind power in the energy market. In order to reduce the costs of corrective maintenance, the application of condition monitoring to gearboxes becomes highly important, since gearboxes are among the wind turbine components with the most frequent failure observations. While condition monitoring of gearboxes in general is common practice, with various methods having been developed over the last few decades, wind turbine gearbox condition monitoring faces a major challenge: the detection of faults under the time-varying load conditions prevailing in wind turbine systems. Classical time and frequency domain methods fail to detect faults under variable load conditions, due to the temporary effect that these faults have on vibration signals. This paper uses the statistical discipline of outlier analysis for the damage detection of gearbox tooth faults. A simplified two-degree-of-freedom gearbox model considering nonlinear backlash, time-periodic mesh stiffness and static transmission error, simulates the vibration signals to be analysed. Local stiffness reduction is used for the simulation of tooth faults and statistical processes determine the existence of intermittencies. The lowest level of fault detection, the threshold value, is considered and the Mahalanobis squared-distance is calculated for the novelty detection problem.
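A minimal sketch of the Mahalanobis-distance novelty detection described above, on synthetic features: distances from the healthy-condition statistics are compared with a threshold learned from healthy data.

```python
import numpy as np

rng = np.random.default_rng(11)
healthy = rng.normal(0, 1, (500, 4))            # synthetic healthy-state features

mu = healthy.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(healthy, rowvar=False))

def mahalanobis_sq(x):
    d = x - mu
    return float(d @ cov_inv @ d)

# Novelty threshold: e.g. the 99th percentile of distances on healthy data
threshold = np.percentile([mahalanobis_sq(x) for x in healthy], 99)

faulty = np.array([2.5, -1.8, 2.2, -2.0])       # synthetic damaged-state vector
print(mahalanobis_sq(faulty) > threshold)       # True flags a potential fault
```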
Analysis of First-Term Attrition of Non-Prior Service High-Quality U.S. Army Male Recruits
1989-12-13
...the estimators. Under broad conditions (Hanushek, 1977), the maximum likelihood estimators are (a) consistent and (b) asymptotically efficient... Reference fragments: ...Diseases, Vol. 24, 1971, pp. 125-158; Hanushek, Eric A., and John E. Jackson, Statistical Methods for Social Scientists, Academic Press, New York, 1977.
Corn response to nitrogen is influenced by soil texture and weather
USDA-ARS?s Scientific Manuscript database
Soil properties and weather conditions are known to affect soil nitrogen (N) availability and plant N uptake. However, studies examining N response as affected by soil and weather sometimes give conflicting results. Meta-analysis is a statistical method for estimating treatment effects in a se...
The report considers information available in the late 1990s on the economic impacts of environmental regulations on the overall economic conditions in the US, including impacts on economic growth and competitiveness.
A multi-scale analysis of landscape statistics
Douglas H. Cain; Kurt H. Riitters; Kenneth Orvis
1997-01-01
It is now feasible to monitor some aspects of landscape ecological condition nationwide using remotely- sensed imagery and indicators of land cover pattern. Previous research showed redundancies among many reported pattern indicators and identified six unique dimensions of land cover pattern. This study tested the stability of those dimensions and representative...
ERIC Educational Resources Information Center
Ferrari, Pier Alda; Barbiero, Alessandro
2012-01-01
The increasing use of ordinal variables in different fields has led to the introduction of new statistical methods for their analysis. The performance of these methods needs to be investigated under a number of experimental conditions. Procedures to simulate from ordinal variables are then required. In this article, we deal with simulation from…
Forest land area estimates from vegetation continuous fields
Mark D. Nelson; Ronald E. McRoberts; Matthew C. Hansen
2004-01-01
The USDA Forest Service's Forest Inventory and Analysis (FIA) program provides data, information, and knowledge about our Nation's forest resources. FIA regional units collect data from field plots and remotely sensed imagery to produce statistical estimates of forest extent (area); volume, growth, and removals; and health and condition. There is increasing...
Afshari, Kasra; Samavati, Vahid; Shahidi, Seyed-Ahmad
2015-03-01
The effects of ultrasonic power, extraction time, extraction temperature, and the water-to-raw-material ratio on the extraction yield of crude polysaccharide from the leaf of Hibiscus rosa-sinensis (HRLP) were optimized by statistical analysis using response surface methodology. The response surface methodology (RSM) was used to optimize HRLP extraction yield by implementing a Box-Behnken design (BBD). The experimental data obtained were fitted to a second-order polynomial equation using multiple regression analysis and also analyzed by appropriate statistical methods (ANOVA). Analysis of the results showed that the linear and quadratic terms of these four variables had significant effects. The optimal conditions for the highest extraction yield of HRLP were: ultrasonic power, 93.59 W; extraction time, 25.71 min; extraction temperature, 93.18°C; and water-to-raw-material ratio, 24.3 mL/g. Under these conditions, the experimental yield was 9.66±0.18%, in close agreement with the value of 9.526% predicted by the model. The results demonstrated that HRLP had strong in vitro scavenging activities on DPPH and hydroxyl radicals. Copyright © 2014 Elsevier B.V. All rights reserved.
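A minimal sketch of the RSM fitting step: a second-order polynomial in coded factors estimated by least squares (two factors shown for brevity; the study used four).

```python
import numpy as np

rng = np.random.default_rng(12)
# Coded factor levels (-1, 0, +1) for, e.g., ultrasonic power and time
x1 = rng.choice([-1.0, 0.0, 1.0], 30)
x2 = rng.choice([-1.0, 0.0, 1.0], 30)
yield_pct = (9 + 0.8 * x1 + 0.5 * x2 - 0.6 * x1**2 - 0.4 * x2**2
             + 0.3 * x1 * x2 + rng.normal(0, 0.2, 30))   # synthetic response

# Second-order model: b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
beta, *_ = np.linalg.lstsq(X, yield_pct, rcond=None)
print(beta.round(2))   # the stationary point of the fitted surface is the optimum
```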
NASA Astrophysics Data System (ADS)
Lindsey, B.; McMahon, P.; Rupert, M.; Tesoriero, J.; Starn, J.; Anning, D.; Green, C.
2012-04-01
The U.S. Geological Survey National Water-Quality Assessment (NAWQA) Program was implemented in 1991 to provide long-term, consistent, and comparable information on the quality of surface and groundwater resources of the United States. Findings are used to support national, regional, state, and local information needs with respect to water quality. The three main goals of the program are to 1) assess the condition of the nation's streams, rivers, groundwater, and aquatic systems; 2) assess how conditions are changing over time; and 3) determine how natural features and human activities affect these conditions, and where those effects are most pronounced. As data collection progressed into the second decade, the emphasis of data interpretation has shifted from primarily understanding status to evaluating trends. The program has conducted national and regional evaluations of change in the quality of water in streams, rivers, groundwater, and health of aquatic systems. Evaluating trends in environmental systems requires complex analytical and statistical methods, and a periodic re-evaluation of the monitoring methods used to collect these data. Examples given herein summarize the lessons learned from the evaluation of changes in water quality during the past two decades, with an emphasis on the findings with respect to groundwater. The analysis of trends in groundwater is based on 56 well networks located in 22 principal aquifers of the United States. Analysis has focused on 3 approaches: 1) a statistical analysis of results of sampling over various time scales, 2) studies of factors affecting trends in groundwater quality, and 3) use of models to simulate groundwater trends and forecast future trends. Data collection for analysis of changes in groundwater quality has focused on decadal resampling of wells. The trends in groundwater quality and the factors affecting them have been studied using quarterly sampling, biennial sampling, and more recently continuous monitoring of selected parameters in a small number of wells. Models such as MODFLOW have been used for simulation and forecasting of future trends. Important outcomes from the groundwater-trends studies include issues involving statistics, sampling frequency, changes in laboratory analytical methods over time, the need for groundwater age-dating information, the value of understanding geochemical conditions and contaminant degradation, the need to understand groundwater-surface water interaction, and the value of modeling in understanding trends and forecasting potential future conditions. Statistically significant increases in chloride, dissolved solids, and nitrate concentrations were found in a large number of well networks over the first decadal sampling period. Statistically significant decreases of chloride, dissolved solids, and nitrate concentrations were found in a very small number of networks. Trends in surface water are analyzed within 8 major river basins within the United States with a focus on issues of regional importance. Examples of regional surface-water issues include an analysis of trends in dissolved solids in the Southeastern United States, trends in pesticides in the north-central United States, and trends in nitrate in the Mississippi River Basin. Evaluations of ecological indicators of water quality include temporal changes in stream habitat, and aquatic-invertebrate and fish assemblages.
[Organizational climate and burnout syndrome].
Lubrańska, Anna
2011-01-01
The paper addresses the issue of organizational climate and burnout syndrome. It has been assumed that burnout syndrome is dependent on work climate (organizational climate); therefore, two concepts were analyzed: that of D. Kolb (organizational climate) and that of Ch. Maslach (burnout syndrome). The research involved 239 persons (122 women, 117 men), aged 21-66. In the study, the Maslach Burnout Inventory (MBI) and the Inventory of Organizational Climate were used. The results of statistical methods (correlation analysis, one-variable analysis of variance and regression analysis) evidenced a strong relationship between organizational climate and burnout dimensions. As depicted by the results, there are important differences in the level of burnout between study participants who work in different types of organizational climate. The results of the statistical analyses indicate that the organizational climate determines burnout syndrome. Therefore, creating supportive conditions at the workplace might reduce the risk of burnout.
2017-08-30
as being three-fold: 1) a measurement of the integrity of both the central and peripheral visual processing centers; 2) an indicator of detail... visual assessment task integral to the Army's Class 1 Flight Physical (Ginsburg, 1981 and 1984; Bachman & Behar, 1986). During a Class 1 flight... systems. Meta-analysis has been defined as the statistical analysis of a collection of analytical results for the purpose of integrating the findings
Wavelet and receiver operating characteristic analysis of heart rate variability
NASA Astrophysics Data System (ADS)
McCaffery, G.; Griffith, T. M.; Naka, K.; Frennaux, M. P.; Matthai, C. C.
2002-02-01
Multiresolution wavelet analysis has been used to study the heart rate variability in two classes of patients with different pathological conditions. The scale-dependent measure of Thurner et al. was found to be statistically significant in discriminating patients suffering from hypertrophic cardiomyopathy from a control set of normal subjects. We have performed Receiver Operating Characteristic (ROC) analysis and found the ROC area to be a useful measure by which to label the significance of the discrimination, as well as to describe the severity of heart dysfunction.
Intelligent Performance Analysis with a Natural Language Interface
NASA Astrophysics Data System (ADS)
Juuso, Esko K.
2017-09-01
Performance improvement is taken as the primary goal in asset management. Advanced data analysis is needed to efficiently integrate condition monitoring data into operation and maintenance. Intelligent stress and condition indices have been developed for control and condition monitoring by combining generalized norms with efficient nonlinear scaling. These nonlinear scaling methodologies can also be used to handle performance measures used for management, since management-oriented indicators can be presented in the same scale as intelligent condition and stress indices. Performance indicators are responses of the process, machine or system to the stress contributions analyzed from process and condition monitoring data. Scaled values are directly used in intelligent temporal analysis to calculate fluctuations and trends. All these methodologies can be used in prognostics and fatigue prediction. The meanings of the variables are beneficial in extracting expert knowledge and representing information in natural language. The idea of dividing the problems into the variable-specific meanings and the directions of interactions provides various improvements for performance monitoring and decision making. The integrated temporal analysis and uncertainty processing facilitates the efficient use of domain expertise. Measurements can be monitored with generalized statistical process control (GSPC) based on the same scaling functions.
Statistical analysis of multivariate atmospheric variables. [cloud cover
NASA Technical Reports Server (NTRS)
Tubbs, J. D.
1979-01-01
Topics covered include: (1) estimation in discrete multivariate distributions; (2) a procedure to predict cloud cover frequencies in the bivariate case; (3) a program to compute conditional bivariate normal parameters; (4) the transformation of nonnormal multivariate data to near-normal; (5) a test of fit for the extreme value distribution based upon the generalized minimum chi-square; (6) a test of fit for continuous distributions based upon the generalized minimum chi-square; (7) the effect of correlated observations on confidence sets based upon chi-square statistics; and (8) generation of random variates from specified distributions.
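Item (3) above has a compact closed form. A minimal sketch, assuming a bivariate normal pair with known means, standard deviations and correlation (the numbers below are illustrative, not from the report):

```python
# A minimal sketch of item (3): conditional bivariate normal parameters.
import math

def conditional_normal(mu1, mu2, sigma1, sigma2, rho, x2):
    """Mean and std. dev. of X1 | X2 = x2 for a bivariate normal pair."""
    cond_mean = mu1 + rho * (sigma1 / sigma2) * (x2 - mu2)
    cond_sd = sigma1 * math.sqrt(1.0 - rho**2)
    return cond_mean, cond_sd

# Illustrative values with correlation 0.6.
print(conditional_normal(mu1=5.0, mu2=10.0, sigma1=2.0, sigma2=3.0, rho=0.6, x2=13.0))
# -> (6.2, 1.6): the conditional mean shifts toward x2, the spread shrinks.
```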
LP-search and its use in analysis of the accuracy of control systems with acoustical models
NASA Technical Reports Server (NTRS)
Sergeyev, V. I.; Sobol, I. M.; Statnikov, R. B.; Statnikov, I. N.
1973-01-01
The LP-search is proposed as an analog of the Monte Carlo method for finding values in nonlinear statistical systems. It is concluded that, to attain the required accuracy in solving the problem of control for a statistical system, the LP-search requires a considerably smaller number of tests than the Monte Carlo method. The LP-search also allows multiple repetitions of tests under identical conditions and observability of the output variables of the system.
NASA Astrophysics Data System (ADS)
Müller, M. F.; Thompson, S. E.
2015-09-01
The prediction of flow duration curves (FDCs) in ungauged basins remains an important task for hydrologists given the practical relevance of FDCs for water management and infrastructure design. Predicting FDCs in ungauged basins typically requires spatial interpolation of statistical or model parameters. This task is complicated if climate becomes non-stationary, as the prediction challenge now also requires extrapolation through time. In this context, process-based models for FDCs that mechanistically link the streamflow distribution to climate and landscape factors may have an advantage over purely statistical methods to predict FDCs. This study compares a stochastic (process-based) and statistical method for FDC prediction in both stationary and non-stationary contexts, using Nepal as a case study. Under contemporary conditions, both models perform well in predicting FDCs, with Nash-Sutcliffe coefficients above 0.80 in 75 % of the tested catchments. The main drivers of uncertainty differ between the models: parameter interpolation was the main source of error for the statistical model, while violations of the assumptions of the process-based model represented the main source of its error. The process-based approach performed better than the statistical approach in numerical simulations with non-stationary climate drivers. The predictions of the statistical method under non-stationary rainfall conditions were poor if (i) local runoff coefficients were not accurately determined from the gauge network, or (ii) streamflow variability was strongly affected by changes in rainfall. A Monte Carlo analysis shows that the streamflow regimes in catchments characterized by a strong wet-season runoff and a rapid, strongly non-linear hydrologic response are particularly sensitive to changes in rainfall statistics. In these cases, process-based prediction approaches are strongly favored over statistical models.
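For readers who want the mechanics, the two quantities at the heart of this comparison, an empirical flow duration curve and the Nash-Sutcliffe coefficient, can be sketched in a few lines of Python. The streamflow series and "model" below are synthetic stand-ins, not the Nepal data:

```python
# Hedged sketch: empirical FDC and Nash-Sutcliffe efficiency on synthetic flows.
import numpy as np

def flow_duration_curve(q):
    """Return (exceedance probability, sorted flows) via the Weibull plotting position."""
    q_sorted = np.sort(q)[::-1]                    # descending flows
    n = len(q_sorted)
    p_exceed = np.arange(1, n + 1) / (n + 1.0)     # P(Q > q)
    return p_exceed, q_sorted

def nash_sutcliffe(obs, sim):
    """NSE = 1 - SSE / variance of observations (1 is a perfect fit)."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

rng = np.random.default_rng(0)
q_obs = rng.lognormal(mean=1.0, sigma=0.8, size=365)   # synthetic daily flows
q_sim = q_obs * rng.normal(1.0, 0.1, size=365)         # a "model" with 10% noise

p, q = flow_duration_curve(q_obs)
print("Q exceeded 95% of the time:", q[np.searchsorted(p, 0.95)].round(2))
print("NSE of the synthetic model:", nash_sutcliffe(q_obs, q_sim).round(3))
```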
Khan, Mohammad Jakir Hossain; Hussain, Mohd Azlan; Mujtaba, Iqbal Mohammed
2014-01-01
Polypropylene is one type of plastic that is widely used in everyday life. This study focuses on the identification and justification of the optimum process parameters for polypropylene production in a novel pilot-plant-based fluidized bed reactor. This first-of-its-kind statistical modeling with experimental validation for the process parameters of polypropylene production was conducted by applying the analysis of variance (ANOVA) method to response surface methodology (RSM). Three important process variables, i.e., reaction temperature, system pressure and hydrogen percentage, were considered as the important input factors for polypropylene production in the analysis performed. In order to examine the effect of process parameters and their interactions, the ANOVA method was utilized among a range of other statistical diagnostic tools, such as the correlation between actual and predicted values, the residuals and predicted response, outlier t plot, and 3D response surface and contour analysis plots. The statistical analysis showed that the proposed quadratic model had a good fit with the experimental results. At optimum conditions, with a temperature of 75°C, system pressure of 25 bar and hydrogen percentage of 2%, the highest polypropylene production obtained is 5.82% per pass. Hence it is concluded that the developed experimental design and proposed model can be successfully employed with over a 95% confidence level for optimum polypropylene production in a fluidized bed catalytic reactor (FBCR). PMID:28788576
2008-01-01
The causal feedback by which urban neighborhood conditions shape human health experiences, which in turn shape neighborhood conditions through a complex causal web, raises a challenge for traditional epidemiological causal analyses. This article introduces the loop analysis method, and builds on a core loop model linking neighborhood property vacancy rate, resident depressive symptoms, rate of neighborhood death, and rate of neighborhood exit in a feedback network. I justify and apply loop analysis to the specific example of depressive symptoms and abandoned urban residential property to show how inquiries into the behavior of causal systems can answer different kinds of hypotheses, and thereby complement those of causal modeling using statistical models. Neighborhood physical conditions that are only indirectly influenced by depressive symptoms may nevertheless manifest in the mental health experiences of their residents; conversely, neighborhood physical conditions may be a significant mental health risk for the population of neighborhood residents. I find that participatory greenspace programs are likely to produce adaptive responses in depressive symptoms and different neighborhood conditions, which are different in character from non-participatory greenspace interventions. PMID:17706851
pcr: an R package for quality assessment, analysis and testing of qPCR data
Ahmed, Mahmoud
2018-01-01
Background: Real-time quantitative PCR (qPCR) is a broadly used technique in biomedical research. Currently, a few different analysis models are used to determine the quality of data and to quantify the mRNA level across experimental conditions. Methods: We developed an R package to implement methods for quality assessment, analysis and testing of qPCR data for statistical significance. Double Delta CT and standard curve models were implemented to quantify the relative expression of target genes from CT values in standard qPCR control-group experiments. In addition, calculation of amplification efficiencies and curves from serial dilution qPCR experiments is used to assess the quality of the data. Finally, two-group testing and linear models were used to test for significance of the difference in expression between control groups and conditions of interest. Results: Using two datasets from qPCR experiments, we applied different quality assessment, analysis and statistical testing methods in the pcr package and compared the results to the original published articles. The final relative expression values from the different models, as well as the intermediary outputs, were checked against the expected results in the original papers and were found to be accurate and reliable. Conclusion: The pcr package provides an intuitive and unified interface for its main functions to allow biologists to perform all necessary steps of qPCR analysis and produce graphs in a uniform way. PMID:29576953
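The arithmetic behind the package's Double Delta CT model is compact. The sketch below restates it in Python with invented Ct values; the pcr package itself is R, so this is not its API, just the 2^-ΔΔCT calculation it implements:

```python
# A minimal sketch of the double Delta CT arithmetic (Ct values are invented).
import numpy as np

def delta_delta_ct(ct_target_ctrl, ct_ref_ctrl, ct_target_trt, ct_ref_trt):
    """Relative expression 2^-(ddCt) of a target gene, normalized to a
    reference gene and calibrated to the control group."""
    d_ct_ctrl = np.mean(ct_target_ctrl) - np.mean(ct_ref_ctrl)
    d_ct_trt = np.mean(ct_target_trt) - np.mean(ct_ref_trt)
    dd_ct = d_ct_trt - d_ct_ctrl
    return 2.0 ** (-dd_ct)

# Three technical replicates per condition (hypothetical Ct values).
rel = delta_delta_ct(ct_target_ctrl=[30.1, 30.3, 30.2], ct_ref_ctrl=[17.0, 17.1, 16.9],
                     ct_target_trt=[27.9, 28.1, 28.0], ct_ref_trt=[16.9, 17.0, 17.1])
print(f"fold change relative to control: {rel:.2f}")  # ~4.6-fold up
```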
Fasoula, S; Zisi, Ch; Sampsonidis, I; Virgiliou, Ch; Theodoridis, G; Gika, H; Nikitas, P; Pappa-Louisi, A
2015-03-27
In the present study, a series of 45 metabolite standards belonging to four chemically similar metabolite classes (sugars, amino acids, nucleosides and nucleobases, and amines) was subjected to LC analysis on three HILIC columns under 21 different gradient conditions, with the aim of exploring whether the retention properties of these analytes are determined by the chemical group to which they belong. Two multivariate techniques, principal component analysis (PCA) and discriminant analysis (DA), were used for statistical evaluation of the chromatographic data and extraction of similarities between chemically related compounds. The total variance explained by the first two principal components of PCA was found to be about 98%, whereas both statistical analyses indicated that all analytes are successfully grouped in four clusters of chemical structure based on the retention obtained in four, or at least three, chromatographic runs, which, however, should be performed on two different HILIC columns. Moreover, leave-one-out cross-validation of the above retention data set showed that the chemical group to which an analyte belongs can be 95.6% correctly predicted when the analyte is subjected to LC analysis under the same four or three experimental conditions under which the full set of analytes was run beforehand. That, in turn, may assist with disambiguation of analyte identification in complex biological extracts. Copyright © 2015 Elsevier B.V. All rights reserved.
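As a rough illustration of this PCA-plus-discriminant-analysis workflow with leave-one-out cross-validation, here is a scikit-learn sketch on randomly generated stand-in "retention" data; the class structure, sizes and noise levels are assumptions, not the study's measurements:

```python
# Sketch of PCA + discriminant analysis with leave-one-out cross-validation
# on synthetic stand-in data (four "chemical classes", four "runs" as features).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(1)
n_per_class, n_runs = 12, 4
centers = rng.normal(0, 3, size=(4, n_runs))     # 4 chemical classes
X = np.vstack([c + rng.normal(0, 0.5, size=(n_per_class, n_runs)) for c in centers])
y = np.repeat(np.arange(4), n_per_class)

pca = PCA(n_components=2).fit(X)
print("variance explained by first two PCs:", pca.explained_variance_ratio_.sum().round(3))

# Leave-one-out classification accuracy of the discriminant model.
acc = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=LeaveOneOut()).mean()
print("LOO-CV accuracy:", acc.round(3))
```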
Recurrence interval analysis of trading volumes
NASA Astrophysics Data System (ADS)
Ren, Fei; Zhou, Wei-Xing
2010-06-01
We study the statistical properties of the recurrence intervals τ between successive trading volumes exceeding a certain threshold q. The recurrence interval analysis is carried out for the 20 liquid Chinese stocks covering a period from January 2000 to May 2009, and two Chinese indices from January 2003 to April 2009. Similar to the recurrence interval distribution of the price returns, the tail of the recurrence interval distribution of the trading volumes follows a power-law scaling, and the results are verified by the goodness-of-fit tests using the Kolmogorov-Smirnov (KS) statistic, the weighted KS statistic and the Cramér-von Mises criterion. The measurements of the conditional probability distribution and the detrended fluctuation function show that both short-term and long-term memory effects exist in the recurrence intervals between trading volumes. We further study the relationship between trading volumes and price returns based on the recurrence interval analysis method. It is found that large trading volumes are more likely to occur following large price returns, and the comovement between trading volumes and price returns is more pronounced for large trading volumes.
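The core computation, intervals between successive exceedances of a threshold q, is straightforward to sketch. The synthetic series below is i.i.d., so its intervals will not show the memory effects or power-law tail reported for real volumes; the code only illustrates the mechanics, and the Hill-style tail estimate is a crude stand-in for the paper's goodness-of-fit tests:

```python
# Sketch of recurrence-interval extraction on a synthetic "volume" series.
import numpy as np

def recurrence_intervals(series, q):
    """Intervals (in samples) between successive values exceeding threshold q."""
    idx = np.flatnonzero(np.asarray(series) > q)
    return np.diff(idx)

rng = np.random.default_rng(2)
volumes = rng.pareto(3.0, size=20000)           # heavy-tailed stand-in for volumes
q = np.quantile(volumes, 0.95)                  # threshold at the 95th percentile
tau = recurrence_intervals(volumes, q)

# Crude tail-exponent estimate for the interval distribution (Hill-style).
tail = np.sort(tau)[-200:].astype(float)        # 200 largest intervals, ascending
alpha = 1.0 / np.mean(np.log(tail / tail[0]))
print("mean interval:", tau.mean().round(1), "tail exponent estimate:", alpha.round(2))
```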
Zhang, Y; Li, D D; Chen, X W
2017-06-20
Objective: To compare, in a case-control design, the speech discrimination of patients with unilateral microtia and external auditory canal atresia against normal-hearing subjects in quiet and noisy environments, to understand the speech recognition of patients with unilateral external auditory canal atresia, and to provide a scientific basis for early clinical intervention. Method: Twenty patients with unilateral congenital microtia malformation combined with external auditory canal atresia were compared with 20 age-matched normal subjects as a control group. All subjects were tested with Mandarin speech audiometry material to obtain speech discrimination scores (SDS) in quiet and noisy environments in a sound field. Result: There was no significant difference in speech discrimination scores between the two groups under the quiet condition. There was a statistically significant difference when the speech signal was on the affected side and noise on the normal side (single syllable, double syllable, statements; S/N=0 and S/N=-10) (P<0.05). There was no significant difference in speech discrimination scores when the speech signal was on the normal side and noise on the affected side. There was a statistically significant difference when the signal and noise were on the same side for one-syllable word recognition (S/N=0 and S/N=-5) (P<0.05), while double-syllable words and statements showed no statistically significant difference (P>0.05). Conclusion: The speech discrimination scores of patients with unilateral congenital microtia malformation and external auditory canal atresia under noisy conditions are lower than those of normal subjects. Copyright© by the Editorial Department of Journal of Clinical Otorhinolaryngology Head and Neck Surgery.
Sando, Roy; Chase, Katherine J.
2017-03-23
A common statistical procedure for estimating streamflow statistics at ungaged locations is to develop a relational model between streamflow and drainage basin characteristics at gaged locations using least squares regression analysis; however, least squares regression methods are parametric and make constraining assumptions about the data distribution. The random forest regression method provides a nonparametric alternative for estimating streamflow characteristics at ungaged sites and requires that the data meet fewer statistical conditions than least squares regression methods. Random forest regression analysis was used to develop predictive models for 89 streamflow characteristics using Precipitation-Runoff Modeling System simulated streamflow data and drainage basin characteristics at 179 sites in central and eastern Montana. The predictive models were developed from streamflow data simulated for current (baseline, water years 1982–99) conditions and three future periods (water years 2021–38, 2046–63, and 2071–88) under three different climate-change scenarios. These predictive models were then used to predict streamflow characteristics for baseline conditions and the three future periods at 1,707 fish sampling sites in central and eastern Montana. The average root mean square error for all predictive models was about 50 percent. When streamflow predictions at 23 fish sampling sites were compared to nearby locations with simulated data, the mean relative percent difference was about 43 percent. When predictions were compared to streamflow data recorded at 21 U.S. Geological Survey streamflow-gaging stations outside of the calibration basins, the average mean absolute percent error was about 73 percent.
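A minimal sketch of the random-forest step, predicting a streamflow characteristic from basin characteristics with scikit-learn; the five "basin characteristics" and the target below are randomly generated assumptions, not the Montana dataset:

```python
# Hedged sketch of random-forest regression for a streamflow statistic.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n = 179                                            # number of simulated sites
basin = rng.uniform(size=(n, 5))                   # e.g., area, slope, precip, ...
q95 = 10 * basin[:, 0] + 5 * basin[:, 2] ** 2 + rng.normal(0, 0.5, n)

X_tr, X_te, y_tr, y_te = train_test_split(basin, q95, random_state=0)
rf = RandomForestRegressor(n_estimators=500, random_state=0).fit(X_tr, y_tr)

pred = rf.predict(X_te)
rmse = np.sqrt(np.mean((pred - y_te) ** 2))
print("RMSE:", rmse.round(3), "feature importances:", rf.feature_importances_.round(2))
```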
Statistical errors in molecular dynamics averages
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schiferl, S.K.; Wallace, D.C.
1985-11-15
A molecular dynamics calculation produces a time-dependent fluctuating signal whose average is a thermodynamic quantity of interest. The average of the kinetic energy, for example, is proportional to the temperature. A procedure is described for determining when the molecular dynamics system is in equilibrium with respect to a given variable, according to the condition that the mean and the bandwidth of the signal should be sensibly constant in time. Confidence limits for the mean are obtained from an analysis of a finite length of the equilibrium signal. The role of serial correlation in this analysis is discussed. The occurrence of unstable behavior in molecular dynamics data is noted, and a statistical test for a level shift is described.
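One standard way to obtain confidence limits for the mean of a serially correlated signal, in the spirit of the analysis described, is block averaging. A sketch on a synthetic AR(1) "signal" (the blocking scheme and parameters are illustrative assumptions, not the report's procedure):

```python
# Sketch: standard error of the mean of a correlated signal via block averaging.
import numpy as np

rng = np.random.default_rng(4)
n, phi = 50000, 0.9
e = rng.normal(size=n)
signal = np.empty(n)
signal[0] = e[0]
for t in range(1, n):                       # AR(1): strong serial correlation
    signal[t] = phi * signal[t - 1] + e[t]
signal += 10.0                              # shift to a "temperature-like" level

def block_mean_error(x, n_blocks=50):
    """Std. error of the mean from block averages (accounts for correlation)."""
    blocks = np.array_split(x, n_blocks)
    means = np.array([b.mean() for b in blocks])
    return means.std(ddof=1) / np.sqrt(n_blocks)

naive = signal.std(ddof=1) / np.sqrt(n)     # ignores serial correlation
print("mean:", signal.mean().round(3))
print("naive s.e.:", naive.round(4), "block s.e.:", block_mean_error(signal).round(4))
```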
Evaluation of Skylab IB sensitivity to on-pad winds with turbulence
NASA Technical Reports Server (NTRS)
Coffin, T.
1972-01-01
Computer simulation was performed to estimate displacements and bending moments experienced by the SKYLAB 1B vehicle on the launch pad due to atmospheric winds. The vehicle was assumed to be a beam-like structure represented by a finite number of generalized coordinates. Wind flow across the vehicle was treated as a nonhomogeneous, stationary random process. Response computations were performed under the assumption of simple strip theory and by application of generalized harmonic analysis. Displacement and bending moment statistics were obtained for six vehicle propellant loading conditions and four representative reference wind profiles and turbulence levels. Means, variances and probability distributions are presented graphically for each case. A separate analysis was performed to indicate the influence of wind gradient variations on vehicle response statistics.
Vadapalli, Sriharsha Babu; Atluri, Kaleswararao; Putcha, Madhu Sudhan; Kondreddi, Sirisha; Kumar, N. Suman; Tadi, Durga Prasad
2016-01-01
Objectives: This in vitro study was designed to compare polyvinyl-siloxane (PVS) monophase and polyether (PE) monophase materials under dry and moist conditions for properties such as surface detail reproduction, dimensional stability, and gypsum compatibility. Materials and Methods: Surface detail reproduction was evaluated using two criteria. Dimensional stability was evaluated according to American Dental Association (ADA) specification no. 19. Gypsum compatibility was assessed by two criteria. All the samples were evaluated, and the data obtained were analyzed by two-way analysis of variance (ANOVA) and Pearson's Chi-square tests. Results: When surface detail reproduction was evaluated with a modification of ADA specification no. 19, the two groups showed no statistically significant difference under either condition. When evaluated macroscopically, the two groups showed a statistically significant difference. Results for dimensional stability showed that the deviation from standard differed significantly between the two groups, with the Aquasil group showing significantly more deviation than the Impregum group (P < 0.001). The two conditions also differed significantly, with the moist condition showing significantly more deviation than the dry condition (P < 0.001). Gypsum compatibility, evaluated with a modification of ADA specification no. 19 and by grading the casts for both groups under the two conditions, showed no statistically significant difference. Conclusion: Regarding dimensional stability, both Impregum and Aquasil performed better in the dry condition than in the moist; Impregum performed better than Aquasil in both conditions. When tested for surface detail reproduction according to the ADA specification, under dry and moist conditions both performed almost equally. When tested by macroscopic evaluation, Impregum and Aquasil performed significantly better in the dry condition than in the moist condition. In the dry condition, both materials performed almost equally; in the moist condition, Aquasil performed significantly better than Impregum. Regarding gypsum compatibility according to the ADA specification, in the dry condition both materials performed almost equally, and in the moist condition Aquasil performed better than Impregum. When tested by macroscopic evaluation, Impregum performed better than Aquasil in both conditions. PMID:27583217
Metrological analysis of a virtual flowmeter-based transducer for cryogenic helium
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arpaia, P., E-mail: pasquale.arpaia@unina.it; Technology Department, European Organization for Nuclear Research; Girone, M., E-mail: mario.girone@cern.ch
2015-12-15
The metrological performance of a virtual flowmeter-based transducer for monitoring helium under cryogenic conditions is assessed. To this end, an uncertainty model of the transducer, mainly based on a valve model exploiting a finite-element approach and a virtual flowmeter model based on the Sereg-Schlumberger method, is presented. The models are validated experimentally on a case study for helium monitoring in cryogenic systems at the European Organization for Nuclear Research (CERN). The impact of uncertainty sources on the transducer's metrological performance is assessed by a sensitivity analysis based on statistical experiment design and analysis of variance. In this way, the uncertainty sources most influencing the metrological performance of the transducer are singled out over the input range as a whole, at varying operating and setting conditions. This analysis turns out to be important for CERN cryogenics operation because the metrological design of the transducer is validated, and its components and working conditions with critical specifications for future improvements are identified.
Potentiation Following Ballistic and Nonballistic Complexes: The Effect of Strength Level.
Suchomel, Timothy J; Sato, Kimitake; DeWeese, Brad H; Ebben, William P; Stone, Michael H
2016-07-01
Suchomel, TJ, Sato, K, DeWeese, BH, Ebben, WP, and Stone, MH. Potentiation following ballistic and nonballistic complexes: the effect of strength level. J Strength Cond Res 30(7): 1825-1833, 2016. The purpose of this study was to compare the temporal profile of strong and weak subjects during ballistic and nonballistic potentiation complexes. Eight strong (relative back squat = 2.1 ± 0.1 times body mass) and 8 weak (relative back squat = 1.6 ± 0.2 times body mass) males performed squat jumps immediately and every minute up to 10 minutes following potentiation complexes that included ballistic or nonballistic concentric-only half-squats (COHS) performed at 90% of their 1 repetition maximum COHS. Jump height (JH) and allometrically scaled peak power (PPa) were compared using a series of 2 × 12 repeated measures analyses of variance. No statistically significant strength level main effects for JH (p = 0.442) or PPa (p = 0.078) existed during the ballistic condition. In contrast, statistically significant main effects for time existed for both JH (p = 0.014) and PPa (p < 0.001); however, no statistically significant pairwise comparisons were present (p > 0.05). Statistically significant strength level main effects existed for PPa (p = 0.039) but not for JH (p = 0.137) during the nonballistic condition. Post hoc analysis revealed that the strong subjects produced statistically greater PPa than the weaker subjects (p = 0.039). Statistically significant time main effects existed for PPa (p = 0.015), but not for JH (p = 0.178). No statistically significant strength level × time interaction effects for JH (p = 0.319) or PPa (p = 0.203) were present for the ballistic or nonballistic conditions. Practical significance indicated by effect sizes and the relationships between maximum potentiation and relative strength suggest that stronger subjects potentiate earlier and to a greater extent than weaker subjects during ballistic and nonballistic potentiation complexes.
New methods in hydrologic modeling and decision support for culvert flood risk under climate change
NASA Astrophysics Data System (ADS)
Rosner, A.; Letcher, B. H.; Vogel, R. M.; Rees, P. S.
2015-12-01
Assessing culvert flood vulnerability under climate change poses an unusual combination of challenges. We seek a robust method of planning for an uncertain future, and therefore must consider a wide range of plausible future conditions. Culverts in our case study area, northwestern Massachusetts, USA, are predominantly found in small, ungaged basins. The need to predict flows both at numerous sites and under numerous plausible climate conditions requires a statistical model with low data and computational requirements. We present a statistical streamflow model that is driven by precipitation and temperature, allowing us to predict flows without reliance on reference gages of observed flows. The hydrological analysis is used to determine each culvert's risk of failure under current conditions. We also explore the hydrological response to a range of plausible future climate conditions. These results are used to determine the tolerance of each culvert to future increases in precipitation. In a decision support context, current flood risk as well as tolerance to potential climate changes are used to provide a robust assessment and prioritization for culvert replacements.
Siddiqua, Shaila; Mamun, Abdullah Al; Enayetul Babar, Sheikh Md
2015-01-01
Renewable biodiesels are needed as an alternative to petroleum-derived transport fuels, which contribute to global warming and are of limited availability. Algae biomass is a potential source of renewable energy, and it can be converted into energy carriers such as biofuels. This study introduces an integrated method for the production of biodiesel from Chara vulgaris algae collected from the coastal region of Bangladesh. The Box-Behnken design based on response surface methods (RSM) was used as the statistical tool to optimize three variables for predicting the best performing conditions (calorific value and yield) of algae biodiesel. The three production-condition parameters were chloroform (X1), sodium chloride concentration (X2) and temperature (X3). Optimal conditions were estimated with the aid of statistical regression analysis and surface plot charts. The optimal biodiesel production parameters for 12 g of dry algae biomass were observed to be 198 ml chloroform with 0.75% sodium chloride at 65 °C, where the calorific value of the biodiesel is 9255.106 kcal/kg and the yield 3.6 ml.
Wear behavior of AA 5083/SiC nano-particle metal matrix composite: Statistical analysis
NASA Astrophysics Data System (ADS)
Hussain Idrisi, Amir; Ismail Mourad, Abdel-Hamid; Thekkuden, Dinu Thomas; Christy, John Victor
2018-03-01
This paper reports a study on the statistical analysis of the wear characteristics of AA5083/SiC nanocomposite. Aluminum matrix composites with different wt% (0%, 1% and 2%) of SiC nanoparticles were fabricated using the stir casting route. The developed composites were used in the manufacturing of spur gears, on which the study was conducted. A specially designed test rig was used to test the wear performance of the gears. The wear was investigated under different conditions of applied load (10 N, 20 N, and 30 N) and operation time (30 min, 60 min, 90 min, and 120 min). The analysis was carried out at room temperature under a constant speed of 1450 rpm. The wear parameters were optimized using Taguchi's method, with an L27 orthogonal array selected for the analysis of the output. Furthermore, analysis of variance (ANOVA) was used to investigate the influence of applied load, operation time and SiC wt% on wear behaviour. The wear resistance was analyzed by selecting the "smaller is better" characteristic as the objective of the model. From this research, it is observed that operation time and SiC wt% have the most significant effect on the wear performance, followed by the applied load.
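The "smaller is better" objective reduces to a signal-to-noise ratio of S/N = -10 log10(mean(y^2)) per factor setting. A minimal sketch with invented wear measurements (the factor settings in the comments are hypothetical, not taken from the L27 array):

```python
# Sketch of Taguchi's "smaller is better" signal-to-noise ratio for ranking
# factor settings; the wear measurements below are invented.
import numpy as np

def sn_smaller_is_better(y):
    """S/N = -10 log10(mean(y^2)); larger S/N means less (and steadier) wear."""
    y = np.asarray(y, float)
    return -10.0 * np.log10(np.mean(y ** 2))

# Wear (mg) for three replicate gear tests at two hypothetical settings.
setting_a = [4.2, 4.5, 4.1]     # e.g., 1 wt% SiC, 10 N, 30 min
setting_b = [6.8, 7.4, 7.0]     # e.g., 0 wt% SiC, 30 N, 120 min
print("S/N (A):", round(sn_smaller_is_better(setting_a), 2))
print("S/N (B):", round(sn_smaller_is_better(setting_b), 2))  # lower => worse
```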
Robust Strategy for Rocket Engine Health Monitoring
NASA Technical Reports Server (NTRS)
Santi, L. Michael
2001-01-01
Monitoring the health of rocket engine systems is essentially a two-phase process. The acquisition phase involves sensing physical conditions at selected locations, converting physical inputs to electrical signals, conditioning the signals as appropriate to establish scale or filter interference, and recording results in a form that is easy to interpret. The inference phase involves analysis of results from the acquisition phase, comparison of analysis results to established health measures, and assessment of health indications. A variety of analytical tools may be employed in the inference phase of health monitoring. These tools can be separated into three broad categories: statistical, rule based, and model based. Statistical methods can provide excellent comparative measures of engine operating health. They require well-characterized data from an ensemble of "typical" engines, or "golden" data from a specific test assumed to define the operating norm in order to establish reliable comparative measures. Statistical methods are generally suitable for real-time health monitoring because they do not deal with the physical complexities of engine operation. The utility of statistical methods in rocket engine health monitoring is hindered by practical limits on the quantity and quality of available data. This is due to the difficulty and high cost of data acquisition, the limited number of available test engines, and the problem of simulating flight conditions in ground test facilities. In addition, statistical methods incur a penalty for disregarding flow complexity and are therefore limited in their ability to define performance shift causality. Rule based methods infer the health state of the engine system based on comparison of individual measurements or combinations of measurements with defined health norms or rules. This does not mean that rule based methods are necessarily simple. Although binary yes-no health assessment can sometimes be established by relatively simple rules, the causality assignment needed for refined health monitoring often requires an exceptionally complex rule base involving complicated logical maps. Structuring the rule system to be clear and unambiguous can be difficult, and the expert input required to maintain a large logic network and associated rule base can be prohibitive.
[Mathematical modeling for conditionality of cardiovascular disease by housing conditions].
Meshkov, N A
2014-01-01
The influence of living conditions (housing area per capita; availability of housing water supply, sewerage and central heating) on the morbidity of cardiovascular diseases in child and adult populations was studied. Using regression analysis, the morbidity rate was shown to decrease significantly with increasing housing area; the constructed models are statistically significant (p = 0.01 and p = 0.02, respectively). A relationship was also revealed between the cardiovascular morbidity rate in children and adults and the provision of central heating in housing (p = 0.02 and p = 0.009).
Demonstration of Wavelet Techniques in the Spectral Analysis of Bypass Transition Data
NASA Technical Reports Server (NTRS)
Lewalle, Jacques; Ashpis, David E.; Sohn, Ki-Hyeon
1997-01-01
A number of wavelet-based techniques for the analysis of experimental data are developed and illustrated. A multiscale analysis based on the Mexican hat wavelet is demonstrated as a tool for acquiring physical and quantitative information not obtainable by standard signal analysis methods. Experimental data for the analysis came from simultaneous hot-wire velocity traces in a bypass transition of the boundary layer on a heated flat plate. A pair of traces (two components of velocity) at one location was excerpted. A number of ensemble and conditional statistics related to dominant time scales for energy and momentum transport were calculated. The analysis revealed a lack of energy-dominant time scales inside turbulent spots but identified transport-dominant scales inside spots that account for the largest part of the Reynolds stress. Momentum transport was much more intermittent than were energetic fluctuations. This work is the first step in a continuing study of the spatial evolution of these scale-related statistics, the goal being to apply the multiscale analysis results to improve the modeling of transitional and turbulent industrial flows.
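A multiscale Mexican-hat analysis of this kind can be sketched by direct convolution; the intermittent test signal below is synthetic, and the wavelet length and scales are illustrative choices, not the study's:

```python
# Sketch of a multiscale Mexican-hat (Ricker) wavelet analysis on a
# synthetic intermittent signal; the hot-wire data themselves are not shown.
import numpy as np

def mexican_hat(n_points, scale):
    """Mexican hat wavelet: normalized second derivative of a Gaussian."""
    t = np.arange(n_points) - (n_points - 1) / 2.0
    u = (t / scale) ** 2
    return (2.0 / (np.sqrt(3.0 * scale) * np.pi ** 0.25)) * (1.0 - u) * np.exp(-u / 2.0)

def cwt_mexican_hat(signal, scales):
    """Continuous wavelet transform by direct convolution, one row per scale."""
    return np.array([np.convolve(signal, mexican_hat(min(10 * s, len(signal)), s),
                                 mode="same") for s in scales])

rng = np.random.default_rng(5)
t = np.linspace(0, 1, 2048)
signal = np.sin(40 * np.pi * t) * (t > 0.5) + 0.1 * rng.normal(size=t.size)  # a "spot"
coeffs = cwt_mexican_hat(signal, scales=[2, 4, 8, 16, 32])
energy = (coeffs ** 2).mean(axis=1)           # scale-wise mean energy
print("dominant scale index:", int(np.argmax(energy)))
```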
Statistical Compression for Climate Model Output
NASA Astrophysics Data System (ADS)
Hammerling, D.; Guinness, J.; Soh, Y. J.
2017-12-01
Numerical climate model simulations run at high spatial and temporal resolutions generate massive quantities of data. As our computing capabilities continue to increase, storing all of the data is not sustainable, and thus it is important to develop methods for representing the full datasets by smaller compressed versions. We propose a statistical compression and decompression algorithm based on storing a set of summary statistics as well as a statistical model describing the conditional distribution of the full dataset given the summary statistics. We decompress the data by computing conditional expectations and conditional simulations from the model given the summary statistics. Conditional expectations represent our best estimate of the original data but are subject to oversmoothing in space and time. Conditional simulations introduce realistic small-scale noise so that the decompressed fields are neither too smooth nor too rough compared with the original data. Considerable attention is paid to accurately modeling the original dataset (one year of daily mean temperature data), particularly with regard to the inherent spatial nonstationarity in global fields, and to determining the statistics to be stored, so that the variation in the original data can be closely captured, while allowing for fast decompression and conditional emulation on modest computers.
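A toy version of the compress/decompress cycle, assuming Gaussian variation within blocks and using block means and standard deviations as the stored summary statistics (a far simpler model than the nonstationary spatial one described):

```python
# Toy sketch: compress a series to block summary statistics, then decompress
# by conditional expectation and by conditional simulation.
import numpy as np

rng = np.random.default_rng(6)
data = np.cumsum(rng.normal(size=3650)) * 0.05 + 15.0   # stand-in "temperature"

block = 30
n_blocks = data.size // block
trimmed = data[: n_blocks * block].reshape(n_blocks, block)

# Compression: keep one mean and one std. dev. per block (summary statistics).
means, sds = trimmed.mean(axis=1), trimmed.std(axis=1, ddof=1)

# Decompression 1: conditional expectation (smooth, block-wise constant).
smooth = np.repeat(means, block)

# Decompression 2: conditional simulation adds back small-scale noise whose
# block means are forced to match the stored summaries.
noise = rng.normal(size=(n_blocks, block))
noise -= noise.mean(axis=1, keepdims=True)          # condition on stored means
sim = (means[:, None] + sds[:, None] * noise).ravel()

rmse = np.sqrt(np.mean((sim - data[: sim.size]) ** 2))
print("compression ratio:", round(data.size / (2.0 * n_blocks), 1), "RMSE:", rmse.round(3))
```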
NASA Technical Reports Server (NTRS)
Perkins, Porter J
1955-01-01
A statistical survey and a preliminary analysis are made of icing data collected from scheduled flights over the United States and Canada from November 1951 to June 1952 by airline aircraft equipped with NACA pressure-type icing-rate meters. This interim report presents information obtained from a continuing program sponsored by the NACA with the cooperation of the airlines. An analysis of over 600 icing encounters logged by three airlines operating in the United States, one operating in Canada and one operating up the coast to Alaska, is presented. The icing conditions encountered provided relative frequencies of many icing-cloud variables, such as horizontal extent, vertical thickness, temperatures, icing rate, liquid-water content, and total ice accumulation. Liquid-water contents were higher than data from earlier research flights in layer-type clouds but slightly lower than previous data from cumulus clouds. Broken-cloud conditions, indicated by intermittent icing, accounted for nearly one-half of all the icing encounters. About 90 percent of the encounters did not exceed a distance of 120 miles, and continuous icing did not exceed 50 miles for 90 percent of the unbroken conditions. Icing cloud thicknesses measured during climbs and descents were less than 4500 feet for 90 percent of the vertical cloud traverses.
Synthesis-Structure-Activity Relationships in Co3O4 Catalyzed CO Oxidation
NASA Astrophysics Data System (ADS)
Mingle, Kathleen; Lauterbach, Jochen
2018-05-01
In this work, a statistical design and analysis platform was used to develop cobalt oxide based oxidation catalysts prepared via one-pot metal salt reduction. An emphasis was placed upon understanding the effects of synthesis conditions, such as heating regimen and Co2+ concentration, on the metal salt reduction mechanism, the resultant nanomaterial properties (i.e., size, crystal structure, and crystal faceting), and the catalytic activity in CO oxidation. This was accomplished by carrying out XRD, TEM, and FTIR studies on synthesis intermediates and products. Additionally, high-throughput experimentation was employed to study the performance of Co3O4 oxidation catalysts over a wide range of reaction conditions using a 16-channel fixed bed reactor equipped with a parallel infrared imaging system. Specifically, Co3O4 nanomaterials of varying properties were evaluated for their performance as CO oxidation catalysts. Figures of merit, including light-off temperatures and activation energies, were measured and mapped back to the catalyst properties and synthesis conditions. Statistical analysis methods were used to elucidate significant property-activity relationships as well as the design rules relevant to the synthesis of active catalysts. It was found that CO oxidation light-off temperatures could be decreased to <90°C by utilizing the discovered synthesis-structure-activity relationships.
Koerner, Tess K; Zhang, Yang
2017-02-27
Neurophysiological studies are often designed to examine relationships between measures from different testing conditions, time points, or analysis techniques within the same group of participants. Appropriate statistical techniques that can take into account repeated measures and multivariate predictor variables are integral and essential to successful data analysis and interpretation. This work implements and compares conventional Pearson correlations and linear mixed-effects (LME) regression models using data from two recently published auditory electrophysiology studies. For the specific research questions in both studies, the Pearson correlation test is inappropriate for determining the strength of association between the behavioral responses for speech-in-noise recognition and the multiple neurophysiological measures, as the neural responses across listening conditions were simply treated as independent measures. In contrast, the LME models allow a systematic approach to incorporate both fixed-effect and random-effect terms to deal with the categorical grouping factor of listening conditions, between-subject baseline differences in the multiple measures, and the correlational structure among the predictor variables. Together, the comparative data demonstrate the advantages of, and the necessity for, applying mixed-effects models to properly account for the built-in relationships among the multiple predictor variables, which has important implications for proper statistical modeling and interpretation of human behavior in terms of neural correlates and biomarkers.
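To make the contrast concrete, here is a sketch using simulated repeated-measures data: a pooled Pearson correlation that ignores the grouping, versus a statsmodels mixed-effects model with random subject intercepts. All data-generating numbers are invented:

```python
# Sketch: pooled Pearson correlation vs. a linear mixed-effects model on
# simulated repeated-measures data with subject-specific baselines.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import pearsonr

rng = np.random.default_rng(7)
n_subj, conds = 20, ["quiet", "noise"]
rows = []
for s in range(n_subj):
    baseline = rng.normal(0, 2)                     # between-subject differences
    for c in conds:
        neural = baseline + (1.5 if c == "noise" else 0.0) + rng.normal(0, 0.5)
        behav = 0.8 * neural + rng.normal(0, 0.5)
        rows.append({"subject": s, "condition": c, "neural": neural, "behavior": behav})
df = pd.DataFrame(rows)

# Pooled correlation ignores that each subject contributes repeated measures.
r, p = pearsonr(df["neural"], df["behavior"])
print(f"pooled Pearson r = {r:.2f} (p = {p:.3g})")

# LME: fixed effects for neural response and condition, random subject intercepts.
model = smf.mixedlm("behavior ~ neural + condition", df, groups=df["subject"]).fit()
print(model.params.round(3))
```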
Climate Change Assessment of Precipitation in Tandula Reservoir System
NASA Astrophysics Data System (ADS)
Jaiswal, Rahul Kumar; Tiwari, H. L.; Lohani, A. K.
2018-02-01
Precipitation is the principal input of the hydrological cycle and, under widely accepted climate change, affects the availability of water at the spatial and temporal scales of a basin. The present study deals with statistical downscaling using the Statistical DownScaling Model (SDSM) for rainfall at five rain gauge stations (Ambagarh, Bhanpura, Balod, Chamra and Gondli) in the Tandula, Kharkhara and Gondli reservoir catchments of Chhattisgarh state, India, to forecast future rainfall in three different periods under SRES A1B and A2 climatic forcing conditions. In the analysis, twenty-six climatic variables obtained from the National Centers for Environmental Prediction were used and statistically tested for selection of best-fit predictors. Conditional-process-based statistical correlation was used to develop multiple linear relations; the calibration for the period 1981-1995 was tested with independent data from 1996-2003 for validation. The developed relations were further used to predict future rainfall scenarios for three different periods, 2020-2035 (FP-1), 2046-2064 (FP-2) and 2081-2100 (FP-3), and compared with monthly rainfalls during the base period (1981-2003) for each station and all three reservoir catchments. From the analysis, it has been found that most of the rain gauge stations and all three reservoir catchments may receive significantly less rainfall in the future. The Thiessen-polygon-based annual and seasonal rainfall for the different catchments confirmed a reduction of seasonal rainfall of 5.1 to 14.1% in the Tandula reservoir, 11-19.2% in the Kharkhara reservoir and 15.1-23.8% in the Gondli reservoir. The Gondli reservoir may be affected the most in terms of water availability in future prediction periods.
Banayot, Riyad G
2016-04-05
Eye diseases are important causes of medical consultations, with the spectrum varying across regions. This hospital-based descriptive study aimed to determine the profile of childhood eye conditions at St. John Eye Hospital, a tertiary hospital serving Hebron, Palestine. Files of all new patients less than 16 years old who presented to St. John Eye Hospital-Hebron between January 2013 and December 2013 were retrospectively reviewed. Age at presentation, sex, and clinical diagnosis were extracted from medical records. Data were stored and analyzed using Wizard data analysis version 1.6.0 by Evan Miller. The Chi-square test was used to compare variables, and a p value of less than 0.05 was considered statistically significant. We evaluated the records of 1102 patients, with a female:male ratio of 1:1.1. Patients aged 0-5 years were the largest group (40.2%). Refractive errors were the most common ocular disorders seen (31.6%), followed by conjunctival diseases (23.7%) and strabismus and amblyopia (13.8%). Refractive errors were recorded significantly more frequently (p < 0.001) in the 11-15 year age group. Within the conjunctival diseases category, conjunctivitis and dry eyes were more prominent, and statistically significantly so (p < 0.001), in the 6-10 year age group. Within the strabismus and amblyopia category, convergent strabismus was more common, and statistically significantly so, in the youngest age group (0-5 years). The most common causes of ocular morbidity are largely treatable or preventable. These results suggest the need for awareness campaigns and early intervention programs.
Safety Measures of L-Carnitine L-Tartrate Supplementation in Healthy Men.
ERIC Educational Resources Information Center
Rubin, Martyn R.; Volek, Jeff S.; Gomez, Ana L.; Ratamess, Nicholas A.; French, Duncan N.; Sharman, Matthew J.; Kraemer, William J.
2001-01-01
Examined the effects of ingesting the dietary supplement L-CARNIPURE on liver and renal function and blood hematology among healthy men. Analysis of blood samples indicated that there were no statistically significant differences between the L-CARNIPURE and placebo conditions for any variables examined, suggesting there are no safety concerns…
The Teacher Shortage: A Case of Wrong Diagnosis and Wrong Prescription.
ERIC Educational Resources Information Center
Ingersoll, Richard M.
2002-01-01
Investigates the possibility that the organizational characteristics and conditions of schools are driving teacher turnover. Analysis of data from the National Center for Education Statistics (NCES) indicates that the amount of turnover accounted for by retirement is relatively minor when compared with that associated with other factors such as…
Monitoring Urban Quality of Life: The Porto Experience
ERIC Educational Resources Information Center
Santos, Luis Delfim; Martins, Isabel
2007-01-01
This paper describes the monitoring system of the urban quality of life developed by the Porto City Council, a new tool being used to support urban planning and management. The two components of this system--a quantitative approach based on statistical indicators and a qualitative analysis based on the citizens' perceptions of the conditions of…
Modeling Conditional Probabilities in Complex Educational Assessments. CSE Technical Report.
ERIC Educational Resources Information Center
Mislevy, Robert J.; Almond, Russell; Dibello, Lou; Jenkins, Frank; Steinberg, Linda; Yan, Duanli; Senturk, Deniz
An active area in psychometric research is coordinated task design and statistical analysis built around cognitive models. Compared with classical test theory and item response theory, there is often less information from observed data about the measurement-model parameters. On the other hand, there is more information from the grounding…
ERIC Educational Resources Information Center
Hamaker, Ellen L.; Dolan, Conor V.; Molenaar, Peter C. M.
2005-01-01
Results obtained with interindividual techniques in a representative sample of a population are not necessarily generalizable to the individual members of this population. In this article the specific condition is presented that must be satisfied to generalize from the interindividual level to the intraindividual level. A way to investigate…
Negotiated Wages and Working Conditions in Ontario Hospitals: 1973.
ERIC Educational Resources Information Center
Ontario Dept. of Labour, Toronto. Research Branch.
This report is a statistical analysis of provisions in collective agreements covering approximately 38,000 full-time employees in 156 hospitals in the Province of Ontario. Part 1 consists of 56 tables giving information on the geographical distribution of hospital contracts, the unions that are party to them, their duration, and the sizes and…
ERIC Educational Resources Information Center
Wang, Shudong; Wang, Ning; Hoadley, David
2007-01-01
This study used confirmatory factor analysis (CFA) to examine the comparability of the National Nurse Aide Assessment Program (NNAAP[TM]) test scores across language and administration condition groups for calibration and validation samples that were randomly drawn from the same population. Fit statistics supported both the calibration and…
Du, Jiabi; Shen, Jian; Park, Kyeong; Wang, Ya Ping; Yu, Xin
2018-07-15
There are increasing concerns about the impact of worsened physical conditions on hypoxia in a variety of coastal systems, especially considering the influence of a changing climate. In this study, an empirical orthogonal function (EOF) analysis of the dissolved oxygen (DO) data for 1985-2012, a long-term numerical simulation of vertical exchange, and statistical analysis were applied to understand the underlying mechanisms for the variation of the DO condition in Chesapeake Bay. The three types of analysis consistently demonstrated that biological and physical conditions contribute equally to seasonal and interannual variations of the hypoxic condition in Chesapeake Bay. We found that the physical condition (vertical exchange + temperature) determines the spatial and seasonal pattern of hypoxia in the Bay. The EOF analysis showed that the first mode, which was highly related to the physical forcings and correlated with the summer hypoxia volume, can be well explained by seasonal and interannual variations of physical variables and biological activities, while the second mode is significantly correlated with the estuarine circulation and river discharge. The weakened vertical exchange and increased water temperature since the 1980s indicate a worsening of the physical condition over the past few decades. Under a changing climate (e.g., warming, accelerated sea-level rise, altered precipitation and wind patterns), Chesapeake Bay is likely to experience a further worsened physical condition, which will amplify the negative impact of anthropogenic inputs on eutrophication and consequently require more effort at nutrient reduction to improve the water quality condition in Chesapeake Bay.
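EOF analysis, as used in this study, amounts to principal component analysis of a space-time anomaly field. A minimal sketch via the singular value decomposition; the DO field below is synthetic and its dimensions are assumptions, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical DO field: 336 monthly snapshots (1985-2012) x 50 stations.
do = rng.normal(size=(336, 50))

anom = do - do.mean(axis=0)          # remove the time mean at each station
u, s, vt = np.linalg.svd(anom, full_matrices=False)

explained = s**2 / np.sum(s**2)      # variance fraction per mode
pc1 = u[:, 0] * s[0]                 # first principal component (time series)
eof1 = vt[0]                         # first spatial pattern
print(f"mode 1 explains {explained[0]:.1%} of the variance")
# In the study, the leading PC would be correlated with summer hypoxic
# volume and with physical forcings (vertical exchange, temperature).
```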
NASA Astrophysics Data System (ADS)
Vagge, Greta; Cutroneo, Laura; Gandolfi, Daniela; Ferretti, Gabriele; Scafidi, Davide; Capello, Marco
2018-05-01
A historical set of daily meteorological data collected at the Hanbury Botanical Gardens (Liguria, northwestern Italy) from 1900 to 1940 was recovered from five manually compiled registers. The data were digitised and statistically analysed to check their reliability and to study their trends and variations. In particular, air temperature, precipitation and their extreme values were considered, together with wind direction, sea state, sky conditions and relative humidity. The results show a decreasing trend in mean annual temperature of approximately 0.2 °C/decade due to a decrease in maximum air temperature. Annual cumulative precipitation increased by 65.2 mm/decade over the study period. The data analysis showed a summer temperature decrease in 1912 and a severe drought in 1921. Moreover, the years with the most days with extreme temperatures were associated with the negative phases of the North Atlantic Oscillation (NAO). SW winds prevailed during the study period. Sky conditions followed seasonal trends, while slight sea was the most frequent sea state.
Analysis of Acoustic Emission Parameters from Corrosion of AST Bottom Plate in Field Testing
NASA Astrophysics Data System (ADS)
Jomdecha, C.; Jirarungsatian, C.; Suwansin, W.
Field testing of aboveground storage tanks (ASTs) to monitor corrosion of the bottom plate is presented in this chapter. AE testing data from ten ASTs with different sizes, materials, and products were employed to monitor bottom plate condition. AE sensors of 30 and 150 kHz, on up to 24 channels including guard sensors, were used to monitor corrosion activity. Acoustic emission (AE) parameters were analyzed to explore the AE parameter patterns of occurring corrosion compared to laboratory results. Amplitude, count, duration, and energy were the main parameters of the analysis. A pattern recognition technique combined with statistical analysis was implemented to eliminate electrical and environmental noise. The results showed specific AE patterns of corrosion activity consistent with the empirical results. In addition, a planar location algorithm was utilized to locate the significant AE events from corrosion. Both the parameter patterns and the AE event locations can be used to interpret and locate corrosion activity. Finally, a basic statistical grading technique was used to evaluate the bottom plate condition of the ASTs.
Gama, Silvana Granado Nogueira da; Szwarcwald, Célia Landmann; Sabroza, Adriane Reis; Castelo Branco, Viviane; Leal, Maria do Carmo
2004-01-01
This study characterizes women receiving precarious prenatal care according to socio-demographic variables, the mother's reproductive history, family support, satisfaction with the pregnancy, and risk behavior during pregnancy. A total of 1,967 adolescents were interviewed in the immediate postpartum period in public and outsourced maternity hospitals in the City of Rio de Janeiro. The dependent variable was the number of prenatal appointments (0-3; 4-6; 7 or more). The statistical analysis aimed to test the hypothesis of homogeneity of proportions, including bi- and multivariate analysis, using multinomial logistic regression, in which the reference category for the response variable was 7 or more prenatal visits. Higher (and statistically significant) proportions of an insufficient number of prenatal visits (0-3) were associated with: precarious sanitation conditions; not living with the child's father; attempted abortion; and smoking, drinking, and/or drug use during pregnancy. The results strongly indicate that the mothers with the worst living conditions and risk behavior during pregnancy were the same ones who lacked access to prenatal care.
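The analysis described is a multinomial logistic regression with "7 or more visits" as the reference outcome category. A minimal sketch with statsmodels, which treats the lowest outcome code as the reference; the predictors and data below are synthetic placeholders:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 1967
# Hypothetical binary predictors: poor sanitation, not living with the
# child's father, attempted abortion, substance use during pregnancy.
X = sm.add_constant(rng.integers(0, 2, size=(n, 4)).astype(float))
# Outcome: prenatal visits coded 0 = "7 or more" (reference),
# 1 = "4-6", 2 = "0-3".
y = rng.integers(0, 3, size=n)

fit = sm.MNLogit(y, X).fit(disp=False)
print(fit.summary())          # coefficients are log-odds vs. the reference
print(np.exp(fit.params))     # odds ratios relative to 7+ visits
```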
Abdoli, Sherwin; Ho, Leon C; Zhang, Jevin W; Dong, Celia M; Lau, Condon; Wu, Ed X
2016-12-01
This study investigated neuroanatomical changes following long-term acoustic exposure at moderate sound pressure level (SPL) under passive conditions, without coupled behavioral training. The authors utilized diffusion tensor imaging (DTI) to detect morphological changes in white matter. DTIs from adult rats (n = 8) exposed to continuous acoustic stimulation at moderate SPL for 2 months were compared with DTIs from rats (n = 8) reared under standard acoustic conditions. Two distinct forms of DTI analysis were applied sequentially. First, DTI images were analyzed using voxel-based statistics, which revealed greater fractional anisotropy (FA) of the pyramidal tract and decreased FA of the tectospinal tract and trigeminothalamic tract in the exposed rats. Region-of-interest analysis confirmed (p < 0.05) that FA had increased in the pyramidal tract but did not show a statistically significant difference in the FA of the tectospinal or trigeminothalamic tract. The authors' results show that long-term, passive acoustic exposure at moderate SPL increases the organization of white matter in the pyramidal tract.
NASA Astrophysics Data System (ADS)
Ghannadpour, Seyyed Saeed; Hezarkhani, Ardeshir
2016-03-01
The U-statistic method is one of the most important structural methods for separating anomaly from background. It considers the locations of samples, carries out the statistical analysis of the data without judging from a geochemical point of view, and tries to separate subpopulations and determine anomalous areas. In the present study, to use the U-statistic method in a three-dimensional (3D) setting, the U-statistic is applied to the grades of two ideal test examples, taking the sample Z values (elevation) into account. This is the first time that the method has been applied under a 3D condition. To evaluate the performance of the 3D U-statistic method, and to compare the U-statistic with a non-structural method, the threshold-assessment method based on the median and standard deviation (MSD method) is applied to the same two test examples. Results show that the samples flagged as anomalous by the U-statistic method are more regular and less dispersed than those flagged by the MSD method, so that, based on the locations of the anomalous samples, their denser areas can be delineated as promising zones. Moreover, at a threshold of U = 0, the total misclassification error of the U-statistic method is much smaller than the total error of the x̄ + n × s criterion. Finally, a 3D model of the two test examples, separating anomaly from background using the 3D U-statistic method, is provided. The source code of a software program, developed in the MATLAB programming language to perform the calculations of the 3D U-spatial statistic method, is additionally provided. This software is compatible with all geochemical varieties and can be used in similar exploration projects.
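The MSD comparison method thresholds the grade at the sample mean plus a multiple of the standard deviation (the x̄ + n × s criterion). A minimal numpy sketch; the grades are synthetic and the multiplier n is illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical grades for 1000 samples: mostly background plus a
# small anomalous subpopulation.
grade = np.concatenate([rng.normal(1.0, 0.3, 950),    # background
                        rng.normal(3.0, 0.5, 50)])    # anomalous

n = 2  # illustrative multiplier; the paper compares against U = 0
threshold = grade.mean() + n * grade.std(ddof=1)      # x-bar + n * s
anomalous = grade > threshold
print(f"threshold = {threshold:.2f}, flagged = {anomalous.sum()} samples")
```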
NASA Astrophysics Data System (ADS)
Torres Torres, N. I.; Howard, J.; Padilla, I. Y.; Torres, P.; Cotto, I.; Irizarry, C.
2012-12-01
The karst system of northern Puerto Rico is the most productive aquifer on the island. It supplies freshwater for industrial, domestic and agricultural purposes, and contributes to the ecological integrity of the region. The same characteristics that make this a highly productive aquifer make its groundwater vulnerable to contamination. Of particular importance is contamination with chlorinated volatile organic compounds (CVOCs), which have been related to preterm birth problems. Extensive CVOC contamination has been observed in the north coast of Puerto Rico since the 1970s. The main purposes of this study are (1) to relate the water quality of wells and springs to the hydrogeological conditions in the north coast limestone aquifer of Puerto Rico, and (2) to perform a statistical analysis of the historical groundwater contamination in that area. To achieve these objectives, groundwater samples were collected from wells and springs during dry and wet seasons. Results show that trichloroethylene (TCE), tetrachloroethylene (PCE) and chloroform (TCM) are frequently detected in groundwater samples. More CVOCs are detected during the wet season than during the dry season, which is attributed to a greater capacity to flush stored contaminants during the wet season. Historical analysis of contamination in the north coast of Puerto Rico shows a high capacity of the aquifer to store and release contaminants. Future work will focus on the statistical analysis of the historical groundwater contamination data to understand the behavior of the contaminants under different hydrologic conditions.
NASA Astrophysics Data System (ADS)
Počakal, Damir; Štalec, Janez
In the continental part of Croatia, operational hail suppression has been conducted for more than 30 years. The current protected area is 25,177 km² and has about 492 hail suppression stations, which are managed from eight weather radar centres. This paper presents a statistical analysis of parameters connected with hail occurrence at hail suppression stations in the western part of the protected area in the 1981-2000 period. The analysis compares data from two periods with different intensities of hail suppression activity and was made as part of a project for the assessment of hail suppression efficiency in Croatia. Because of the disruption of the hail suppression system during the independence war in Croatia (1991-1995), a lack of rockets, and other objective circumstances, it is considered that the system could not operate properly in the 1991-2000 period. For that reason, hail suppression data for two periods were compared: the first period (1981-1990), characterised by full application of hail suppression technology, and the second period (1991-2000). The protected area was divided into quadrants (9×9 km), such that every quadrant has at least one hail suppression station and the intercomparison is more precise. Discriminant analysis was performed on the yearly values of each quadrant. These values included the number of cases with solid precipitation, hail damage, heavy hail damage, the number of active hail suppression stations, the number of days with solid precipitation, solid precipitation damage, heavy solid precipitation damage, and the number and duration of air traffic control bans. The discriminant analysis shows a significant difference between the two periods: the average values on the isolated discriminant function 1 are -0.36 standard deviations of all observations for the first period (1981-1990) and +0.23 for the second period (1991-2000). The analysis of all eight variables shows statistically significant differences in the number of hail suppression stations (which has a positive correlation) and in the number of cases with air traffic control bans, which, like all the other variables, has a negative correlation. The results of the statistical analysis for the two periods indicate a positive influence of the hail suppression system. A discriminant analysis made for three periods shows that those periods cannot be compared because of the short time spans, differences in hail suppression technology and working conditions, and possible differences in meteorological conditions. Therefore, neither the effectiveness nor the ineffectiveness of hail suppression operations, nor their efficiency, can be statistically proven. For an exact assessment of hail suppression effectiveness, it is necessary to develop a project that takes into consideration all the parameters used in previous such projects around the world, i.e., a hailpad polygon.
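The core computation here is a two-group linear discriminant analysis on eight yearly quadrant variables. A minimal sketch with scikit-learn; the data are synthetic and the group separation is illustrative, not the paper's result:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(3)
# Hypothetical yearly quadrant data: 8 variables per the abstract
# (solid-precipitation cases, hail damage, heavy hail damage, active
# stations, days with solid precipitation, damage, heavy damage,
# air-traffic-control bans), one row per quadrant-year.
X1 = rng.normal(0.0, 1.0, size=(300, 8))   # period 1981-1990
X2 = rng.normal(0.4, 1.0, size=(300, 8))   # period 1991-2000
X = np.vstack([X1, X2])
y = np.array([0] * 300 + [1] * 300)

lda = LinearDiscriminantAnalysis(n_components=1)
scores = lda.fit_transform(X, y).ravel()
# Group means on discriminant function 1, analogous to the reported
# -0.36 / +0.23 standard-deviation separation of the two periods.
print(scores[y == 0].mean(), scores[y == 1].mean())
```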
Exercise reduces depressive symptoms in adults with arthritis: Evidential value.
Kelley, George A; Kelley, Kristi S
2016-07-12
To determine whether evidential value exists that exercise reduces depression in adults with arthritis and other rheumatic conditions. Utilizing data derived from a prior meta-analysis of 29 randomized controlled trials comprising 2449 participants (1470 exercise, 979 control) with fibromyalgia, osteoarthritis, rheumatoid arthritis or systemic lupus erythematosus, a new method, P-curve, was utilized to assess evidentiary worth and to rule out selective reporting of statistically significant results regarding exercise and depression in adults with arthritis and other rheumatic conditions. Using the method of Stouffer, Z-scores were calculated to examine selective-reporting bias. An alpha (P) value < 0.05 was deemed statistically significant. In addition, the average power of the tests included in the P-curve, adjusted for publication bias, was calculated. Fifteen of 29 studies (51.7%) with exercise and depression results were statistically significant (P < 0.05), while none of the results were statistically significant with respect to exercise increasing depression in adults with arthritis and other rheumatic conditions. Right skew dismissing selective reporting was identified (Z = -5.28, P < 0.0001). In addition, the included studies did not lack evidential value (Z = 2.39, P = 0.99), nor did they lack evidential value and show P-hacking (Z = 5.28, P > 0.99). The relative frequencies of P-values were 66.7% at 0.01, 6.7% each at 0.02 and 0.03, 13.3% at 0.04 and 6.7% at 0.05. The average power of the tests included in the P-curve, corrected for publication bias, was 69%. Diagnostic plot results revealed that the observed power estimate was a better fit than the alternatives. The evidential value results provide additional support that exercise reduces depression in adults with arthritis and other rheumatic conditions.
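The right-skew test reported here pools the significant p-values with Stouffer's method after P-curve's rescaling. A minimal sketch of that idea; the p-values are invented for illustration (not the fifteen from the meta-analysis), and continuous exact p-values are assumed:

```python
import numpy as np
from scipy.stats import norm

# Hypothetical statistically significant p-values entering the P-curve.
p = np.array([0.001, 0.002, 0.004, 0.006, 0.010, 0.020])

# P-curve right-skew test: rescale each p < .05 to a "pp-value" on (0, 1),
# convert to z-scores, and pool with Stouffer's method.
pp = p / 0.05
z = norm.ppf(pp)
Z = z.sum() / np.sqrt(len(z))
print(f"Stouffer Z = {Z:.2f}, p = {norm.cdf(Z):.2g}")
# A significantly negative Z (right skew) indicates evidential value,
# as with the Z = -5.28 reported in the abstract.
```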
Crans, Gerald G; Shuster, Jonathan J
2008-08-15
The debate as to which statistical methodology is most appropriate for the analysis of the two-sample comparative binomial trial has persisted for decades. Practitioners who favor the conditional methods of Fisher, i.e., Fisher's exact test (FET), claim that only experimental outcomes containing the same amount of information should be considered when performing analyses; hence, the total number of successes should be fixed at its observed level in hypothetical repetitions of the experiment. Using conditional methods in clinical settings can pose interpretation difficulties, since results are derived using conditional sample spaces rather than the set of all possible outcomes. Perhaps more importantly from a clinical trial design perspective, this test can be too conservative, resulting in greater resource requirements and more subjects exposed to an experimental treatment. The actual significance level attained by FET (the size of the test) has not been reported in the statistical literature. Berger (J. R. Statist. Soc. D (The Statistician) 2001; 50:79-85) proposed assessing the conservativeness of conditional methods using p-value confidence intervals. In this paper we develop a numerical algorithm that calculates the size of FET for sample sizes, n, up to 125 per group at the two-sided significance level α = 0.05. Additionally, this numerical method is used to define, for each n, a new significance level α* = α + ε, where ε is a small positive number, such that the size of the test is as close as possible to the pre-specified α (0.05 for the current work) without exceeding it. Lastly, a sample size and power calculation example is presented, which demonstrates the statistical advantages of implementing the adjustment to FET (using α* instead of α) in the two-sample comparative binomial trial.
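The size of FET can be computed by enumerating the rejection region and maximizing the null rejection probability over the common success probability. A minimal brute-force sketch for small n (not the authors' algorithm, which scales to n = 125 per group):

```python
import numpy as np
from scipy.stats import binom, fisher_exact

def fet_size(n, alpha=0.05, p_grid=np.linspace(0.01, 0.99, 99)):
    """Attained size of the two-sided Fisher exact test for two
    independent Binomial(n, p) arms, by full enumeration."""
    # Rejection region: all (x1, x2) outcomes with FET p-value <= alpha.
    reject = [(x1, x2)
              for x1 in range(n + 1)
              for x2 in range(n + 1)
              if fisher_exact([[x1, n - x1], [x2, n - x2]])[1] <= alpha]
    # Size = sup over p of P(reject | H0: p1 = p2 = p).
    sizes = [sum(binom.pmf(x1, n, p) * binom.pmf(x2, n, p)
                 for x1, x2 in reject) for p in p_grid]
    return max(sizes)

print(fet_size(15))   # well below 0.05, illustrating FET's conservativeness
```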
NASA Astrophysics Data System (ADS)
O'Connor, Alison; Kirtman, Benjamin; Harrison, Scott; Gorman, Joe
2016-05-01
The US Navy faces several limitations when planning operations with regard to forecasting environmental conditions. Currently, mission analysis and planning tools rely heavily on short-term (less than a week) forecasts or long-term statistical climate products. However, newly available data in the form of weather forecast ensembles provide dynamical and statistical extended-range predictions that can yield more accurate forecasts if ensemble members are combined correctly. Charles River Analytics is designing the Climatological Observations for Maritime Prediction and Analysis Support Service (COMPASS), which performs data fusion over extended-range multi-model ensembles, such as the North American Multi-Model Ensemble (NMME), to produce a unified forecast for several weeks to several seasons into the future. We evaluated thirty years of forecasts, using machine learning to select predictions for an all-encompassing and superior forecast that can be used to inform the Navy's decision planning process.
A Flexible Approach for the Statistical Visualization of Ensemble Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Potter, K.; Wilson, A.; Bremer, P.
2009-09-29
Scientists are increasingly moving towards ensemble data sets to explore relationships present in dynamic systems. Ensemble data sets combine spatio-temporal simulation results generated using multiple numerical models, sampled input conditions and perturbed parameters. While ensemble data sets are a powerful tool for mitigating uncertainty, they pose significant visualization and analysis challenges due to their complexity. We present a collection of overview and statistical displays linked through a high level of interactivity to provide a framework for gaining key scientific insight into the distribution of the simulation results as well as the uncertainty associated with the data. In contrast to methods that present large amounts of diverse information in a single display, we argue that combining multiple linked statistical displays yields a clearer presentation of the data and facilitates a greater level of visual data analysis. We demonstrate this approach using driving problems from climate modeling and meteorology and discuss generalizations to other fields.
NASA Astrophysics Data System (ADS)
Cabalín, L. M.; González, A.; Ruiz, J.; Laserna, J. J.
2010-08-01
Statistical uncertainty in the quantitative analysis of solid samples in motion by laser-induced breakdown spectroscopy (LIBS) has been assessed. For this purpose, a LIBS demonstrator was designed and constructed in our laboratory. The LIBS system consisted of a laboratory-scale conveyor belt, a compact optical module and a Nd:YAG laser operating at 532 nm. The speed of the conveyor belt was variable and could be adjusted up to a maximum of 2 m s⁻¹. Statistical uncertainty in the analytical measurements was estimated in terms of precision (reproducibility and repeatability) and accuracy. The results obtained by LIBS on shredded scrap samples under real conditions demonstrated that the analytical precision and accuracy of LIBS depend on the sample geometry, position on the conveyor belt and surface cleanliness. Flat, relatively clean scrap samples exhibited acceptable reproducibility and repeatability; by contrast, samples with an irregular shape or a dirty surface exhibited a poor relative standard deviation.
Clark, D Angus; Bowles, Ryan P
2018-04-23
In exploratory item factor analysis (IFA), researchers may use model fit statistics and commonly invoked fit thresholds to help determine the dimensionality of an assessment. However, these indices and thresholds may mislead as they were developed in a confirmatory framework for models with continuous, not categorical, indicators. The present study used Monte Carlo simulation methods to investigate the ability of popular model fit statistics (chi-square, root mean square error of approximation, the comparative fit index, and the Tucker-Lewis index) and their standard cutoff values to detect the optimal number of latent dimensions underlying sets of dichotomous items. Models were fit to data generated from three-factor population structures that varied in factor loading magnitude, factor intercorrelation magnitude, number of indicators, and whether cross loadings or minor factors were included. The effectiveness of the thresholds varied across fit statistics, and was conditional on many features of the underlying model. Together, results suggest that conventional fit thresholds offer questionable utility in the context of IFA.
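The fit statistics under study are simple functions of the model and baseline-model chi-square values. A minimal sketch of the standard formulas; the numerical inputs are illustrative, not the simulation's results:

```python
import math

def fit_indices(chi2, df, chi2_base, df_base, n):
    """RMSEA, CFI and TLI from model and baseline chi-square values."""
    rmsea = math.sqrt(max(chi2 - df, 0) / (df * (n - 1)))
    cfi = 1 - max(chi2 - df, 0) / max(chi2_base - df_base, chi2 - df, 0)
    tli = ((chi2_base / df_base) - (chi2 / df)) / ((chi2_base / df_base) - 1)
    return rmsea, cfi, tli

# Illustrative values for a three-factor model on dichotomous items.
rmsea, cfi, tli = fit_indices(chi2=132.4, df=87, chi2_base=2150.0,
                              df_base=105, n=500)
print(f"RMSEA = {rmsea:.3f}, CFI = {cfi:.3f}, TLI = {tli:.3f}")
# Conventional cutoffs (e.g., RMSEA < .06, CFI/TLI > .95) are the
# thresholds whose utility the study calls into question for IFA.
```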
Daniel Goodman’s empirical approach to Bayesian statistics
Gerrodette, Tim; Ward, Eric; Taylor, Rebecca L.; Schwarz, Lisa K.; Eguchi, Tomoharu; Wade, Paul; Himes Boor, Gina
2016-01-01
Bayesian statistics, in contrast to classical statistics, uses probability to represent uncertainty about the state of knowledge. Bayesian statistics has often been associated with the idea that knowledge is subjective and that a probability distribution represents a personal degree of belief. Dr. Daniel Goodman considered this viewpoint problematic for issues of public policy. He sought to ground his Bayesian approach in data, and advocated the construction of a prior as an empirical histogram of “similar” cases. In this way, the posterior distribution that results from a Bayesian analysis combined comparable previous data with case-specific current data, using Bayes’ formula. Goodman championed such a data-based approach, but he acknowledged that it was difficult in practice. If based on a true representation of our knowledge and uncertainty, Goodman argued that risk assessment and decision-making could be an exact science, despite the uncertainties. In his view, Bayesian statistics is a critical component of this science because a Bayesian analysis produces the probabilities of future outcomes. Indeed, Goodman maintained that the Bayesian machinery, following the rules of conditional probability, offered the best legitimate inference from available data. We give an example of an informative prior in a recent study of Steller sea lion spatial use patterns in Alaska.
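Goodman's empirical-prior idea can be illustrated with a discretized Bayes update in which the prior is a histogram of rates from similar past cases. A minimal sketch; the past rates and case-specific data are invented for illustration:

```python
import numpy as np
from scipy.stats import binom

# Goodman-style empirical prior: a histogram of event rates from
# "similar" past cases.
past_rates = np.array([0.12, 0.18, 0.22, 0.15, 0.25, 0.20, 0.17, 0.21])
counts, edges = np.histogram(past_rates, bins=25, range=(0.0, 0.5))
centers = (edges[:-1] + edges[1:]) / 2
prior = counts.astype(float) + 1e-9   # avoid empty bins
prior /= prior.sum()

# Case-specific data: 7 events observed in 40 trials.
likelihood = binom.pmf(7, 40, centers)

posterior = prior * likelihood        # Bayes' formula ...
posterior /= posterior.sum()          # ... then renormalize over the grid
print(f"posterior mean rate = {(centers * posterior).sum():.3f}")
```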
Statistical quality control through overall vibration analysis
NASA Astrophysics Data System (ADS)
Carnero, M. a. Carmen; González-Palma, Rafael; Almorza, David; Mayorga, Pedro; López-Escobar, Carlos
2010-05-01
The present study introduces the concept of statistical quality control in automotive wheel bearing manufacturing processes. Defects in the products under analysis can have a direct influence on passengers' safety and comfort. At present, the use of vibration analysis on machine tools for quality control purposes is not very extensive in manufacturing facilities. Noise and vibration are common quality problems in bearings. These failure modes likely occur under certain operating conditions and do not require high vibration amplitudes, but relate to certain vibration frequencies. The vibration frequencies are affected by the type of surface problems (chattering) of ball races that are generated through grinding processes. The purpose of this paper is to identify grinding process variables that affect the quality of bearings by using statistical principles in the field of machine tools. In addition, the quality of the finished parts under different combinations of process variables is evaluated. This paper intends to establish the foundations for predicting the quality of the products through the analysis of self-induced vibrations during contact between the grinding wheel and the parts. To achieve this goal, the overall self-induced vibration readings under different combinations of process variables are analysed using statistical tools. The analysis of data and design of experiments follow a classical approach, considering all potential interactions between variables. The analysis of data is conducted through analysis of variance (ANOVA) for data sets that meet normality and homoscedasticity criteria. This paper utilizes different statistical tools to support the conclusions, such as chi-squared, Shapiro-Wilk, symmetry, kurtosis, Cochran, Bartlett, Hartley, and Kruskal-Wallis tests. The analysis presented is the starting point for extending the use of predictive techniques (vibration analysis) to quality control. This paper demonstrates the existence of predictive variables (high-frequency vibration displacements) that are sensitive to the process setup and the quality of the products obtained. Based on the results of this overall vibration analysis, a second paper will analyse self-induced vibration spectra in order to define limit vibration bands, checked every cycle or connected to permanent vibration-monitoring systems able to adjust the sensitive process variables identified by ANOVA once the vibration readings exceed established quality limits.
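The workflow described, ANOVA restricted to data sets that pass normality and homoscedasticity checks, can be sketched with scipy; the vibration readings below are synthetic and the setup means are assumptions:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
# Hypothetical overall-vibration readings (mm/s) for three grinding setups.
groups = [rng.normal(mu, 0.4, 30) for mu in (2.0, 2.1, 2.6)]

# Assumption checks: Shapiro-Wilk normality per group, Bartlett variances.
normal = all(stats.shapiro(g)[1] > 0.05 for g in groups)
equal_var = stats.bartlett(*groups)[1] > 0.05

if normal and equal_var:
    stat, p = stats.f_oneway(*groups)   # one-way ANOVA
else:
    stat, p = stats.kruskal(*groups)    # Kruskal-Wallis fallback
print(f"statistic = {stat:.2f}, p = {p:.4g}")
```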
Investigation of synchronization between musical beat and heartbeat with cardio-music synchrogram
NASA Astrophysics Data System (ADS)
Fukumoto, Makoto; Nomura, Shusaku; Sawai, Masahiro; Imai, Jun-Ichi; Nagashima, Tomomasa
To illuminate synchronization phenomena between heartbeat and music, the effects of sedative music with variable tempo on heart rates were investigated. In the experiment, nine subjects were exposed to sedative music whose tempo changed: the tempo gradually increases, decreases, or remains stable (hereafter these experimental conditions are referred to as the Up, Down, and Flat conditions). For the analysis of synchronization, we introduced our previously developed Cardio-Music Synchrogram, which was used to extract statistically significant synchronization periods between heartbeat and music. As a result, it was suggested that the sedative music in the Down condition induced synchronization more frequently than in the Flat and Up conditions.
Kyle, Richard G; Kukanova, Marina; Campbell, Malcolm; Wolfe, Ingrid; Powell, Peter; Callery, Peter
2011-03-01
To determine whether emergency hospital admission rates (EAR) for common paediatric conditions in Greater London are associated with measures of child well-being and deprivation. Retrospective analysis of hospital episode statistics and secondary analysis of the Index of Multiple Deprivation (IMD) 2007 and the Local Index of Child Well-Being (CWI) 2009. 31 Greater London primary care trusts (PCTs). EAR in PCTs for breathing difficulty, feverish illness and/or diarrhoea. 24,481 children under 15 years of age were discharged following emergency admission for breathing difficulty, feverish illness and/or diarrhoea during 2007/2008. The EAR for breathing difficulty was associated with the IMD (Spearman's rho 0.59, p<0.001) and with the IMD indicators of overcrowding (Spearman's rho 0.62, p<0.001), houses in poor condition (Spearman's rho 0.55, p=0.001), air quality (Spearman's rho 0.53, p=0.002) and homelessness (Spearman's rho 0.44, p=0.013), and with the CWI domains of housing (Spearman's rho 0.64, p<0.001), children in need (Spearman's rho 0.62, p<0.001), material (Spearman's rho 0.58, p=0.001) and environment (Spearman's rho 0.53, p=0.002). There were no statistically significant relationships between the IMD, IMD indicators or CWI domains and the EAR of children admitted for feverish illness or diarrhoea, or of children aged under 1 year for any condition. Housing and environmental factors are associated with children's demand for hospital admission for breathing difficulty. Some associations are stronger with the CWI than with the IMD. The CWI has the potential to identify priority PCTs for housing and environment interventions that could have specific public health benefits for respiratory conditions.
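The association measures reported are Spearman rank correlations across the 31 PCTs. A minimal sketch with scipy; the deprivation scores and admission rates below are synthetic:

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(5)
# Hypothetical PCT-level data: deprivation score and emergency admission
# rate (per 1000) for breathing difficulty, one value per 31 PCTs.
deprivation = rng.uniform(10, 45, 31)
ear = 5 + 0.3 * deprivation + rng.normal(0, 2, 31)

rho, p = spearmanr(deprivation, ear)
print(f"Spearman's rho = {rho:.2f}, p = {p:.4g}")
# Comparable in spirit to the reported rho = 0.59, p < 0.001 between
# the IMD and the EAR for breathing difficulty.
```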
Management system of occupational diseases in Korea: statistics, report and monitoring system.
Rhee, Kyung Yong; Choe, Seong Weon
2010-12-01
The management system for occupational diseases in Korea can be assessed from the perspective of a surveillance system. Workers' compensation insurance reports are used to produce official statistics on occupational diseases in Korea. National working conditions surveys are used to monitor the magnitude of work-related symptoms and signs in the labor force. A health examination program was introduced to detect occupational diseases through both selective and mass screening programs. The Working Environment Measurement Institution assesses workers' exposure to hazards in the workplace. The government requires employers to conduct health examinations and working-environment measurements through contracted private agencies, in accordance with the Occupational Safety and Health Act. It is hoped that these institutions will be able to effectively detect and monitor occupational diseases and workplace hazards. In this respect, the occupational disease management system in Korea is well designed, except for the national survey system. In the future, national surveys for the detection of hazards and ill-health outcomes in workers should be developed. The existing surveillance system for occupational disease can be improved by providing more refined information through statistical analysis of the surveillance data.
González-López, Antonio; Vera-Sánchez, Juan Antonio; Ruiz-Morales, Carmen
2016-05-01
This note studies the statistical relationships between color channels in radiochromic film readings with flatbed scanners. The same relationships are studied for noise. Finally, their implications for multichannel film dosimetry are discussed. Radiochromic films exposed to wedged fields of 6 MV energy were read in a flatbed scanner. The joint histograms of pairs of color channels were used to obtain the joint and conditional probability density functions between channels. Then, the conditional expectations and variances of one channel given another channel were obtained. Noise was extracted from film readings by means of a multiresolution analysis. Two different dose ranges were analyzed, the first one ranging from 112 to 473 cGy and the second one from 52 to 1290 cGy. For the smaller dose range, the conditional expectations of one channel given another channel can be approximated by linear functions, while the conditional variances are fairly constant. The slopes of the linear relationships between channels can be used to simplify the expression that estimates the dose by means of the multichannel method. The slopes of the linear relationships between each channel and the red one can also be interpreted as weights in the final contribution to dose estimation. However, for the larger dose range, the conditional expectations of one channel given another channel are no longer linear functions. Finally, noise in different channels was found to correlate weakly. Signals present in different channels of radiochromic film readings show a strong statistical dependence. By contrast, noise correlates weakly between channels. For the smaller dose range analyzed, the linear behavior of the conditional expectation of one channel given another channel can be used to simplify calculations in multichannel film dosimetry.
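The central quantity here, the conditional expectation of one channel given another estimated from a joint histogram, can be sketched in a few lines of numpy; the channel values below are synthetic, not film readings:

```python
import numpy as np

rng = np.random.default_rng(6)
# Hypothetical scan: red and green channel values (16-bit range),
# a strongly correlated signal plus independent noise.
red = rng.uniform(20000, 50000, 100000)
green = 0.6 * red + 8000 + rng.normal(0, 400, red.size)

# Joint histogram, then conditional expectation E[green | red].
hist, red_edges, green_edges = np.histogram2d(red, green, bins=64)
green_centers = (green_edges[:-1] + green_edges[1:]) / 2
p_joint = hist / hist.sum()
p_red = p_joint.sum(axis=1, keepdims=True)          # marginal of red
cond = p_joint / np.where(p_red > 0, p_red, 1)      # P(green | red bin)
e_green_given_red = cond @ green_centers            # E[green | red bin]

# Slope of the (approximately linear) relationship between channels.
red_centers = (red_edges[:-1] + red_edges[1:]) / 2
slope = np.polyfit(red_centers, e_green_given_red, 1)[0]
print(f"fitted slope = {slope:.3f}")   # ~0.6 for this synthetic example
```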
Postural Stability of Special Warfare Combatant-Craft Crewmen With Tactical Gear.
Morgan, Paul M; Williams, Valerie J; Sell, Timothy C
The US Naval Special Warfare's Special Warfare Combatant-Craft Crewmen (SWCC) operate on small, high-speed boats while wearing tactical gear (TG). The TG increases mission safety and success but may affect postural stability, potentially increasing the risk for musculoskeletal injury. Therefore, the purpose of this study was to examine the effects of TG on postural stability during the Sensory Organization Test (SOT). Eight SWCC performed the SOT on NeuroCom's Balance Manager with TG and with no tactical gear (NTG). The gear conditions were tested in randomized order. The SOT consisted of six different conditions that challenge the sensory systems responsible for postural stability. Each condition was performed for three trials, resulting in a total of 18 trials. Overall performance, each individual condition, and sensory system analyses (somatosensory, visual, vestibular, preference) were scored. Data were not normally distributed; therefore, Wilcoxon signed-rank tests were used to compare each variable (α = .05). No significant differences were found between the NTG and TG tests. The absence of statistically significant differences under the two gear conditions may be due to low statistical power or to insensitivity of the assessment. Also, the amount and distribution of the weight worn during the TG condition, and the SWCC's unstable occupational platform, may have contributed to the findings. The data from this sample will be used in future research to better understand how TG affects SWCC. The data show that the addition of the TG used in our study did not affect the postural stability of SWCC during the SOT. Although no statistically significant differences were observed, there are clinical reasons for continued study of the effect of increased load on postural stability, using more challenging conditions, greater surface perturbations, dynamic tasks, and heavier loads.
1998-01-01
Ferrography on High Performance Aircraft Engine Lubricating Oils. Allison M. Toms, Sharon O. Hem, Tim Yarborough, Joint Oil Analysis Program Technical... turbine engines by spectroscopy (AES and FT-IR) and direct reading and analytical ferrography. A statistical analysis of the data collected is... presented. Key Words: analytical ferrography; atomic emission spectroscopy; condition monitoring; direct reading ferrography; Fourier transform infrared
Probabilistic structural analysis methods and applications
NASA Technical Reports Server (NTRS)
Cruse, T. A.; Wu, Y.-T.; Dias, B.; Rajagopal, K. R.
1988-01-01
An advanced algorithm for simulating the probabilistic distribution of structural responses due to statistical uncertainties in loads, geometry, material properties, and boundary conditions is reported. The method effectively combines an advanced algorithm for calculating probability levels for multivariate problems (fast probability integration) together with a general-purpose finite-element code for stress, vibration, and buckling analysis. Application is made to a space propulsion system turbine blade for which the geometry and material properties are treated as random variables.
Remote Sensing/gis Integration for Site Planning and Resource Management
NASA Technical Reports Server (NTRS)
Fellows, J. D.
1982-01-01
The development of an interactive/batch gridded information system (an array of cells georeferenced to USGS quad sheets) and interfacing application programs (e.g., hydrologic models) is discussed. This system allows non-programmer users to request any data set(s) stored in the data base by inputting the boundary points of any arbitrary polygon (watershed, political zone). The data base information contained within this polygon can be used to produce maps and statistics and to define model parameters for the area. Present and proposed conditions for the area may be compared by inputting future usage (land cover, soils, slope, etc.). This system, known as the Hydrologic Analysis Program (HAP), is especially effective in the real-time analysis of the effect of proposed land cover changes on runoff hydrographs and in graphics/statistics resource inventories of arbitrary study areas/watersheds.
Foldnes, Njål; Olsson, Ulf Henning
2016-01-01
We present and investigate a simple way to generate nonnormal data using linear combinations of independent generator (IG) variables. The simulated data have prespecified univariate skewness and kurtosis and a given covariance matrix. In contrast to the widely used Vale-Maurelli (VM) transform, the obtained data are shown to have a non-Gaussian copula. We analytically obtain asymptotic robustness conditions for the IG distribution. We show empirically that popular test statistics in covariance analysis tend to reject true models more often under the IG transform than under the VM transform. This implies that overly optimistic evaluations of estimators and fit statistics in covariance structure analysis may be tempered by including the IG transform for nonnormal data generation. We provide an implementation of the IG transform in the R environment.
Six Guidelines for Interesting Research.
Gray, Kurt; Wegner, Daniel M
2013-09-01
There are many guides on proper psychology, but far fewer on interesting psychology. This article presents six guidelines for interesting research. The first three (Phenomena First; Be Surprising; Grandmothers, Not Scientists) suggest how to choose your research question; the last three (Be The Participant; Simple Statistics; Powerful Beginnings) suggest how to answer your research question and offer perspectives on experimental design, statistical analysis, and effective communication. These guidelines serve as reminders that replicability is necessary but not sufficient for compelling psychological science. Interesting research considers subjective experience; it listens to the music of the human condition.
Booth, Brian G; Keijsers, Noël L W; Sijbers, Jan; Huysmans, Toon
2018-05-03
Pedobarography produces large sets of plantar pressure samples that are routinely subsampled (e.g., using regions of interest) or aggregated (e.g., center-of-pressure trajectories, peak pressure images) in order to simplify statistical analysis and provide intuitive clinical measures. We hypothesize that these data reductions discard gait information that can be used to differentiate between groups or conditions. To test this hypothesis of information loss, we created an implementation of statistical parametric mapping (SPM) for dynamic plantar pressure datasets (i.e., plantar pressure videos). Our SPM software framework brings all plantar pressure videos into anatomical and temporal correspondence, then performs statistical tests at each sampling location in space and time. As a novel element, we introduce non-linear temporal registration into the framework in order to normalize for timing differences within the stance phase. We refer to our software framework as STAPP: spatiotemporal analysis of plantar pressure measurements. Using STAPP, we tested our hypothesis on plantar pressure videos from 33 healthy subjects walking at different speeds. As walking speed increased, STAPP was able to identify significant decreases in plantar pressure at mid-stance from the heel through the lateral forefoot. The extent of these plantar pressure decreases has not previously been observed using existing plantar pressure analysis techniques. We therefore conclude that the subsampling of plantar pressure videos, a step which led to the discarding of gait information in our study, can be avoided using STAPP.
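SPM's core step is a mass-univariate test at every registered spatiotemporal sample. A minimal sketch using paired t-tests with a crude Bonferroni threshold standing in for SPM's random-field-theory correction; the pressure videos are synthetic and the dimensions are assumptions:

```python
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(7)
# Hypothetical registered pressure videos: 33 subjects x 2 walking
# speeds, each 50 frames x 32 x 16 pixels (stance-phase normalized).
slow = rng.normal(100, 10, size=(33, 50, 32, 16))
fast = slow - rng.normal(3, 1, size=(33, 50, 32, 16))  # lower mid-stance load

# Mass-univariate test: paired t-test at every (frame, pixel) location.
t, p = ttest_rel(fast, slow, axis=0)
alpha = 0.05 / p.size          # crude Bonferroni stand-in for SPM's
significant = p < alpha        # random-field-theory thresholding
print(f"{significant.sum()} of {p.size} spatiotemporal samples differ")
```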
Bayesian inference for joint modelling of longitudinal continuous, binary and ordinal events.
Li, Qiuju; Pan, Jianxin; Belcher, John
2016-12-01
In medical studies, repeated measurements of continuous, binary and ordinal outcomes are routinely collected from the same patient. Instead of modelling each outcome separately, in this study we propose to jointly model the trivariate longitudinal responses, so as to take account of the inherent association between the different outcomes and thus improve statistical inference. This work is motivated by a large cohort study in the North West of England involving trivariate responses from each patient: body mass index, depression (yes/no) ascertained with a cut-off score of at least 8 on the Hospital Anxiety and Depression Scale, and pain interference generated from the Medical Outcomes Study 36-item short-form health survey, with values returned on an ordinal scale of 1-5. There are well-established methods for combined continuous and binary, or even continuous and ordinal, responses, but little work has been done on the joint analysis of continuous, binary and ordinal responses. We propose conditional joint random-effects models, which take into account the inherent association between the continuous, binary and ordinal outcomes. Bayesian methods are used for statistical inference. Simulation studies show that, by jointly modelling the trivariate outcomes, the standard deviations of the parameter estimates are smaller and much more stable, leading to more efficient parameter estimates and reliable statistical inferences. In the real data analysis, the proposed joint analysis yields a much smaller deviance information criterion value than the separate analyses, and shows other good statistical properties too.
[The evaluation of costs: standards of medical care and clinical statistic groups].
Semenov, V Iu; Samorodskaia, I V
2014-01-01
This article presents a comparative analysis of techniques for evaluating the costs of hospital treatment using medical-economic standards of medical care and clinical-statistical groups. The technique of cost evaluation on the basis of clinical-statistical groups was developed almost fifty years ago and is widely applied in a number of countries. In Russia, payment per completed case of treatment on the basis of medical-economic standards is currently the main mode of payment for hospital care; it is only loosely a Russian analogue of the internationally prevalent system of diagnosis-related groups. Unlike clinical-statistical groups, the tariffs for these cases of treatment are calculated on the basis of the standards of medical care approved by the Russian Ministry of Health (Minzdrav), and information derived from generalization over the treatment of real patients is not used.
Dynamic heterogeneity and non-Gaussian statistics for acetylcholine receptors on live cell membrane
NASA Astrophysics Data System (ADS)
He, W.; Song, H.; Su, Y.; Geng, L.; Ackerson, B. J.; Peng, H. B.; Tong, P.
2016-05-01
The Brownian motion of molecules at thermal equilibrium usually has a finite correlation time and will eventually be randomized after a long delay time, so that their displacements follow Gaussian statistics. This is true even when the molecules have experienced a complex environment with a finite correlation time. Here, we report that the lateral motion of acetylcholine receptors on live muscle cell membranes does not follow the Gaussian statistics of normal Brownian diffusion. From a careful analysis of a large volume of protein trajectories obtained over a wide range of sampling rates and long durations, we find that the normalized histogram of protein displacements shows an exponential tail, which is robust and universal for cells under different conditions. The experiment indicates that the observed non-Gaussian statistics and dynamic heterogeneity are inherently linked to the slow, active remodelling of the underlying cortical actin network.
An astronomer's guide to period searching
NASA Astrophysics Data System (ADS)
Schwarzenberg-Czerny, A.
2003-03-01
We concentrate on the analysis of unevenly sampled time series, interrupted by periodic gaps, as often encountered in astronomy. While some of our conclusions may appear surprising, all are based on the classical statistical principles of Fisher and his successors. Except for the discussion of resolution issues, it is best for the reader to forget temporarily about Fourier transforms and to concentrate on the problem of fitting a time series with a model curve. According to their statistical content, we divide the issues into several sections: (ii) statistical and numerical aspects of model fitting; (iii) evaluation of fitted models as hypothesis testing; (iv) the role of orthogonal models in signal detection; (v) conditions for the equivalence of periodograms; and (vi) rating sensitivity by test power. An experienced observer working with individual objects would benefit little from a formalized statistical approach. However, we demonstrate the usefulness of this approach in evaluating the performance of periodograms and in the quantitative design of large variability surveys.
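For unevenly sampled data, period searching by least-squares fitting of a sinusoid at each trial frequency is exactly what the Lomb-Scargle periodogram evaluates. A minimal sketch with scipy; the light curve, gap pattern, and period below are synthetic:

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(8)
# Unevenly sampled light curve with periodic (daily) gaps, period = 0.7 d.
t = np.sort(rng.uniform(0, 40, 300))
t = t[(t % 1.0) < 0.4]                  # keep "night-time" samples only
y = 0.5 * np.sin(2 * np.pi * t / 0.7) + rng.normal(0, 0.2, t.size)

# Evaluate the sinusoid-fit quality over a grid of trial periods.
periods = np.linspace(0.3, 2.0, 5000)
omega = 2 * np.pi / periods             # angular trial frequencies
power = lombscargle(t, y - y.mean(), omega, normalize=True)
print(f"best period = {periods[np.argmax(power)]:.3f} d")
```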
Zhu, Zhaozhong; Anttila, Verneri; Smoller, Jordan W; Lee, Phil H
2018-01-01
Advances in recent genome-wide association studies (GWAS) suggest that pleiotropic effects on human complex traits are widespread. A number of classic and recent meta-analysis methods have been used to identify genetic loci with pleiotropic effects, but the overall performance of these methods is not well understood. In this work, we use extensive simulations and case studies of GWAS datasets to investigate the power and type-I error rates of ten meta-analysis methods. We specifically focus on three conditions commonly encountered in studies of multiple traits: (1) extensive heterogeneity of genetic effects; (2) characterization of trait-specific associations; and (3) inflated correlation of GWAS due to overlapping samples. Although statistical power is highly variable under distinct study conditions, we found that several methods have superior power under diverse heterogeneity. In particular, the classic fixed-effects model showed surprisingly good performance when a variant is associated with more than half of the study traits. As the number of traits with null effects increases, ASSET performed best, with competitive specificity and sensitivity. With opposite directional effects, CPASSOC featured first-rate power; however, caution is advised when using CPASSOC to study genetically correlated traits with overlapping samples. We conclude with a discussion of unresolved issues and directions for future research.
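The classic fixed-effects model highlighted here is inverse-variance-weighted pooling of per-trait effect estimates. A minimal sketch; the summary statistics below are invented for illustration:

```python
import numpy as np
from scipy.stats import norm

# Hypothetical per-trait GWAS summary statistics for one variant:
# effect sizes (beta) and standard errors from k = 6 trait GWAS.
beta = np.array([0.08, 0.10, 0.07, 0.12, 0.01, 0.09])
se = np.array([0.03, 0.04, 0.03, 0.05, 0.03, 0.04])

# Classic fixed-effects (inverse-variance weighted) meta-analysis.
w = 1 / se**2
beta_fe = np.sum(w * beta) / np.sum(w)
se_fe = np.sqrt(1 / np.sum(w))
z = beta_fe / se_fe
p = 2 * norm.sf(abs(z))
print(f"beta = {beta_fe:.3f}, z = {z:.2f}, p = {p:.2g}")
```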
Xie, Jiangan; Zhao, Lili; Zhou, Shangbo; He, Yongqun
2016-01-01
Vaccinations often induce various adverse events (AEs), and sometimes serious AEs (SAEs). While many vaccines are used in combination, the effects of vaccine-vaccine interactions (VVIs) on vaccine AEs are rarely studied. In this study, the AE profiles induced by hepatitis A vaccine (Havrix), hepatitis B vaccine (Engerix-B), and the hepatitis A and B combination vaccine (Twinrix) were studied using VAERS data. From May 2001 to January 2015, VAERS recorded 941, 3,885, and 1,624 AE case reports in which patients aged 18 years or older were vaccinated with only Havrix, Engerix-B, or Twinrix, respectively. Using these data, our statistical analysis identified 46, 69, and 82 AEs significantly associated with Havrix, Engerix-B, and Twinrix, respectively. Based on the Ontology of Adverse Events (OAE) hierarchical classification, these AEs were enriched in AEs related to behavioral and neurological conditions, the immune system, and investigation results. Twenty-nine AEs were classified as SAEs and were mainly related to immune conditions. Using a logistic regression model accompanied by MCMC sampling, 13 AEs (e.g., hepatosplenomegaly) were identified as resulting from VVI synergistic effects. Classification of these 13 AEs using the OAE and MedDRA hierarchies confirmed the advantages of the OAE-based method over MedDRA in hierarchical AE term analysis. PMID:27694888
Zhu, Li; Bharadwaj, Hari; Xia, Jing; Shinn-Cunningham, Barbara
2013-01-01
Two experiments, both presenting diotic, harmonic tone complexes (100 Hz fundamental), were conducted to explore the envelope-related component of the frequency-following response (FFRENV), a measure of synchronous, subcortical neural activity evoked by a periodic acoustic input. Experiment 1 directly compared two common analysis methods, computing the magnitude spectrum and the phase-locking value (PLV). Bootstrapping identified which FFRENV frequency components were statistically above the noise floor for each metric and quantified the statistical power of the approaches. Across listeners and conditions, the two methods produced highly correlated results. However, PLV analysis required fewer processing stages to produce readily interpretable results. Moreover, at the fundamental frequency of the input, PLVs were farther above the metric's noise floor than spectral magnitudes. Having established the advantages of PLV analysis, the efficacy of the approach was further demonstrated by investigating how different acoustic frequencies contribute to FFRENV, analyzing responses to complex tones composed of different acoustic harmonics of 100 Hz (Experiment 2). Results show that the FFRENV response is dominated by peripheral auditory channels responding to unresolved harmonics, although low-frequency channels driven by resolved harmonics also contribute. These results demonstrate the utility of the PLV for quantifying the strength of FFRENV across conditions. PMID:23862815
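The phase-locking value at a given frequency is the magnitude of the trial-averaged unit phasor of the response phase. A minimal sketch; the simulated FFR trials, sampling rate, and response amplitude are assumptions:

```python
import numpy as np

rng = np.random.default_rng(9)
fs, f0, n_trials, n_samp = 8000, 100, 200, 1600   # Hz, Hz, trials, samples

# Hypothetical FFR trials: a weak 100 Hz response buried in noise.
t = np.arange(n_samp) / fs
trials = 0.1 * np.sin(2 * np.pi * f0 * t) + rng.normal(0, 1, (n_trials, n_samp))

# Phase-locking value per frequency bin: magnitude of the mean unit
# phasor across trials (1 = perfect phase locking, ~0 = none).
spec = np.fft.rfft(trials, axis=1)
plv = np.abs(np.mean(spec / np.abs(spec), axis=0))
freqs = np.fft.rfftfreq(n_samp, 1 / fs)
print(f"PLV at {f0} Hz: {plv[np.argmin(abs(freqs - f0))]:.3f}")
```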
Statistical inference methods for sparse biological time series data.
Ndukum, Juliet; Fonseca, Luís L; Santos, Helena; Voit, Eberhard O; Datta, Susmita
2011-04-25
Comparing metabolic profiles under different biological perturbations has become a powerful approach to investigating the functioning of cells. The profiles can be taken as single snapshots of a system, but more information is gained if they are measured longitudinally over time. The results are short time series consisting of relatively sparse data that cannot be analyzed effectively with standard time series techniques, such as autocorrelation and frequency domain methods. In this work, we study longitudinal time series profiles of glucose consumption in the yeast Saccharomyces cerevisiae under different temperatures and preconditioning regimens, which we obtained with methods of in vivo nuclear magnetic resonance (NMR) spectroscopy. For the statistical analysis we first fit several nonlinear mixed-effects regression models to the longitudinal profiles and then used an ANOVA likelihood ratio method to test for significant differences between the profiles. The proposed methods are capable of distinguishing metabolic time trends resulting from different treatments and of associating significance levels with these differences. Among the several nonlinear mixed-effects regression models tested, a three-parameter logistic function represents the data with the highest accuracy. ANOVA and likelihood ratio tests suggest that there are significant differences between the glucose consumption rate profiles of cells that had been, or had not been, preconditioned by heat during growth. Furthermore, pairwise t-tests reveal significant differences in the longitudinal glucose consumption rate profiles between optimal conditions and heat stress, optimal and recovery conditions, and heat stress and recovery conditions (p-values < 0.0001). We have developed a nonlinear mixed-effects model that is appropriate for the analysis of sparse metabolic and physiological time profiles. The model permits sound statistical inference procedures, based on ANOVA likelihood ratio tests, for testing the significance of differences between short time course data under different biological perturbations.
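The best-fitting model reported is a three-parameter logistic curve. A minimal sketch of fitting it to one sparse profile with scipy's curve_fit; the time points and values are synthetic, and the mixed-effects layer is omitted:

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic3(t, a, k, t0):
    """Three-parameter logistic: asymptote a, rate k, midpoint t0."""
    return a / (1 + np.exp(-k * (t - t0)))

rng = np.random.default_rng(10)
# Hypothetical sparse profile: cumulative glucose consumed (mM) at a
# handful of time points (min), as in short in vivo NMR time courses.
t = np.array([0, 2, 4, 6, 8, 10, 12, 15, 20], dtype=float)
y = logistic3(t, 20, 0.5, 7) + rng.normal(0, 0.5, t.size)

params, cov = curve_fit(logistic3, t, y, p0=(15, 0.3, 8))
print("a = {:.1f}, k = {:.2f}, t0 = {:.1f}".format(*params))
# The study embeds this curve in a mixed-effects model and compares
# treatment groups with ANOVA likelihood ratio tests.
```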
NASA Astrophysics Data System (ADS)
Antón, M.; Román, R.; Sanchez-Lorenzo, A.; Calbó, J.; Vaquero, J. M.
2017-07-01
This study focuses on the analysis of the daily global solar radiation (GSR) reconstructed from sunshine duration measurements at Madrid (Spain) from 1887 to 1950. Additionally, cloud cover information recorded simultaneously by human observations for the study period was also analyzed and used to select cloud-free days. First, the day-to-day variability of reconstructed GSR data was evaluated, finding a strong relationship between GSR and cloudiness. The second step was to analyze the long-term evolution of the GSR data, which exhibited two clear trends with opposite sign: a marked negative trend of -36 kJ/m2 per year for the 1887-1915 period and a moderate positive trend of +13 kJ/m2 per year for the 1916-1950 period, both statistically significant at the 95% confidence level. Therefore, there is evidence of "early dimming" and "early brightening" periods in the reconstructed GSR data for all-sky conditions in Madrid from the late 19th to the mid-20th centuries. Unlike the long-term evolution of GSR data, cloud cover showed non-statistically significant trends for the two analyzed sub-periods, 1887-1915 and 1916-1950. Finally, GSR trends were analyzed exclusively under cloud-free conditions in summer by means of the determination of the clearness index for those days with all cloud cover observations equal to zero oktas. The long-term evolution of the clearness index was in accordance with the "early dimming" and "early brightening" periods, showing smaller but still statistically significant trends. This result points out that aerosol load variability could have had a non-negligible influence on the long-term evolution of GSR even as far back as the late 19th century.
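As a sketch of the kind of trend test described, ordinary least-squares regression of annual-mean GSR on year gives the slope and its significance. The series below is synthetic stand-in data, not the Madrid reconstruction.

```python
import numpy as np
from scipy.stats import linregress

# Hypothetical annual-mean GSR series (kJ/m^2) over the first sub-period.
years = np.arange(1887, 1916)
gsr = 10000 - 36.0 * (years - years[0]) \
      + np.random.default_rng(0).normal(0, 150, years.size)

res = linregress(years, gsr)
print(f"trend = {res.slope:.1f} kJ/m^2 per year, p = {res.pvalue:.3g}")
# A p-value below 0.05 corresponds to significance at the 95% confidence level.
```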
Gomes, Manuel; Hatfield, Laura; Normand, Sharon-Lise
2016-09-20
Meta-analysis of individual participant data (IPD) is increasingly utilised to improve the estimation of treatment effects, particularly among different participant subgroups. An important concern in IPD meta-analysis relates to partially or completely missing outcomes for some studies, a problem exacerbated when interest is on multiple discrete and continuous outcomes. When leveraging information from incomplete correlated outcomes across studies, the fully observed outcomes may provide important information about the incompleteness of the other outcomes. In this paper, we compare two models for handling incomplete continuous and binary outcomes in IPD meta-analysis: a joint hierarchical model and a sequence of full conditional mixed models. We illustrate how these approaches incorporate the correlation across the multiple outcomes and the between-study heterogeneity when addressing the missing data. Simulations characterise the performance of the methods across a range of scenarios which differ according to the proportion and type of missingness, strength of correlation between outcomes and the number of studies. The joint model provided confidence interval coverage consistently closer to nominal levels and lower mean squared error compared with the fully conditional approach across the scenarios considered. Methods are illustrated in a meta-analysis of randomised controlled trials comparing the effectiveness of implantable cardioverter-defibrillator devices alone to implantable cardioverter-defibrillator combined with cardiac resynchronisation therapy for treating patients with chronic heart failure. © 2016 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
Scaling Laws in Canopy Flows: A Wind-Tunnel Analysis
NASA Astrophysics Data System (ADS)
Segalini, Antonio; Fransson, Jens H. M.; Alfredsson, P. Henrik
2013-08-01
An analysis of velocity statistics and spectra measured above a wind-tunnel forest model is reported. Several measurement stations downstream of the forest edge have been investigated and it is observed that, while the mean velocity profile adjusts quickly to the new canopy boundary condition, the turbulence lags behind and shows a continuous penetration towards the free stream along the canopy model. The statistical profiles illustrate this growth and do not collapse when plotted as a function of the vertical coordinate. However, when the statistics are plotted as a function of the local mean velocity (normalized with a characteristic velocity scale), they do collapse, independently of the streamwise position and freestream velocity. A new scaling for the spectra of all three velocity components is proposed based on the velocity variance and integral time scale. This normalization improves the collapse of the spectra compared to existing scalings adopted in atmospheric measurements, and allows the determination of a universal function that provides the velocity spectrum. Furthermore, a comparison of the proposed scaling laws for two different canopy densities is shown, demonstrating that the vertical velocity variance is the statistical quantity most sensitive to the characteristics of the canopy roughness.
Fusco, Diana; Barnum, Timothy J.; Bruno, Andrew E.; Luft, Joseph R.; Snell, Edward H.; Mukherjee, Sayan; Charbonneau, Patrick
2014-01-01
X-ray crystallography is the predominant method for obtaining atomic-scale information about biological macromolecules. Despite the success of the technique, obtaining well diffracting crystals still critically limits going from protein to structure. In practice, the crystallization process proceeds through knowledge-informed empiricism. Better physico-chemical understanding remains elusive because of the large number of variables involved, hence little guidance is available to systematically identify solution conditions that promote crystallization. To help determine relationships between macromolecular properties and their crystallization propensity, we have trained statistical models on samples for 182 proteins supplied by the Northeast Structural Genomics consortium. Gaussian processes, which capture trends beyond the reach of linear statistical models, distinguish between two main physico-chemical mechanisms driving crystallization. One is characterized by low levels of side chain entropy and has been extensively reported in the literature. The other identifies specific electrostatic interactions not previously described in the crystallization context. Because evidence for two distinct mechanisms can be gleaned both from crystal contacts and from solution conditions leading to successful crystallization, the model offers future avenues for optimizing crystallization screens based on partial structural information. The availability of crystallization data coupled with structural outcomes analyzed through state-of-the-art statistical models may thus guide macromolecular crystallization toward a more rational basis. PMID:24988076
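A minimal sketch of the modeling idea -- a Gaussian-process classifier relating physico-chemical features to crystallization outcome -- is given below with scikit-learn. The two features and the synthetic labels are placeholders; the actual study used a richer feature set and its own model selection.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
# Hypothetical features: e.g. mean side-chain entropy and a net-charge proxy.
X = rng.normal(size=(182, 2))
# Synthetic nonlinear outcome standing in for crystallized / not crystallized.
y = (X[:, 0] + 0.5 * X[:, 1] ** 2 + rng.normal(0, 0.5, 182)) > 0

# The GP kernel captures trends beyond the reach of linear models.
gpc = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=1.0))
print(cross_val_score(gpc, X, y.astype(int), cv=5).mean())
```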
NASA Astrophysics Data System (ADS)
Shan, X.; Zhang, K.; Zhuang, Y.; Fu, R.; Hong, Y.
2017-12-01
Seasonal prediction of rainfall during the dry-to-wet transition season in austral spring (September-November) over southern Amazonia is central to improving crop planting and fire mitigation in that region. Previous studies have identified the key large-scale atmospheric dynamic and thermodynamic pre-conditions during the dry season (June-August) that influence the rainfall anomalies during the dry-to-wet transition season over Southern Amazonia. Based on these key pre-conditions during the dry season, we have evaluated several statistical models and developed a Neural Network based statistical prediction system to predict rainfall during the dry-to-wet transition for Southern Amazonia (5-15°S, 50-70°W). Multivariate Empirical Orthogonal Function (EOF) Analysis is applied to the following four fields during JJA from the ECMWF Reanalysis (ERA-Interim) spanning from year 1979 to 2015: geopotential height at 200 hPa, surface relative humidity, convective inhibition energy (CIN) index and convective available potential energy (CAPE), to filter out noise and highlight the most coherent spatial and temporal variations. The first 10 EOF modes are retained for inputs to the statistical models, accounting for at least 70% of the total variance in the predictor fields. We have tested several linear and non-linear statistical methods. While the regularized Ridge Regression and Lasso Regression can generally capture the spatial pattern and magnitude of rainfall anomalies, we found that the Neural Network performs best, with an accuracy greater than 80%, as expected from the non-linear dependence of the rainfall on the large-scale atmospheric thermodynamic conditions and circulation. Further tests of various prediction skill metrics and hindcasts also suggest that this Neural Network prediction approach can significantly improve seasonal prediction skill relative to dynamical predictions and regression-based statistical predictions. Thus, this statistical prediction system shows potential to improve real-time seasonal rainfall predictions in the future.
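Since EOF truncation is equivalent to principal component analysis, the described pipeline can be sketched with scikit-learn as below. The array shapes, network size, and hindcast split are illustrative assumptions, not the study's configuration.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Hypothetical JJA predictor fields (years x grid points, fields concatenated).
X = rng.normal(size=(37, 500))          # 1979-2015
y = rng.normal(size=37)                 # SON rainfall anomaly index

# EOF truncation is PCA; the first 10 components stand in for the retained modes.
model = make_pipeline(
    StandardScaler(),
    PCA(n_components=10),
    MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0))
model.fit(X[:-5], y[:-5])               # hold out the last 5 years as a hindcast check
print(model.score(X[-5:], y[-5:]))
```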
A new statistical PCA-ICA algorithm for location of R-peaks in ECG.
Chawla, M P S; Verma, H K; Kumar, Vinod
2008-09-16
The success of ICA in separating the independent components from the mixture depends on the properties of the electrocardiogram (ECG) recordings. This paper discusses some of the conditions of independent component analysis (ICA) that could affect the reliability of the separation, and evaluates issues related to the properties of the signals and the number of sources. Principal component analysis (PCA) scatter plots are plotted to indicate the diagnostic features in the presence and absence of base-line wander in interpreting the ECG signals. In this analysis, a newly developed statistical algorithm by the authors, based on the use of combined PCA-ICA for two correlated channels of 12-channel ECG data, is proposed. The ICA technique has been successfully implemented in the identification and removal of noise and artifacts from ECG signals. Cleaned ECG signals are obtained using statistical measures like kurtosis and variance of variance after ICA processing. This paper also deals with the detection of QRS complexes in electrocardiograms using the combined PCA-ICA algorithm. The efficacy of the combined PCA-ICA algorithm lies in the fact that the location of the R-peaks is bounded from above and below by the location of the cross-over points, hence none of the peaks are ignored or missed.
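A rough sketch of a combined PCA-ICA step on two correlated ECG leads, with kurtosis used to select the QRS-dominant source, might look like this; the paper's exact algorithm (including its variance-of-variance measure and cross-over-point logic) is not reproduced.

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA
from scipy.stats import kurtosis

def pca_ica_qrs_source(two_channel_ecg):
    """two_channel_ecg: (n_samples, 2) array of correlated ECG leads.

    PCA whitens and decorrelates the channels; ICA then seeks independent
    sources in the whitened space.
    """
    whitened = PCA(whiten=True).fit_transform(two_channel_ecg)
    sources = FastICA(n_components=2, random_state=0).fit_transform(whitened)
    # The QRS-bearing source is typically the most super-Gaussian
    # (highest kurtosis), since R-peaks are sparse, large deflections.
    k = kurtosis(sources, axis=0)
    return sources[:, np.argmax(k)]
```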
NASA Astrophysics Data System (ADS)
Koshigai, Masaru; Marui, Atsunao
The water table provides important information for the evaluation of groundwater resources. Recently, estimation of the water table over wide areas has been required for effective evaluation of groundwater resources. However, the evaluation process is met with difficulties due to technical and economic constraints. Regression analysis for the prediction of groundwater levels based on geomorphologic and geologic conditions is considered a reliable tool for estimating the water table over wide areas. Data on groundwater levels were extracted from the public database of geotechnical information. It was observed that changes in groundwater level depend on climate conditions. It was also confirmed that groundwater levels vary according to geomorphologic and geologic conditions. The objective variable of the regression analysis was groundwater level, and the explanatory variables were elevation and a dummy variable consisting of the group number. The constructed regression formula was significant according to the determination coefficients and analysis of variance. Therefore, by combining the regression formula with a mesh map, a statistical method to estimate the water table based on geomorphologic and geologic conditions for the whole country could be established.
Time-Frequency Analysis of Rocket Nozzle Wall Pressures During Start-up Transients
NASA Technical Reports Server (NTRS)
Baars, Woutijn J.; Tinney, Charles E.; Ruf, Joseph H.
2011-01-01
Surveys of the fluctuating wall pressure were conducted on a sub-scale, thrust-optimized parabolic nozzle in order to develop a physical intuition for its Fourier-azimuthal mode behavior during fixed and transient start-up conditions. These unsteady signatures are driven by shock wave turbulent boundary layer interactions which depend on the nozzle pressure ratio and nozzle geometry. The focus, however, is on the degree of similarity between the spectral footprints of these modes obtained from transient start-ups as opposed to a sequence of fixed nozzle pressure ratio conditions. For the latter, statistically converged spectra are computed using conventional Fourier analysis techniques, whereas the former are investigated by way of time-frequency analysis. The findings suggest that at low nozzle pressure ratios -- where the flow resides in a Free Shock Separation state -- strong spectral similarities occur between fixed and transient conditions. Conversely, at higher nozzle pressure ratios -- where the flow resides in Restricted Shock Separation -- stark differences are observed between the fixed and transient conditions, which depend greatly on the ramping rate of the transient period. And so, it appears that an understanding of the dynamics during transient start-up conditions cannot be furnished by way of fixed-flow analysis.
Time-dynamics of the two-color emission from vertical-external-cavity surface-emitting lasers
NASA Astrophysics Data System (ADS)
Chernikov, A.; Wichmann, M.; Shakfa, M. K.; Scheller, M.; Moloney, J. V.; Koch, S. W.; Koch, M.
2012-01-01
The temporal stability of a two-color vertical-external-cavity surface-emitting laser is studied using single-shot streak-camera measurements. The collected data is evaluated via quantitative statistical analysis schemes. Dynamically stable and unstable regions for the two-color operation are identified and the dependence on the pump conditions is analyzed.
ERIC Educational Resources Information Center
Roberge, Pasquale; Marchand, Andre; Reinharz, Daniel; Savard, Pierre
2008-01-01
A randomized, controlled trial was conducted to examine the cost-effectiveness of cognitive-behavioral treatment (CBT) for panic disorder with agoraphobia. A total of 100 participants were randomly assigned to standard (n = 33), group (n = 35), and brief (n = 32) treatment conditions. Results show significant clinical and statistical improvement…
Evaluation of Differential DependencY (EDDY) is a statistical test for the differential dependency relationship of a set of genes between two given conditions. For each condition, possible dependency network structures are enumerated and their likelihoods are computed to represent a probability distribution of dependency networks. The difference between the probability distributions of dependency networks is computed between conditions, and its statistical significance is evaluated with random permutations of condition labels on the samples.
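The permutation step at the heart of such a test can be sketched generically as below; `statistic` stands in for EDDY's divergence between condition-specific network distributions, which is not implemented here.

```python
import numpy as np

def permutation_pvalue(data, labels, statistic, n_perm=10000, seed=None):
    """Generic permutation test: shuffle condition labels to build the null.

    statistic(data, labels) -> scalar divergence between the two conditions,
    e.g. a distance between condition-specific dependency-network distributions.
    """
    rng = np.random.default_rng(seed)
    observed = statistic(data, labels)
    null = np.empty(n_perm)
    for i in range(n_perm):
        null[i] = statistic(data, rng.permutation(labels))
    # Add-one correction keeps the p-value strictly positive.
    return (np.sum(null >= observed) + 1) / (n_perm + 1)
```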
Capturing rogue waves by multi-point statistics
NASA Astrophysics Data System (ADS)
Hadjihosseini, A.; Wächter, Matthias; Hoffmann, N. P.; Peinke, J.
2016-01-01
As an example of a complex system with extreme events, we investigate ocean wave states exhibiting rogue waves. We present a statistical method of data analysis based on multi-point statistics which for the first time allows the grasping of extreme rogue wave events in a highly satisfactory statistical manner. The key to the success of the approach is mapping the complexity of multi-point data onto the statistics of hierarchically ordered height increments for different time scales, for which we can show that a stochastic cascade process with Markov properties is governed by a Fokker-Planck equation. Conditional probabilities as well as the Fokker-Planck equation itself can be estimated directly from the available observational data. With this stochastic description surrogate data sets can in turn be generated, which makes it possible to work out arbitrary statistical features of the complex sea state in general, and extreme rogue wave events in particular. The results also open up new perspectives for forecasting the occurrence probability of extreme rogue wave events, and even for forecasting the occurrence of individual rogue waves based on precursory dynamics.
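Estimating the Fokker-Planck description directly from data typically proceeds through binned conditional moments of the increments (Kramers-Moyal coefficients). A single-scale sketch is shown below; the paper's analysis conditions on hierarchically ordered increments across time scales.

```python
import numpy as np

def km_coefficients(x, dt, n_bins=40, min_count=50):
    """Estimate drift D1(x) and diffusion D2(x) from conditional moments:
    D1(x) = <dx | x> / dt,  D2(x) = <dx^2 | x> / (2 dt)."""
    dx = np.diff(x)
    bins = np.linspace(x.min(), x.max(), n_bins + 1)
    idx = np.digitize(x[:-1], bins) - 1
    centers, d1, d2 = [], [], []
    for b in range(n_bins):
        m = idx == b
        if m.sum() < min_count:          # skip sparsely populated bins
            continue
        centers.append(0.5 * (bins[b] + bins[b + 1]))
        d1.append(dx[m].mean() / dt)
        d2.append((dx[m] ** 2).mean() / (2 * dt))
    return np.array(centers), np.array(d1), np.array(d2)
```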
Jager, Tjalling
2013-02-05
The individuals of a species are not equal. These differences frustrate experimental biologists and ecotoxicologists who wish to study the response of a species (in general) to a treatment. In the analysis of data, differences between model predictions and observations on individual animals are usually treated as random measurement error around the true response. These deviations, however, are mainly caused by real differences between the individuals (e.g., differences in physiology and in initial conditions). Understanding these intraspecies differences, and accounting for them in the data analysis, will improve our understanding of the response to the treatment we are investigating and allow for a more powerful, less biased, statistical analysis. Here, I explore a basic scheme for statistical inference to estimate parameters governing stress that allows individuals to differ in their basic physiology. This scheme is illustrated using a simple toxicokinetic-toxicodynamic model and a data set for growth of the springtail Folsomia candida exposed to cadmium in food. This article should be seen as proof of concept; a first step in bringing more realism into the statistical inference for process-based models in ecotoxicology.
Complexity quantification of dense array EEG using sample entropy analysis.
Ramanand, Pravitha; Nampoori, V P N; Sreenivasan, R
2004-09-01
In this paper, a time series complexity analysis of dense array electroencephalogram signals is carried out using the recently introduced Sample Entropy (SampEn) measure. This statistic quantifies the regularity in signals recorded from systems that can vary from the purely deterministic to purely stochastic realm. The present analysis is conducted with an objective of gaining insight into complexity variations related to changing brain dynamics for EEG recorded from the three cases of passive, eyes closed condition, a mental arithmetic task and the same mental task carried out after a physical exertion task. It is observed that the statistic is a robust quantifier of complexity suited for short physiological signals such as the EEG and it points to the specific brain regions that exhibit lowered complexity during the mental task state as compared to a passive, relaxed state. In the case of mental tasks carried out before and after the performance of a physical exercise, the statistic can detect the variations brought in by the intermediate fatigue inducing exercise period. This enhances its utility in detecting subtle changes in the brain state that can find wider scope for applications in EEG based brain studies.
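A common O(n^2) variant of the Sample Entropy computation is sketched below; the tolerance is expressed as a fraction r of the signal's standard deviation, and self-matches are excluded following the usual convention.

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """SampEn(m, r) = -ln(A/B), where B counts template matches of length m
    and A those of length m+1, within tolerance r * std (Chebyshev distance)."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()

    def count_matches(mm):
        templates = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
        count = 0
        for i in range(len(templates) - 1):
            # Chebyshev distance of template i to all later templates.
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(dist <= tol)
        return count

    B = count_matches(m)
    A = count_matches(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf
```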
Mahler, Barbara J.
2008-01-01
The statistical analyses taken together indicate that the geochemistry at the freshwater-zone wells is more variable than that at the transition-zone wells. The geochemical variability at the freshwater-zone wells might result from dilution of ground water by meteoric water. This is indicated by relatively constant major ion molar ratios; a preponderance of positive correlations between SC, major ions, and trace elements; and a principal components analysis in which the major ions are strongly loaded on the first principal component. Much of the variability at three of the four transition-zone wells might result from the use of different laboratory analytical methods or reporting procedures during the period of sampling. This is reflected by a lack of correlation between SC and major ion concentrations at the transition-zone wells and by a principal components analysis in which the variability is fairly evenly distributed across several principal components. The statistical analyses further indicate that, although the transition-zone wells are less well connected to surficial hydrologic conditions than the freshwater-zone wells, there is some connection but the response time is longer.
NASA Technical Reports Server (NTRS)
Steele, Glen; Lansdowne, Chatwin; Zucha, Joan; Schlensinger, Adam
2013-01-01
The Soft Decision Analyzer (SDA) is an instrument that combines hardware, firmware, and software to perform real-time closed-loop end-to-end statistical analysis of single- or dual-channel serial digital RF communications systems operating in very low signal-to-noise conditions. As an innovation, the unique SDA capabilities allow it to perform analysis of situations where the receiving communication system slips bits due to low signal-to-noise conditions or experiences constellation rotations resulting in channel polarity inversions or channel assignment swaps. The SDA's closed-loop detection allows it to instrument a live system and correlate observations with frame, codeword, and packet losses, as well as Quality of Service (QoS) and Quality of Experience (QoE) events. The SDA's abilities are not confined to performing analysis in low signal-to-noise conditions. Its analysis provides in-depth insight into a communication system's receiver performance in a variety of operating conditions. The SDA incorporates two techniques for identifying slips. The first is an examination of the received data stream's relation to the transmitted data content, and the second is a direct examination of the receiver's recovered clock signals relative to a reference. Both techniques provide benefits in different ways and allow the communication engineer evaluating test results increased confidence and understanding of receiver performance. Direct examination of data contents is performed by two different data techniques, power correlation or a modified Massey correlation, and can be applied to soft decision data widths 1 to 12 bits wide over a correlation depth ranging from 16 to 512 samples. The SDA detects receiver bit slips within a 4-bit window and can handle systems with up to four quadrants (QPSK, SQPSK, and BPSK systems). The SDA continuously monitors correlation results to characterize slips and quadrant changes and is capable of performing analysis even when the receiver under test is subjected to conditions where its performance degrades to high error rates (30 percent or beyond). The design incorporates a number of features, such as watchdog triggers, that permit the SDA system to recover from large receiver upsets automatically and continue accumulating performance analysis unaided by operator intervention. This accommodates tests that can last on the order of days in order to gain statistical confidence in results and is also useful for capturing snapshots of rare events.
Analysis of North Atlantic tropical cyclone intensity change using data mining
NASA Astrophysics Data System (ADS)
Tang, Jiang
Tropical cyclones (TC), especially when their intensity reaches hurricane scale, can become a costly natural hazard. Accurate prediction of tropical cyclone intensity is very difficult because of inadequate observations of TC structures, poor understanding of physical processes, coarse model resolution, inaccurate initial conditions, etc. This study aims to tackle two factors that account for the underperformance of current TC intensity forecasts: (1) inadequate observations of TC structures, and (2) deficient understanding of the underlying physical processes governing TC intensification. To tackle the problem of inadequate observations of TC structures, efforts have been made to extract vertical and horizontal structural parameters of latent heat release from Tropical Rainfall Measuring Mission (TRMM) Precipitation Radar (PR) data products. A case study of Hurricane Isabel (2003) was conducted first to explore the feasibility of using the 3D TC structure information in predicting TC intensification. Afterwards, several structural parameters were extracted from 53 TRMM PR 2A25 observations of 25 North Atlantic TCs during the period of 1998 to 2003. A new generation of multi-correlation data mining algorithms (Apriori and its variations) was applied to find the roles of the latent heat release structure in TC intensification. The results showed that the buildup of TC energy is indicated by the height of the convective tower and the relatively low latent heat release at the core area and around the outer band. Adverse conditions which prevent TC intensification include the following: (1) the TC entering a higher-latitude area where the underlying sea is relatively cold, (2) the TC moving too fast to absorb the thermal energy from the underlying sea, or (3) strong energy loss at the outer band. When adverse conditions and favorable conditions reached equilibrium, tropical cyclone intensity would remain stable. The dataset from the Statistical Hurricane Intensity Prediction Scheme (SHIPS) covering the period of 1982-2003 and the Apriori-based association rule mining algorithm were used to study the associations of underlying geophysical characteristics with the intensity change of tropical cyclones. The data have been stratified into 6 TC categories from tropical depression to category 4 hurricanes based on their strength. The results showed that the persistence of intensity change in the past and the strength of vertical shear in the environment are the most prevalent factors for all of the 6 TC categories. Hyper-edge searching found 3 sets of parameters which showed strong mutual associations. Most of the parameters used in the SHIPS model have a consistent "I-W" relation over different TC categories, indicating a consistent function of those parameters in TC development. However, the "I-W" relations of the relative momentum flux and the meridional motion change from the tropical storm stage to the hurricane stage, indicating a change in the role of those two parameters in TC development. Because rapid intensification (RI) is a major source of errors when predicting hurricane intensity, the association rule mining algorithm was performed on RI versus non-RI tropical cyclone cases using the same SHIPS dataset. The results were compared with those from the traditional statistical analysis conducted by Kaplan and DeMaria (2003). The rapid intensification rule with 5 RI conditions proposed by the traditional statistical analysis was found by the association rule mining in this study as well.
However, further analysis showed that the 5 RI conditions can be replaced by another association rule using fewer conditions but with a higher RI probability (RIP). This means that the rule with all 5 constraints found by Kaplan and DeMaria is not optimal, and the association rule mining technique can find a rule with fewer constraints that fits more RI cases. Further analysis of the highest RIPs over different numbers of conditions demonstrated that the interactions among multiple factors are responsible for the RI process of TCs; however, the influence of additional factors saturates at a certain number. This study has shown successful data mining examples in studying tropical cyclone intensification using association rules. The higher RI probability with fewer conditions found by the association rule technique is significant. This work demonstrated that data mining techniques can be used as an efficient exploration method to generate hypotheses, and that statistical analysis should be performed to confirm the hypotheses, as is generally expected for data mining applications.
Evaluating statistical cloud schemes: What can we gain from ground-based remote sensing?
NASA Astrophysics Data System (ADS)
Grützun, V.; Quaas, J.; Morcrette, C. J.; Ament, F.
2013-09-01
Statistical cloud schemes with prognostic probability distribution functions have become more important in atmospheric modeling, especially since they are in principle scale adaptive and capture cloud physics in more detail. While in theory the schemes have a great potential, their accuracy is still questionable. High-resolution three-dimensional observational data of water vapor and cloud water, which could be used for testing them, are missing. We explore the potential of ground-based remote sensing such as lidar, microwave, and radar to evaluate prognostic distribution moments using the "perfect model approach." This means that we employ a high-resolution weather model as virtual reality and retrieve full three-dimensional atmospheric quantities and virtual ground-based observations. We then use statistics from the virtual observation to validate the modeled 3-D statistics. Since the data are entirely consistent, any discrepancy occurring is due to the method. Focusing on total water mixing ratio, we find that the mean ratio can be evaluated decently but that it strongly depends on the meteorological conditions as to whether the variance and skewness are reliable. Using some simple schematic description of different synoptic conditions, we show how statistics obtained from point or line measurements can be poor at representing the full three-dimensional distribution of water in the atmosphere. We argue that a careful analysis of measurement data and detailed knowledge of the meteorological situation is necessary to judge whether we can use the data for an evaluation of higher moments of the humidity distribution used by a statistical cloud scheme.
A Monte Carlo Simulation Study of the Reliability of Intraindividual Variability
Estabrook, Ryne; Grimm, Kevin J.; Bowles, Ryan P.
2012-01-01
Recent research has seen intraindividual variability (IIV) become a useful technique to incorporate trial-to-trial variability into many types of psychological studies. IIV as measured by individual standard deviations (ISDs) has shown unique predictive value for several types of positive and negative outcomes (Ram, Rabbit, Stollery, & Nesselroade, 2005). One unanswered question regarding measuring intraindividual variability is its reliability and the conditions under which optimal reliability is achieved. Monte Carlo simulation studies were conducted to determine the reliability of the ISD compared to the intraindividual mean. The results indicate that ISDs generally have poor reliability and are sensitive to insufficient measurement occasions, poor test reliability, and unfavorable amounts and distributions of variability in the population. Secondary analysis of psychological data shows that use of individual standard deviations in unfavorable conditions leads to a marked reduction in statistical power, although careful adherence to underlying statistical assumptions allows their use as a basic research tool. PMID:22268793
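The flavor of such a simulation can be sketched as below: parallel forms are generated for each simulated subject, and the reliability of the ISD is compared with that of the intraindividual mean. Sample sizes and population distributions are arbitrary placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)
n_subj, n_occ = 200, 20                  # hypothetical: 200 subjects, 20 trials each

true_mean = rng.normal(500, 50, n_subj)              # e.g. reaction-time means (ms)
true_sd = rng.gamma(shape=5, scale=10, size=n_subj)  # each subject's true IIV

def simulate_form():
    # One parallel form: n_occ occasions per subject from that subject's distribution.
    return rng.normal(true_mean[:, None], true_sd[:, None], (n_subj, n_occ))

a, b = simulate_form(), simulate_form()
rel_isd = np.corrcoef(a.std(axis=1, ddof=1), b.std(axis=1, ddof=1))[0, 1]
rel_mean = np.corrcoef(a.mean(axis=1), b.mean(axis=1))[0, 1]
# The ISD's parallel-forms reliability is typically lower than the mean's.
print(f"reliability: ISD {rel_isd:.2f} vs mean {rel_mean:.2f}")
```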
Rank and independence in contingency table
NASA Astrophysics Data System (ADS)
Tsumoto, Shusaku
2004-04-01
A contingency table summarizes the conditional frequencies of two attributes and shows how these two attributes are dependent on each other. Thus, this table is a fundamental tool for pattern discovery with conditional probabilities, such as rule discovery. In this paper, a contingency table is interpreted from the viewpoint of statistical independence and granular computing. The first important observation is that a contingency table compares two attributes with respect to the number of equivalence classes. For example, an n x n table compares two attributes with the same granularity, while an m x n (m >= n) table compares two attributes with different granularities. The second important observation is that matrix algebra is a key point of the analysis of this table. In particular, the rank, as a measure of the degree of independence, plays a very important role in evaluating statistical independence. Relations between rank and the degree of dependence are also investigated.
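The rank-based view can be illustrated in a few lines: a rank-1 table factorizes into its margins (exact independence), while higher rank signals dependence, which a chi-squared test quantifies. The table values below are hypothetical.

```python
import numpy as np
from scipy.stats import chi2_contingency

table = np.array([[30, 10,  5],
                  [12, 25,  8],
                  [ 6,  9, 20]])        # hypothetical 3 x 3 contingency table

# Rank 1 would mean exact statistical independence (the table is an outer
# product of its margins); full rank indicates strong dependence.
print("rank:", np.linalg.matrix_rank(table))

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.3g}")
```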
NASA Astrophysics Data System (ADS)
Mahmood, Ehab A.; Rana, Sohel; Hussin, Abdul Ghapor; Midi, Habshah
2016-06-01
The circular regression model may contain one or more data points which appear to be peculiar or inconsistent with the main part of the model. This may occur due to recording errors, sudden short events, sampling under abnormal conditions, etc. The existence of these data points, "outliers", in the data set causes many problems in the research results and conclusions. Therefore, we should identify them before applying statistical analysis. In this article, we aim to propose a statistic to identify outliers in both the response and explanatory variables of the simple circular regression model. Our proposed statistic is the robust circular distance RCDxy, and it is assessed by three robust measures: the proportion of detected outliers and the masking and swamping rates.
Wang, Jiang; Luo, Dongjiao; Sun, Aihua; Yan, Jie
2008-07-01
Lipoproteins LipL32 and LipL21 and transmembrane protein OMPL1 have been confirmed as the superficial genus-specific antigens of Leptospira interrogans, which can be used as antigens for developing a universal genetic engineering vaccine. In order to obtain high expression of an artificial fusion gene lipL32/1-lipL21-ompL1/2, we optimized prokaryotic expression conditions. We used response surface analysis based on the central composite design (CCD) to optimize the culture conditions for a new antigen protein expressed by recombinant Escherichia coli DE3. The culture conditions included initial pH, induction start time, post-induction time, isopropyl beta-D-thiogalactopyranoside (IPTG) concentration, and temperature. The maximal production of antigen protein was 37.78 mg/l. The optimal culture conditions for high recombinant fusion protein expression were determined: initial pH 7.9, induction start time 2.5 h, a post-induction time of 5.38 h, 0.20 mM IPTG, and a post-induction temperature of 31 degrees C. Response surface analysis based on the CCD increased the target production. This statistical method reduced the number of experiments required for optimization and enabled rapid identification and integration of the key culture condition parameters for optimizing recombinant protein expression.
Karami, Manoochehr; Khazaei, Salman
2017-12-06
Clinical decision making based on study results requires valid and correct data collection and analysis. However, there are some common methodological and statistical issues that may be overlooked by authors. In individually matched case-control designs, bias arises from performing an unconditional analysis instead of a conditional analysis. Using an unconditional logistic model for matched data imposes a large number of nuisance parameters, which may result in seriously biased estimates.
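For illustration, statsmodels provides a conditional logistic regression that conditions out the per-matched-set nuisance parameters; a sketch on synthetic 1:1 matched data follows. The variable names and effect sizes are invented.

```python
import numpy as np
import pandas as pd
from statsmodels.discrete.conditional_models import ConditionalLogit

rng = np.random.default_rng(0)
n_pairs = 100                       # hypothetical 1:1 matched case-control data
df = pd.DataFrame({
    "pair": np.repeat(np.arange(n_pairs), 2),
    "case": np.tile([1, 0], n_pairs),
})
# Cases are given a higher exposure probability than their matched controls.
df["exposure"] = rng.binomial(1, np.where(df["case"] == 1, 0.5, 0.3))

# Conditioning on the matched set removes the per-pair nuisance intercepts
# that an ordinary (unconditional) logistic regression would have to estimate.
fit = ConditionalLogit(df["case"], df[["exposure"]], groups=df["pair"]).fit()
print(fit.summary())
```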
Development of a funding, cost, and spending model for satellite projects
NASA Technical Reports Server (NTRS)
Johnson, Jesse P.
1989-01-01
The need for a predictive budget/funding model is obvious. The current models used by the Resource Analysis Office (RAO) are used to predict the total costs of satellite projects. An effort to extend the modeling capabilities from total budget analysis to total budget and budget outlays over time analysis was conducted. A statistically based and data-driven methodology was used to derive and develop the model. The budget data for the last 18 GSFC-sponsored satellite projects were analyzed and used to build a funding model which would describe the historical spending patterns. This raw data consisted of dollars spent in that specific year and their 1989 dollar equivalent. This data was converted to the standard format used by the RAO group and placed in a database. A simple statistical analysis was performed to calculate the gross statistics associated with project length and project cost and the conditional statistics on project length and project cost. The modeling approach used is derived from the theory of embedded statistics, which states that properly analyzed data will produce the underlying generating function. The process of funding large scale projects over extended periods of time is described by Life Cycle Cost Models (LCCM). The data was analyzed to find a model in the generic form of a LCCM. The model developed is based on a Weibull function whose parameters are found by both nonlinear optimization and nonlinear regression. In order to use this model it is necessary to transform the problem from a dollar/time space to a percentage of total budget/time space. This transformation is equivalent to moving to a probability space. By using the basic rules of probability, the validity of both the optimization and the regression steps is ensured. This statistically significant model is then integrated and inverted. The resulting output represents a project schedule which relates the amount of money spent to the percentage of project completion.
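A sketch of the Weibull spending-curve idea, fitting a Weibull CDF to the cumulative budget fraction over normalized project time, follows; the profile values and starting guesses are hypothetical, and the original model's optimization details are not reproduced.

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull_cdf(t, k, lam):
    # Fraction of total budget spent by normalized time t (0..1 over the project).
    return 1.0 - np.exp(-(t / lam) ** k)

# Hypothetical cumulative spending profile for one project, as fractions.
t = np.linspace(0.1, 1.0, 10)
spent = np.array([0.02, 0.07, 0.16, 0.30, 0.46, 0.62, 0.76, 0.87, 0.95, 1.00])

(k, lam), _ = curve_fit(weibull_cdf, t, spent, p0=(2.0, 0.5),
                        bounds=(0, np.inf))
print(f"shape k = {k:.2f}, scale = {lam:.2f}")
# Inverting the fitted CDF maps a target completion fraction back to a schedule date.
```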
Adaptation in Coding by Large Populations of Neurons in the Retina
NASA Astrophysics Data System (ADS)
Ioffe, Mark L.
A comprehensive theory of neural computation requires an understanding of the statistical properties of the neural population code. The focus of this work is the experimental study and theoretical analysis of the statistical properties of neural activity in the tiger salamander retina. This is an accessible yet complex system, for which we control the visual input and record from a substantial portion--greater than a half--of the ganglion cell population generating the spiking output. Our experiments probe adaptation of the retina to visual statistics: a central feature of sensory systems which have to adjust their limited dynamic range to a far larger space of possible inputs. In Chapter 1 we place our work in context with a brief overview of the relevant background. In Chapter 2 we describe the experimental methodology of recording from 100+ ganglion cells in the tiger salamander retina. In Chapter 3 we first present the measurements of adaptation of individual cells to changes in stimulation statistics and then investigate whether pairwise correlations in fluctuations of ganglion cell activity change across different stimulation conditions. We then transition to a study of the population-level probability distribution of the retinal response captured with maximum-entropy models. Convergence of the model inference is presented in Chapter 4. In Chapter 5 we first test the empirical presence of a phase transition in such models fitting the retinal response to different experimental conditions, and then proceed to develop other characterizations which are sensitive to complexity in the interaction matrix. This includes an analysis of the dynamics of sampling at finite temperature, which demonstrates a range of subtle attractor-like properties in the energy landscape. These are largely conserved when ambient illumination is varied 1000-fold, a result not necessarily apparent from the measured low-order statistics of the distribution. Our results form a consistent picture which is discussed at the end of Chapter 5. We conclude with a few future directions related to this thesis.
Statistical wind analysis for near-space applications
NASA Astrophysics Data System (ADS)
Roney, Jason A.
2007-09-01
Statistical wind models were developed based on the existing observational wind data for near-space altitudes between 60,000 and 100,000 ft (18-30 km) above ground level (AGL) at two locations, Akron, OH, USA, and White Sands, NM, USA. These two sites are envisioned as playing a crucial role in the first flights of high-altitude airships. The analysis shown in this paper has not been previously applied to this region of the stratosphere for such an application. Standard statistics were compiled for these data, such as mean, median, maximum wind speed, and standard deviation, and the data were modeled with Weibull distributions. These statistics indicated that, on a yearly average, there is a lull or a “knee” in the wind between 65,000 and 72,000 ft AGL (20-22 km). From the standard statistics, trends at both locations indicated substantial seasonal variation in the mean wind speed at these heights. The yearly and monthly statistical modeling indicated that Weibull distributions were a reasonable model for the data. Forecasts and hindcasts were done by using a Weibull model based on 2004 data and comparing the model with the 2003 and 2005 data. The 2004 distribution was also a reasonable model for these years. Lastly, the Weibull distribution and cumulative function were used to predict the 50%, 95%, and 99% winds, which are directly related to the expected power requirements of a near-space station-keeping airship. These values indicated that using only the standard deviation of the mean may underestimate the operational conditions.
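The percentile-wind calculation described can be sketched with SciPy's Weibull fit as below; the synthetic wind sample stands in for the observational data.

```python
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(0)
wind = rng.weibull(2.0, 5000) * 15.0     # hypothetical wind speeds (m/s) at one level

# Fit a two-parameter Weibull (location pinned at zero) and read off the
# percentile winds that drive station-keeping power requirements.
shape, loc, scale = weibull_min.fit(wind, floc=0)
for q in (0.50, 0.95, 0.99):
    speed = weibull_min.ppf(q, shape, loc=loc, scale=scale)
    print(f"{q:.0%} wind: {speed:.1f} m/s")
```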
Nonlinear analysis of pupillary dynamics.
Onorati, Francesco; Mainardi, Luca Tommaso; Sirca, Fabiola; Russo, Vincenzo; Barbieri, Riccardo
2016-02-01
Pupil size reflects autonomic response to different environmental and behavioral stimuli, and its dynamics have been linked to other autonomic correlates such as cardiac and respiratory rhythms. The aim of this study is to assess the nonlinear characteristics of pupil size of 25 normal subjects who participated in a psychophysiological experimental protocol with four experimental conditions, namely “baseline”, “anger”, “joy”, and “sadness”. Nonlinear measures, such as sample entropy, correlation dimension, and largest Lyapunov exponent, were computed on reconstructed signals of spontaneous fluctuations of pupil dilation. Nonparametric statistical tests were performed on surrogate data to verify that the nonlinear measures are an intrinsic characteristic of the signals. We then developed and applied a piecewise linear regression model to detrended fluctuation analysis (DFA). Two joinpoints and three scaling intervals were identified: slope α0, at slow time scales, represents a persistent nonstationary long-range correlation, whereas α1 and α2, at middle and fast time scales, respectively, represent long-range power-law correlations, similarly to DFA applied to heart rate variability signals. Of the computed complexity measures, α0 showed statistically significant differences among experimental conditions (p<0.001). Our results suggest that (a) pupil size at constant light condition is characterized by nonlinear dynamics, (b) three well-defined and distinct long-memory processes exist at different time scales, and (c) autonomic stimulation is partially reflected in nonlinear dynamics.
Koerner, Tess K.; Zhang, Yang
2017-01-01
Neurophysiological studies are often designed to examine relationships between measures from different testing conditions, time points, or analysis techniques within the same group of participants. Appropriate statistical techniques that can take into account repeated measures and multivariate predictor variables are integral and essential to successful data analysis and interpretation. This work implements and compares conventional Pearson correlations and linear mixed-effects (LME) regression models using data from two recently published auditory electrophysiology studies. For the specific research questions in both studies, the Pearson correlation test is inappropriate for determining strengths between the behavioral responses for speech-in-noise recognition and the multiple neurophysiological measures as the neural responses across listening conditions were simply treated as independent measures. In contrast, the LME models allow a systematic approach to incorporate both fixed-effect and random-effect terms to deal with the categorical grouping factor of listening conditions, between-subject baseline differences in the multiple measures, and the correlational structure among the predictor variables. Together, the comparative data demonstrate the advantages as well as the necessity to apply mixed-effects models to properly account for the built-in relationships among the multiple predictor variables, which has important implications for proper statistical modeling and interpretation of human behavior in terms of neural correlates and biomarkers. PMID:28264422
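A minimal contrast between the two approaches, using statsmodels' linear mixed-effects model with a random intercept per subject, might look as follows on synthetic data; variable names and effect sizes are invented for illustration.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_subj, conditions = 30, ["quiet", "low_noise", "high_noise"]
rows = []
for s in range(n_subj):
    base = rng.normal(0, 1)                    # between-subject baseline shift
    for c, shift in zip(conditions, (0.0, -0.5, -1.0)):
        neural = rng.normal(1.0 + shift, 0.3)  # hypothetical FFR-like measure
        behavior = 0.8 * neural + base + rng.normal(0, 0.2)
        rows.append({"subject": s, "condition": c,
                     "neural": neural, "behavior": behavior})
df = pd.DataFrame(rows)

# The random intercept per subject absorbs baseline differences that a pooled
# Pearson correlation across conditions would wrongly treat as independent.
lme = smf.mixedlm("behavior ~ neural + condition", df, groups=df["subject"]).fit()
print(lme.summary())
```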
Mohler, Rachel E; Dombek, Kenneth M; Hoggard, Jamin C; Pierce, Karisa M; Young, Elton T; Synovec, Robert E
2007-08-01
The first extensive study of yeast metabolite GC x GC-TOFMS data from cells grown under fermenting (R) and respiring (DR) conditions is reported. In this study, recently developed chemometric software for use with three-dimensional instrumentation data was implemented, using a statistically-based Fisher ratio method. The Fisher ratio method is fully automated and will rapidly reduce the data to pinpoint two-dimensional chromatographic peaks differentiating sample types while utilizing all the mass channels. The effect of lowering the Fisher ratio threshold on peak identification was studied. At the lowest threshold (just above the noise level), 73 metabolite peaks were identified, nearly three-fold greater than the number of previously reported metabolite peaks identified (26). In addition to the 73 identified metabolites, 81 unknown metabolites were also located. A Parallel Factor Analysis graphical user interface (PARAFAC GUI) was applied to selected mass channels to obtain a concentration ratio for each metabolite under the two growth conditions. Of the 73 known metabolites identified by the Fisher ratio method, 54 were statistically changing at the 95% confidence limit between the DR and R conditions according to the rigorous Student's t-test. PARAFAC determined the concentration ratio and provided a fully-deconvoluted (i.e. mathematically resolved) mass spectrum for each of the metabolites. The combination of the Fisher ratio method with the PARAFAC GUI provides high-throughput software for discovery-based metabolomics research, and is novel for GC x GC-TOFMS data due to the use of the entire data set in the analysis (640 MB x 70 runs, double precision floating point).
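The Fisher ratio computation itself is compact; a per-feature sketch for two classes of replicate runs is given below (degrees-of-freedom scaling omitted), with the thresholding step left to the user.

```python
import numpy as np

def fisher_ratios(class_a, class_b):
    """Per-feature Fisher ratio: between-class variance over within-class variance.

    class_a, class_b: (replicates, features) arrays, e.g. peak intensities
    under the two growth conditions.
    """
    ma, mb = class_a.mean(axis=0), class_b.mean(axis=0)
    grand = (ma + mb) / 2.0
    na, nb = len(class_a), len(class_b)
    between = na * (ma - grand) ** 2 + nb * (mb - grand) ** 2
    within = ((class_a - ma) ** 2).sum(axis=0) + ((class_b - mb) ** 2).sum(axis=0)
    return between / within

# Features whose ratio exceeds a noise-level threshold are flagged as
# candidate peaks differentiating the sample types.
```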
Exploring patient satisfaction predictors in relation to a theoretical model.
Grøndahl, Vigdis Abrahamsen; Hall-Lord, Marie Louise; Karlsson, Ingela; Appelgren, Jari; Wilde-Larsson, Bodil
2013-01-01
The aim is to describe patients' care quality perceptions and satisfaction and to explore potential patient satisfaction predictors (person-related conditions, external objective care conditions, and patients' perception of actual care received, "PR") in relation to a theoretical model. A cross-sectional design was used. Data were collected using one questionnaire combining questions from four instruments: Quality from patients' perspective; Sense of coherence; Big five personality trait; and Emotional stress reaction questionnaire (ESRQ), together with questions from previous research. In total, 528 patients (83.7 per cent response rate) from eight medical, three surgical and one medical/surgical ward in five Norwegian hospitals participated. Answers from 373 respondents with complete ESRQ questionnaires were analysed. Sequential multiple regression analysis with ESRQ as the dependent variable was run in three steps: person-related conditions, external objective care conditions, and PR (p < 0.05). Step 1 (person-related conditions) explained 51.7 per cent of the ESRQ variance. Step 2 (external objective care conditions) explained an additional 2.4 per cent. Step 3 (PR) gave no significant additional explanation (0.05 per cent). Steps 1 and 2 contributed statistical significance to the model. Patients rated both quality-of-care and satisfaction highly. The paper shows that the theoretical model using an emotion-oriented approach to assess patient satisfaction can explain 54 per cent of patient satisfaction in a statistically significant manner.
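Sequential (hierarchical) regression of this kind amounts to comparing R^2 across nested OLS models; a sketch on synthetic data follows, with predictor names loosely echoing the study's blocks but entirely invented.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 373
df = pd.DataFrame({
    "coherence": rng.normal(0, 1, n),     # hypothetical person-related predictors
    "neuroticism": rng.normal(0, 1, n),
    "ward_type": rng.integers(0, 2, n),   # hypothetical external care condition
    "pr": rng.normal(0, 1, n),            # perception of actual care received
})
df["esrq"] = (0.6 * df["coherence"] - 0.4 * df["neuroticism"]
              + 0.1 * df["ward_type"] + rng.normal(0, 0.6, n))

steps = ["esrq ~ coherence + neuroticism",
         "esrq ~ coherence + neuroticism + ward_type",
         "esrq ~ coherence + neuroticism + ward_type + pr"]
r2 = 0.0
for f in steps:
    fit = smf.ols(f, df).fit()
    print(f"{f}: R2 = {fit.rsquared:.3f} (+{fit.rsquared - r2:.3f})")
    r2 = fit.rsquared
```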
PROM and Labour Effects on Urinary Metabolome: A Pilot Study
Meloni, Alessandra; Palmas, Francesco; Mereu, Rossella; Deiana, Sara Francesca; Fais, Maria Francesca; Mussap, Michele; Ragusa, Antonio; Pintus, Roberta; Fanos, Vassilios; Melis, Gian Benedetto
2018-01-01
Since pathologies and complications occurring during pregnancy and/or during labour may cause adverse outcomes for both newborns and mothers, there is a growing interest in metabolomic applications on pregnancy investigation. In fact, metabolomics has proved to be an efficient strategy for the description of several perinatal conditions. In particular, this study focuses on premature rupture of membranes (PROM) in pregnancy at term. For this project, urine samples were collected at three different clinical conditions: out of labour before PROM occurrence (Ph1), out of labour with PROM (Ph2), and during labour with PROM (Ph3). GC-MS analysis, followed by univariate and multivariate statistical analysis, was able to discriminate among the different classes, highlighting the metabolites most involved in the discrimination. PMID:29511388
Algorithmic detectability threshold of the stochastic block model
NASA Astrophysics Data System (ADS)
Kawamoto, Tatsuro
2018-03-01
The assumption that the values of model parameters are known or correctly learned, i.e., the Nishimori condition, is one of the requirements for the detectability analysis of the stochastic block model in statistical inference. In practice, however, there is no example demonstrating that we can know the model parameters beforehand, and there is no guarantee that the model parameters can be learned accurately. In this study, we consider the expectation-maximization (EM) algorithm with belief propagation (BP) and derive its algorithmic detectability threshold. Our analysis is not restricted to the community structure but includes general modular structures. Because the algorithm cannot always learn the planted model parameters correctly, the algorithmic detectability threshold is qualitatively different from the one with the Nishimori condition.
NASA Astrophysics Data System (ADS)
Saez, Núria; Ruiz, Xavier; Pallarés, Jordi; Shevtsova, Valentina
2013-04-01
An accelerometric record from the IVIDIL experiment (ESA Columbus module) has been exhaustively studied. The analysis involved the determination of basic statistical properties such as the auto-correlation and the power spectrum (second-order statistical analyses). Also, taking into account the shape of the associated histograms, we address another important question, the non-Gaussian nature of the time series, using the bispectrum and the bicoherence of the signals. Extrapolating the above-mentioned results, a computational model of a high-temperature shear cell has been developed. A scalar indicator has been used to quantify the accuracy of the diffusion coefficient measurements in the case of binary mixtures involving photovoltaic silicon or liquid Al-Cu binary alloys. Three different initial arrangements have been considered: the so-called interdiffusion, the centred thick layer, and the lateral thick layer. Results allow us to conclude that, under the conditions of the present work, the diffusion coefficient is insensitive to the environmental conditions, that is to say, accelerometric disturbances and initial shear cell arrangement.
Pinto, Luís Fernando Batista; Tarouco, Jaime Urdapilleta; Pedrosa, Victor Breno; de Farias Jucá, Adriana; Leão, André Gustavo; Moita, Antonia Kécya França
2013-08-01
This study aimed to evaluate visual precocity, muscling, conformation, skeletal, and breed scores; live weights at birth, at 205, and at 550 days of age; and, in addition, rib eye area and fat thickness between the 12th and 13th ribs obtained by ultrasound. Those traits were evaluated in 1,645 Angus cattle kept in five feeding conditions as follows: supplemented or non-supplemented, grazing native pasture or grazing cultivated pasture, and feedlot. Descriptive statistics, Pearson's correlations, and principal component analysis were carried out. Gender and feeding conditions were fixed effects, while animal's age and mother's weight at weaning were the covariates analyzed. Gender and feeding conditions were highly significant for the studied traits, but visual scores were not influenced by gender. Animal's age and mother's weight at weaning influenced many traits and must be appropriately adjusted for in the statistical models. An important correlation between visual scores, live weights, and carcass traits obtained by ultrasound was found, which can be analyzed by a univariate procedure. However, the multivariate approach revealed some information that cannot be neglected in order to ensure a more detailed assessment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cullen, David A; Koestner, Roland; Kukreja, Ratan
Improved conditions for imaging and spectroscopic mapping of thin perfluorosulfonic acid (PFSA) ionomer layers in fuel cell electrodes by scanning transmission electron microscopy (STEM) have been investigated. These conditions are first identified on model systems of Nafion ionomer-coated nanostructured thin films and nanoporous Si. The optimized conditions are then applied in a quantitative study of the ionomer through-layer loading for two typical electrode catalyst coatings using electron energy loss and energy dispersive X-ray spectroscopy in the transmission electron microscope. The e-beam induced damage to the perfluorosulfonic acid (PFSA) ionomer is quantified by following the fluorine mass loss with electron exposure and is then mitigated by a few orders of magnitude using cryogenic specimen cooling and a higher incident electron voltage. Multivariate statistical analysis is also applied to the analysis of spectrum images for data denoising and unbiased separation of independent components related to the catalyst, ionomer, and support.
Significance levels for studies with correlated test statistics.
Shi, Jianxin; Levinson, Douglas F; Whittemore, Alice S
2008-07-01
When testing large numbers of null hypotheses, one needs to assess the evidence against the global null hypothesis that none of the hypotheses is false. Such evidence typically is based on the test statistic of the largest magnitude, whose statistical significance is evaluated by permuting the sample units to simulate its null distribution. Efron (2007) has noted that correlation among the test statistics can induce substantial interstudy variation in the shapes of their histograms, which may cause misleading tail counts. Here, we show that permutation-based estimates of the overall significance level also can be misleading when the test statistics are correlated. We propose that such estimates be conditioned on a simple measure of the spread of the observed histogram, and we provide a method for obtaining conditional significance levels. We justify this conditioning using the conditionality principle described by Cox and Hinkley (1974). Application of the method to gene expression data illustrates the circumstances when conditional significance levels are needed.
NASA Astrophysics Data System (ADS)
Newman, Brent D.; Havenor, Kay C.; Longmire, Patrick
2016-06-01
Analysis of groundwater chemistry can yield important insights about subsurface conditions and provide an alternative and complementary method for characterizing basin hydrogeology, especially in areas where hydraulic data are limited. More specifically, hydrochemical facies have been used for decades to help understand basin flow and transport, and a set of facies was developed for the Roswell Artesian Basin (RAB) in a semi-arid part of New Mexico, USA. The RAB is an important agricultural water source and an excellent example of a rechargeable artesian system. However, substantial uncertainties about the RAB hydrogeology and groundwater chemistry exist, so the basin presented a valuable opportunity to explore hydrochemical facies definition. A set of facies derived from fingerprint diagrams (a graphical approach) served as a basis for testing and for comparison to principal component, factor, and cluster analyses (statistical approaches). Geochemical data from over 300 RAB wells in the central basin were examined. The statistical testing of the fingerprint-diagram-based facies was useful for quantitatively evaluating differences between facies and for understanding potential controls on basin groundwater chemistry. This study suggests the presence of three hydrochemical facies in the shallower part of the RAB (mostly unconfined conditions) and three in the deeper artesian system of the RAB. These facies reflect significant spatial differences in chemistry in the basin that are associated with specific stratigraphic intervals as well as structural features. Substantial chemical variability across faults and within fault blocks was also observed.
Bayesian analysis of the kinetics of quantal transmitter secretion at the neuromuscular junction.
Saveliev, Anatoly; Khuzakhmetova, Venera; Samigullin, Dmitry; Skorinkin, Andrey; Kovyazina, Irina; Nikolsky, Eugeny; Bukharaeva, Ellya
2015-10-01
The timing of transmitter release from nerve endings is considered nowadays as one of the factors determining the plasticity and efficacy of synaptic transmission. In the neuromuscular junction, the moments of release of individual acetylcholine quanta are related to the synaptic delays of uniquantal endplate currents recorded under conditions of lowered extracellular calcium. Using Bayesian modelling, we performed a statistical analysis of synaptic delays in mouse neuromuscular junction with different patterns of rhythmic nerve stimulation and when the entry of calcium ions into the nerve terminal was modified. We have obtained a statistical model of the release timing which is represented as the summation of two independent statistical distributions. The first of these is the exponentially modified Gaussian distribution. The mixture of normal and exponential components in this distribution can be interpreted as a two-stage mechanism of early and late periods of phasic synchronous secretion. The parameters of this distribution depend on both the stimulation frequency of the motor nerve and the calcium ions' entry conditions. The second distribution was modelled as quasi-uniform, with parameters independent of nerve stimulation frequency and calcium entry. Two different probability density functions for the distribution of synaptic delays suggest at least two independent processes controlling the time course of secretion, one of them potentially involving two stages. The relative contribution of these processes to the total number of mediator quanta released depends differently on the motor nerve stimulation pattern and on calcium ion entry into nerve endings.
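As a rough illustration of the two-component delay model described above, the density below mixes an exponentially modified Gaussian with a uniform component standing in for the paper's quasi-uniform distribution. This is a hedged sketch: the parameter names are illustrative rather than the authors' notation, and scipy's exponnorm is assumed with shape parameter K = tau/sigma.

```python
import numpy as np
from scipy.stats import exponnorm, uniform

def delay_density(t, w, mu, sigma, tau, lo, hi):
    """Mixture density for synaptic delays: w * ex-Gaussian + (1 - w) * uniform.

    mu/sigma: Gaussian stage; tau: exponential stage; lo/hi: uniform support.
    All names are illustrative placeholders, not the paper's parameters.
    """
    exg = exponnorm.pdf(t, K=tau / sigma, loc=mu, scale=sigma)
    uni = uniform.pdf(t, loc=lo, scale=hi - lo)
    return w * exg + (1 - w) * uni
```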
ERIC Educational Resources Information Center
Poole, Sonja Martin
2011-01-01
Using data from the National Center for Educational Statistics, this article examines the relationship between strength of state accountability policy (i.e., external accountability) and internal accountability, defined as a school-level system in which collective behaviors and conditions exist that direct the attention and effort of the internal…
Millimeter wave propagation measurements using the ATS 5 satellite
NASA Technical Reports Server (NTRS)
Ippolito, L. J.
1972-01-01
The ATS 5 millimeter wave propagation experiment determines long- and short-term attenuation statistics of operational millimeter-wavelength earth-space links as functions of defined meteorological conditions. A preliminary analysis of results with the 15 GHz downlink and 32 GHz uplink frequency bands indicates that both bands exhibit excellent potential for use in reliable high-data-rate earth-space communications systems.
Evaluation program for secondary spacecraft cells: Cycle life test
NASA Technical Reports Server (NTRS)
Harkness, J. D.
1979-01-01
The service life and storage stability for several storage batteries were determined. The batteries included silver-zinc batteries, nickel-cadmium batteries, and silver-cadmium batteries. The cell performance characteristics and limitations are to be used by spacecraft power systems planners and designers. A statistical analysis of the life cycle prediction and cause of failure versus test conditions is presented.
Determination of the refractive index of dehydrated cells by means of digital holographic microscopy
NASA Astrophysics Data System (ADS)
Belashov, A. V.; Zhikhoreva, A. A.; Bespalov, V. G.; Vasyutinskii, O. S.; Zhilinskaya, N. T.; Novik, V. I.; Semenova, I. V.
2017-10-01
Spatial distributions of the integral refractive index in dehydrated cells of human oral cavity epithelium are obtained by means of digital holographic microscopy, and the mean refractive index of the cells is determined. A statistical analysis of the obtained data is carried out, and the absolute errors of the method are estimated for different experimental conditions.
Borsa, Paul A.; Liggett, Charles L.
1998-01-01
Objective: To assess the therapeutic effects of flexible magnets on pain perception, intramuscular swelling, range of motion, and muscular strength in individuals with a muscle microinjury. Design and Setting: This experiment was a single-blind, placebo study using a repeated-measures design. Subjects performed an intense exercise protocol to induce a muscle microinjury. After pretreatment measurements were recorded, subjects were randomly assigned to an experimental (magnet), placebo (imitation magnet), or control (no magnet) group. Posttreatment measurements were repeated at 24, 48, and 72 hours. Subjects: Forty-five healthy subjects participated in the study. Measurements: Subjects were measured repeatedly for pain perception, upper arm girth, range of motion, and static force production. Four separate univariate analyses of variance were used to test for statistically significant mean (±SD) differences between variables over time. Interaction effects were analyzed using Scheffé post hoc analysis. Results: Analysis of variance revealed no statistically significant (P > .05) mean differences between conditions for any dependent pretreatment or posttreatment measurement. No significant interaction effects were demonstrated between conditions and times. Conclusions: No significant therapeutic effects on pain control and muscular dysfunction were observed in subjects wearing flexible magnets. PMID:16558503
NASA Astrophysics Data System (ADS)
M, Vasu; Shivananda Nayaka, H.
2018-06-01
In this experimental work, a dry turning process carried out on EN47 spring steel with a coated tungsten carbide tool insert of 0.8 mm nose radius was optimized using statistical techniques. Experiments were conducted at three cutting speeds (625, 796 and 1250 rpm), three feed rates (0.046, 0.062 and 0.093 mm/rev) and three depths of cut (0.2, 0.3 and 0.4 mm), following a 3³ full factorial design (FFD) with three factors at three levels. Analysis of variance was used to identify the significant factors for each output response. The results reveal that feed rate is the most significant factor influencing cutting force, followed by depth of cut, with cutting speed having less significance. The optimum machining condition for cutting force was obtained from the statistical analysis, and tool wear measurements were performed at that optimum condition of Vc = 796 rpm, ap = 0.2 mm, f = 0.046 mm/rev. The minimum tool wear observed was 0.086 mm after 5 min of machining. Tool wear was analyzed with a confocal microscope, and it was observed that tool wear increases with increasing cutting time.
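For readers wanting to reproduce this kind of 3³ full-factorial screening, a minimal analysis-of-variance sketch in Python (statsmodels) might look as follows; the file and column names are placeholders, not the authors' data.

```python
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

# df holds one row per run of the 3^3 design: columns 'speed' (rpm),
# 'feed' (mm/rev), 'doc' (depth of cut, mm) and the measured 'force' (N).
# File and column names are hypothetical placeholders.
df = pd.read_csv("turning_runs.csv")

# Main-effects ANOVA with each factor treated as categorical (3 levels).
model = smf.ols("force ~ C(speed) + C(feed) + C(doc)", data=df).fit()
print(anova_lm(model, typ=2))  # F-tests indicate which factor dominates
```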
GIS-based bivariate statistical techniques for groundwater potential analysis (an example of Iran)
NASA Astrophysics Data System (ADS)
Haghizadeh, Ali; Moghaddam, Davoud Davoudi; Pourghasemi, Hamid Reza
2017-12-01
Groundwater potential analysis provides a better comprehension of the hydrological settings of different regions. This study shows the potency of two GIS-based, data-driven bivariate techniques, namely the statistical index (SI) and Dempster-Shafer theory (DST), for analyzing groundwater potential in the Broujerd region of Iran. The research was done using 11 groundwater conditioning factors and 496 spring positions. Based on the groundwater potential maps (GPMs) of the SI and DST methods, 24.22% and 23.74% of the study area is covered by the poor zone of groundwater potential, and 43.93% and 36.3% of the Broujerd region is covered by the good and very good potential zones, respectively. The validation of the outcomes showed that the areas under the curve (AUC) of the SI and DST techniques are 81.23% and 79.41%, respectively, which indicates that the SI method performs slightly better than the DST technique. Therefore, the SI and DST methods are advantageous for analyzing groundwater capacity and scrutinizing the complicated relation between groundwater occurrence and groundwater conditioning factors, and they permit investigation of both systemic and stochastic uncertainty. Overall, these techniques are very beneficial for groundwater potential analysis and can be practical for water-resource management experts.
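A minimal sketch of the statistical index (SI) computation for one class of a conditioning factor is given below. It uses the common log-density-ratio formulation of SI; the paper's exact variant may differ in detail.

```python
import numpy as np

def statistical_index(springs_in_class, cells_in_class,
                      springs_total, cells_total):
    """SI for one class of a conditioning factor: log of the ratio of the
    spring density inside the class to the spring density over the whole
    study area. Positive SI marks classes favorable to groundwater."""
    class_density = springs_in_class / cells_in_class
    area_density = springs_total / cells_total
    return np.log(class_density / area_density)
```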
On-Line Analysis of Physiologic and Neurobehavioral Variables During Long-Duration Space Missions
NASA Technical Reports Server (NTRS)
Brown, Emery N.
1999-01-01
The goal of this project is to develop reliable statistical algorithms for on-line analysis of physiologic and neurobehavioral variables monitored during long-duration space missions. Maintenance of physiologic and neurobehavioral homeostasis during long-duration space missions is crucial for ensuring optimal crew performance. If countermeasures are not applied, alterations in homeostasis will occur in nearly all physiologic systems. During such missions, data from most of these systems will be continually and/or continuously monitored. Therefore, if these data can be analyzed as they are acquired and the status of these systems can be continually assessed, then once alterations are detected, appropriate countermeasures can be applied to correct them. One of the most important physiologic systems in which to maintain homeostasis during long-duration missions is the circadian system. Detecting and treating alterations in circadian physiology during long-duration space missions requires the development of: 1) a ground-based protocol to assess the status of the circadian system under the light-dark environment in which crews in space will typically work; and 2) appropriate statistical methods to make this assessment. The protocol in Project 1 (Circadian Entrainment, Sleep-Wake Regulation and Neurobehavioral) will study human volunteers under the simulated light-dark environment of long-duration space missions. Therefore, we propose to develop statistical models to characterize, in near real time, circadian and neurobehavioral physiology under these conditions. The specific aims of this project are to test the hypotheses that: 1) dynamic statistical methods based on the Kronauer model of the human circadian system can be developed to estimate circadian phase, period, and amplitude from core-temperature data collected under the simulated light-dark conditions of long-duration space missions; 2) analytic formulae and numerical algorithms can be developed to compute the error in the estimates of circadian phase, period, and amplitude determined from the data in Specific Aim 1; 3) statistical models can reliably detect, in near real time (daily), significant alterations in the circadian physiology of individual subjects by analyzing the circadian and neurobehavioral data collected in Project 1; and 4) criteria can be developed using the Kronauer model and the recently developed Jewett model of cognitive performance and subjective alertness to define altered circadian and neurobehavioral physiology and to set conditions for immediate administration of countermeasures.
Revisiting photon-statistics effects on multiphoton ionization
NASA Astrophysics Data System (ADS)
Mouloudakis, G.; Lambropoulos, P.
2018-05-01
We present a detailed analysis of the effects of photon statistics on multiphoton ionization. Through a detailed study of the role of intermediate states, we evaluate the conditions under which the premise of nonresonant processes is valid. The limitations of its validity are manifested in the dependence of the process on the stochastic properties of the radiation and found to be quite sensitive to the intensity. The results are quantified through detailed calculations for coherent, chaotic, and squeezed vacuum radiation. Their significance in the context of recent developments in radiation sources such as the short-wavelength free-electron laser and squeezed vacuum radiation is also discussed.
Kurtosis Approach Nonlinear Blind Source Separation
NASA Technical Reports Server (NTRS)
Duong, Vu A.; Stubberud, Allen R.
2005-01-01
In this paper, we introduce a new algorithm for blind source signal separation for post-nonlinear mixtures. The mixtures are assumed to be linearly mixed from unknown sources first and then distorted by memoryless nonlinear functions. The nonlinear functions are assumed to be smooth and can be approximated by polynomials. Both the coefficients of the unknown mixing matrix and the coefficients of the approximating polynomials are estimated by the gradient descent method, conditional on higher-order statistical requirements. The results of simulation experiments presented in this paper demonstrate the validity and usefulness of our approach for nonlinear blind source signal separation. Keywords: Independent Component Analysis, Kurtosis, Higher-order statistics.
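As a rough illustration of the linear stage of such kurtosis-driven separation, the sketch below whitens the observations and runs gradient ascent on the absolute kurtosis of a unit-norm projection. It is a generic linear-ICA step under stated assumptions, not the paper's post-nonlinear algorithm with polynomial distortion estimates.

```python
import numpy as np

def whiten(X):
    """Center and whiten X (n_samples, n_mixtures) so its covariance is I."""
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    d, E = np.linalg.eigh(cov)
    return Xc @ E @ np.diag(d ** -0.5) @ E.T

def kurtosis_unit(X, w):
    """Excess kurtosis of the projection; unit variance after whitening."""
    y = X @ w
    return np.mean(y ** 4) - 3.0

def extract_source(X, lr=0.1, n_iter=500, rng=None):
    """Gradient ascent on |kurtosis| of a unit-norm projection of whitened X."""
    rng = np.random.default_rng(rng)
    w = rng.standard_normal(X.shape[1])
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        y = X @ w
        grad = 4 * (X * (y ** 3)[:, None]).mean(axis=0)  # d E[y^4] / dw
        w += lr * np.sign(kurtosis_unit(X, w)) * grad    # maximize |kurtosis|
        w /= np.linalg.norm(w)                           # unit-variance constraint
    return X @ w
```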
The clinical value of large neuroimaging data sets in Alzheimer's disease.
Toga, Arthur W
2012-02-01
Rapid advances in neuroimaging and cyberinfrastructure technologies have brought explosive growth in the Web-based warehousing, availability, and accessibility of imaging data on a variety of neurodegenerative and neuropsychiatric disorders and conditions. There has been a prolific development and emergence of complex computational infrastructures that serve as repositories of databases and provide critical functionalities such as sophisticated image analysis algorithm pipelines and powerful three-dimensional visualization and statistical tools. The statistical and operational advantages of collaborative, distributed team science in the form of multisite consortia push this approach in a diverse range of population-based investigations. Copyright © 2012 Elsevier Inc. All rights reserved.
System and method for statistically monitoring and analyzing sensed conditions
Pebay, Philippe P [Livermore, CA; Brandt, James M [Dublin, CA; Gentile, Ann C [Dublin, CA; Marzouk, Youssef M [Oakland, CA; Hale, Darrian J [San Jose, CA; Thompson, David C [Livermore, CA
2011-01-04
A system and method of monitoring and analyzing a plurality of attributes for an alarm condition is disclosed. The attributes are processed and/or unprocessed values of sensed conditions of a collection of a statistically significant number of statistically similar components subjected to varying environmental conditions. The attribute values are used to compute the normal behaviors of some of the attributes and also used to infer parameters of a set of models. Relative probabilities of some attribute values are then computed and used along with the set of models to determine whether an alarm condition is met. The alarm conditions are used to prevent or reduce the impact of impending failure.
System and method for statistically monitoring and analyzing sensed conditions
Pebay, Philippe P [Livermore, CA; Brandt, James M [Dublin, CA; Gentile, Ann C [Dublin, CA; Marzouk, Youssef M [Oakland, CA; Hale, Darrian J [San Jose, CA; Thompson, David C [Livermore, CA
2011-01-25
A system and method of monitoring and analyzing a plurality of attributes for an alarm condition is disclosed. The attributes are processed and/or unprocessed values of sensed conditions of a collection of a statistically significant number of statistically similar components subjected to varying environmental conditions. The attribute values are used to compute the normal behaviors of some of the attributes and also used to infer parameters of a set of models. Relative probabilities of some attribute values are then computed and used along with the set of models to determine whether an alarm condition is met. The alarm conditions are used to prevent or reduce the impact of impending failure.
System and method for statistically monitoring and analyzing sensed conditions
Pebay, Philippe P [Livermore, CA; Brandt, James M [Dublin, CA; Gentile, Ann C [Dublin, CA; Marzouk, Youssef M [Oakland, CA; Hale, Darrian J [San Jose, CA; Thompson, David C [Livermore, CA
2010-07-13
A system and method of monitoring and analyzing a plurality of attributes for an alarm condition is disclosed. The attributes are processed and/or unprocessed values of sensed conditions of a collection of a statistically significant number of statistically similar components subjected to varying environmental conditions. The attribute values are used to compute the normal behaviors of some of the attributes and also used to infer parameters of a set of models. Relative probabilities of some attribute values are then computed and used along with the set of models to determine whether an alarm condition is met. The alarm conditions are used to prevent or reduce the impact of impending failure.
Analysis of the statistical properties of pulses in atmospheric corona discharge
NASA Astrophysics Data System (ADS)
Aubrecht, L.; Koller, J.; Plocek, J.; Stanék, Z.
2000-03-01
The properties of the negative corona current pulses in a single point-to-plane configuration have been extensively studied by many investigators. The amplitude and the interval of these pulses are not generally constant and depend on many variables; the repetition rate and the amplitude of the pulses fluctuate in time. Since these fluctuations are subject to a certain probability distribution, statistical processing was used for the analysis of the pulse fluctuations. The behavior of the pulses has also been investigated in a multipoint geometry configuration. The dependence of the behavior of the corona pulses on the gap length, the material and shape of the point electrode, and the number and separation of electrodes (in the multiple-point mode) has been investigated as well; no detailed study of this case had been carried out before. Attention has also been devoted to the study of pulses on the points of living material (needles of coniferous trees). This contribution describes recent studies of the statistical properties of the pulses under various conditions.
NASA Astrophysics Data System (ADS)
Baiyegunhi, Christopher; Liu, Kuiwu; Gwavava, Oswald
2017-11-01
Grain size analysis is a vital sedimentological tool used to unravel the hydrodynamic conditions, mode of transportation, and deposition of detrital sediments. In this study, detailed grain-size analysis was carried out on thirty-five sandstone samples from the Ecca Group in the Eastern Cape Province of South Africa. Grain-size statistical parameters, bivariate analysis, linear discriminant functions, Passega diagrams and log-probability curves were used to reveal the depositional processes, sedimentation mechanisms, and hydrodynamic energy conditions, and to discriminate different depositional environments. The grain-size parameters show that most of the sandstones are very fine to fine grained, moderately well sorted, mostly near-symmetrical and mesokurtic in nature. The abundance of very fine to fine grained sandstones indicates the dominance of low-energy environments. The bivariate plots show that the samples are mostly grouped, except for the Prince Albert samples, which show a scattered trend due either to a mixture of two modes in equal proportion in bimodal sediments or to good sorting in unimodal sediments. The linear discriminant function analysis is dominantly indicative of turbidity current deposits under shallow marine environments for samples from the Prince Albert, Collingham and Ripon Formations, while the samples from the Fort Brown Formation are lacustrine or deltaic deposits. The C-M plots indicate that the sediments were deposited mainly by suspension and saltation, and by graded suspension. Visher diagrams show that saltation is the major process of transportation, followed by suspension.
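The graphic grain-size statistical parameters referred to above are commonly computed with the Folk and Ward (1957) percentile formulas; a minimal sketch, assuming phi-scale class values and weight percentages, is given below.

```python
import numpy as np

def folk_ward(phi, weights):
    """Folk & Ward (1957) graphic grain-size statistics from a phi-scale
    size distribution. phi: class values; weights: class weight fractions."""
    order = np.argsort(phi)
    phi, w = np.asarray(phi, float)[order], np.asarray(weights, float)[order]
    cum = np.cumsum(w) / np.sum(w) * 100.0          # cumulative percent
    p = lambda q: np.interp(q, cum, phi)            # graphic percentile phi_q
    p5, p16, p25, p50, p75, p84, p95 = map(p, (5, 16, 25, 50, 75, 84, 95))
    mean = (p16 + p50 + p84) / 3
    sorting = (p84 - p16) / 4 + (p95 - p5) / 6.6
    skew = ((p16 + p84 - 2 * p50) / (2 * (p84 - p16))
            + (p5 + p95 - 2 * p50) / (2 * (p95 - p5)))
    kurt = (p95 - p5) / (2.44 * (p75 - p25))
    return mean, sorting, skew, kurt
```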
Bae, Sangok; Shoda, Makoto
2005-04-05
Culture conditions in a jar fermentor for bacterial cellulose (BC) production from A. xylinum BPR2001 were optimized by statistical analysis using a Box-Behnken design. Response surface methodology was used to predict the levels of the factors: fructose (X1), corn steep liquor (CSL) (X2), dissolved oxygen (DO) (X3), and agar concentration (X4). A total of 27 experimental runs combining the factors were carried out in a 10-L jar fermentor, and a three-dimensional response surface was generated to determine the effect of each factor and to find its optimum concentration for maximum BC production and BC yield. The fructose and agar concentrations highly influenced BC production and BC yield, whereas the optimum conditions with respect to CSL and DO concentrations were predicted at almost the central values of the tested ranges. The predicted results showed that BC production was 14.3 g/L under conditions of 4.99% fructose, 2.85% CSL, 28.33% DO, and 0.38% agar. BC yield, on the other hand, was predicted as 0.34 g/g under conditions of 3.63% fructose, 2.90% CSL, 31.14% DO, and 0.42% agar. Under the optimized culture conditions, the improvements in BC production and BC yield were experimentally confirmed, with increases of 76% and 57%, respectively, compared to the values before optimization. Copyright (c) 2005 Wiley Periodicals, Inc.
Biomechanical analysis of tension band fixation for olecranon fracture treatment.
Kozin, S H; Berglund, L J; Cooney, W P; Morrey, B F; An, K N
1996-01-01
This study assessed the strength of various tension band fixation methods with wire and cable applied to simulated olecranon fractures to compare stability and potential failure or complications between the two. Transverse olecranon fractures were simulated by osteotomy. Each fracture was anatomically reduced, and various tension band fixation techniques were applied with monofilament wire or multifilament cable. Load-displacement curves were obtained with a materials testing machine, and statistical relevance was determined by analysis of variance. Two loading modes were tested: loading on the posterior surface of the olecranon to simulate triceps pull, and loading on the anterior olecranon tip to recreate a potential compressive load on the fragment during resistive flexion. All fixation methods were more resistant to posterior loading than to an anterior load. Individual comparative analysis for the various loading conditions showed that tension band fixation is more resilient to the tensile forces exerted by the triceps than to compressive forces on the anterior olecranon tip. Neither wire passage anterior to the K-wires nor the multifilament cable provided a statistically significant increase in stability.
Damage detection of engine bladed-disks using multivariate statistical analysis
NASA Astrophysics Data System (ADS)
Fang, X.; Tang, J.
2006-03-01
The timely detection of damage in aero-engine bladed-disks is an extremely important and challenging research topic. Bladed-disks have high modal density and, particularly, their vibration responses are subject to significant uncertainties due to manufacturing tolerance (blade-to-blade difference or mistuning), operating condition changes and sensor noise. In this study, we present a new methodology for the on-line damage detection of engine bladed-disks using their vibratory responses during spin-up or spin-down operations, which can be measured by the blade-tip-timing sensing technique. We apply a principal component analysis (PCA)-based approach for data compression, feature extraction, and denoising. The non-model-based damage detection is achieved by analyzing the change between the response features of the healthy structure and those of the damaged one. We facilitate this comparison by incorporating Hotelling's T² statistic, which yields damage declaration with a given confidence level. The effectiveness of the method is demonstrated by case studies.
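A compact illustration of the PCA-plus-Hotelling-T² monitoring idea is sketched below. It is a textbook version built on a healthy-baseline response matrix, not the authors' blade-tip-timing pipeline; all names are placeholders.

```python
import numpy as np

def hotelling_t2_scores(healthy, test, n_components=3):
    """Project test responses onto the principal components of healthy data
    and score each with Hotelling's T^2; large values flag possible damage.

    healthy: (n_baseline, n_features); test: (n_new, n_features)."""
    mu = healthy.mean(axis=0)
    U, s, Vt = np.linalg.svd(healthy - mu, full_matrices=False)
    V = Vt[:n_components].T                        # retained loading vectors
    var = (s[:n_components] ** 2) / (len(healthy) - 1)  # PC variances
    scores = (test - mu) @ V                       # PC scores of new data
    return np.sum(scores ** 2 / var, axis=1)       # T^2 per test response
```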
Analysis of health in health centers area in Depok using correspondence analysis and scan statistic
NASA Astrophysics Data System (ADS)
Basir, C.; Widyaningsih, Y.; Lestari, D.
2017-07-01
Hotspots indicate areas with a higher case intensity than others. In health problems, for example, the number of illness cases in a region can be used as a parameter that reflects the severity of conditions in that area; if such conditions are identified early, they can be addressed preventively. Many factors affect the severity level of an area. The health factors considered in this study are the numbers of infants with low birth weight, malnourished children under five years old, deaths of children under five, maternal deaths, births without the help of health personnel, infants without infant health care, and infants without basic immunization. The case counts are based on each public health center area in Depok. Correspondence analysis provides graphical information about the relationship between two nominal variables; it creates a plot based on row and column scores and shows strongly related categories at close distances. The scan statistic method is used to detect hotspots based on selected variables occurring in the study area, and correspondence analysis is used to picture the association between the regions and the variables. Using SaTScan software, the Sukatani health center is identified as a point hotspot, and correspondence analysis shows that the health centers and the seven variables have a very significant relationship, with the majority of health centers close to all variables except Cipayung, which is distantly related to the number of maternal deaths. These results can be used as input for government agencies to upgrade the health level in the area.
Management System of Occupational Diseases in Korea: Statistics, Report and Monitoring System
Choe, Seong Weon
2010-01-01
The management system of occupational diseases in Korea can be assessed from the perspective of a surveillance system. Workers' compensation insurance reports are used to produce official statistics on occupational diseases in Korea. National working conditions surveys are used to monitor the magnitude of work-related symptoms and signs in the labor force. A health examination program was introduced to detect occupational diseases through both selective and mass screening programs. The Working Environment Measurement Institution assesses workers' exposure to hazards in the workplace. Following the Occupational Safety and Health Act, the government requires employers to conduct health examinations and working-environment measurements through contracted private agencies. It is hoped that these institutions may be able to effectively detect and monitor occupational diseases and hazards in the workplace. In this respect, the occupational disease management system in Korea is well designed, except for the national survey system. In the future, national surveys for the detection of hazards and ill-health outcomes in workers should be developed. The existing surveillance system for occupational disease can be improved by providing more refined information through statistical analysis of surveillance data. PMID:21258584
NASA Astrophysics Data System (ADS)
Aldrin, John C.; Annis, Charles; Sabbagh, Harold A.; Lindgren, Eric A.
2016-02-01
A comprehensive approach to NDE and SHM characterization error (CE) evaluation is presented that follows the framework of the 'ahat-versus-a' regression analysis for POD assessment. Characterization capability evaluation is typically more complex than current POD evaluations and thus requires engineering and statistical expertise in the model-building process to ensure all key effects and interactions are addressed. Justifying the statistical model choice with its underlying assumptions is key. Several sizing case studies are presented with detailed evaluations of the most appropriate statistical model for each data set. A model-assisted approach is introduced to help assess the reliability of NDE and SHM characterization capability under a wide range of part, environmental and damage conditions. Best practices for using models are presented for both an eddy-current NDE sizing case study and a vibration-based SHM case study. The results of these studies highlight the general protocol feasibility, emphasize the importance of evaluating key application characteristics prior to the study, and demonstrate an approach to quantifying the role of varying SHM sensor durability and environmental conditions in characterization performance.
Variance approximations for assessments of classification accuracy
R. L. Czaplewski
1994-01-01
Variance approximations are derived for the weighted and unweighted kappa statistics, the conditional kappa statistic, and conditional probabilities. These statistics are useful to assess classification accuracy, such as accuracy of remotely sensed classifications in thematic maps when compared to a sample of reference classifications made in the field. Published...
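For orientation, the unweighted kappa statistic and one simple large-sample standard-error approximation can be computed as below. The paper derives more refined variance approximations (including weighted and conditional variants), so this sketch is only a baseline under the stated simple approximation.

```python
import numpy as np

def kappa_with_se(confusion):
    """Unweighted Cohen's kappa from a square confusion matrix, plus the
    simple large-sample approximation se ~ sqrt(po(1-po) / (n(1-pe)^2))."""
    C = np.asarray(confusion, dtype=float)
    n = C.sum()
    po = np.trace(C) / n                    # observed agreement
    pe = (C.sum(0) @ C.sum(1)) / n ** 2     # chance-expected agreement
    kappa = (po - pe) / (1 - pe)
    se = np.sqrt(po * (1 - po) / (n * (1 - pe) ** 2))
    return kappa, se
```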
Fu, Mingkun; Perlman, Michael; Lu, Qing; Varga, Csanad
2015-03-25
An accelerated stress approach utilizing the moisture-modified Arrhenius equation and JMP statistical software was used to quantitatively assess the solid-state stability of an investigational oncology drug, MLNA, under the influence of temperature (1/T) and humidity (%RH). Physical stability of MLNA under stress conditions was evaluated by XRPD, DSC, TGA, and DVS, while chemical stability was evaluated by HPLC. The major chemical degradation product was identified as a hydrolysis product of the MLNA drug substance, and was subsequently subjected to a kinetics investigation based on the isoconversion concept. A mathematical model (ln k = -11,991×(1/T) + 0.0298×(%RH) + 29.8823) based on the initial linear kinetics observed for the formation of this degradant at all seven stress conditions was built using the moisture-modified Arrhenius equation and JMP statistical software. Comparison of the predicted versus experimental ln k values gave a mean deviation of 5.8%, an R² value of 0.94, a p-value of 0.0038, and a coefficient of variation of the root mean square error CV(RMSE) of 7.9%. These statistics all indicated a good fit of the model to the stress data for MLNA. Both temperature and humidity were shown to have a statistically significant impact on stability using effect leverage plots (p-value < 0.05 for both 1/T and %RH). Inclusion of a term representing the interaction of relative humidity and temperature (%RH×1/T) was shown not to be justified using analysis of covariance (ANCOVA), which supported the use of the moisture-corrected Arrhenius equation modeling theory. The model was found to be of value in setting specifications and the retest period, and in selecting storage conditions. A model was also generated using only four conditions, as an example from a resource-saving perspective, and was found to provide a good fit to the entire data set. Copyright © 2015 Elsevier B.V. All rights reserved.
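The moisture-modified Arrhenius model quoted above is linear in its coefficients, so it can be fitted by ordinary least squares. A minimal sketch follows, with the caveat that the authors used JMP and isoconversion-based rate constants; the function and argument names here are illustrative.

```python
import numpy as np

def fit_humidity_arrhenius(T_kelvin, rh_percent, k_obs):
    """Least-squares fit of ln k = a*(1/T) + b*(%RH) + c, the
    moisture-modified Arrhenius form quoted in the abstract.
    Returns the coefficient vector (a, b, c)."""
    A = np.column_stack([1.0 / np.asarray(T_kelvin, float),
                         np.asarray(rh_percent, float),
                         np.ones(len(k_obs))])
    coef, *_ = np.linalg.lstsq(A, np.log(k_obs), rcond=None)
    return coef  # compare: ln k = -11,991*(1/T) + 0.0298*(%RH) + 29.8823
```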
Vasilaki, V; Volcke, E I P; Nandi, A K; van Loosdrecht, M C M; Katsou, E
2018-04-26
Multivariate statistical analysis was applied to investigate the dependencies and underlying patterns between N2O emissions and online operational variables (dissolved oxygen and nitrogen component concentrations, temperature and influent flow-rate) during biological nitrogen removal from wastewater. The system under study was a full-scale reactor, for which hourly sensor data were available. The 15-month-long monitoring campaign was divided into 10 sub-periods based on the profile of N2O emissions, using Binary Segmentation. The dependencies between operating variables and N2O emissions fluctuated according to Spearman's rank correlation. The correlation between N2O emissions and nitrite concentrations ranged between 0.51 and 0.78. Correlation >0.7 between N2O emissions and nitrate concentrations was observed in sub-periods with average temperature lower than 12 °C. Hierarchical k-means clustering and principal component analysis linked N2O emission peaks with precipitation events and ammonium concentrations higher than 2 mg/L, especially in sub-periods characterized by low N2O fluxes. Additionally, the highest ranges of measured N2O fluxes belonged to clusters corresponding with NO3-N concentrations less than 1 mg/L in the upstream plug-flow reactor (middle of the oxic zone), indicating slow nitrification rates. The results showed that the range of N2O emissions partially depends on the prior behavior of the system. The principal component analysis validated the findings from the clustering analysis and showed that ammonium, nitrate, nitrite and temperature explained a considerable percentage of the variance in the system for the majority of the sub-periods. The applied statistical methods linked the different ranges of emissions with the system variables, provided insights on the effect of operating conditions on N2O emissions in each sub-period, and can be integrated into N2O emissions data processing at wastewater treatment plants. Copyright © 2018. Published by Elsevier Ltd.
Kim, Youngshin
2008-01-01
The purpose of this study was to investigate the effects of two music therapy approaches, improvisation-assisted desensitization and music-assisted progressive muscle relaxation (PMR) and imagery, on ameliorating the symptoms of music performance anxiety (MPA) among student pianists. Thirty female college pianists (N = 30) were randomly assigned to one of two conditions: (a) an improvised music-assisted desensitization group (n = 15), or (b) a music-assisted PMR and imagery group (n = 15). All participants received 6 weekly music therapy sessions according to their assigned group. Two lab performances were provided, one before and one after the 6 music therapy sessions, as the performance stimuli for MPA. All participants completed pretest and posttest measures that included four types of visual analogue scales (MPA, stress, tension, and comfort), the state portion of Spielberger's State-Trait Anxiety Inventory (STAI), and the Music Performance Anxiety Questionnaire (MPAQ) developed by Lehrer, Goldman, and Strommen (1990). Participants' finger temperatures were also measured. When the results of the music-assisted PMR and imagery condition were compared from pretest to posttest, statistically significant differences occurred in 6 of the 7 measures: MPA, tension, comfort, STAI, MPAQ, and finger temperature, indicating that the music-assisted PMR and imagery treatment was very successful in reducing MPA. For the improvisation-assisted desensitization condition, statistically significant decreases in tension and STAI, together with increases in finger temperature, indicated that this approach was effective in managing MPA to some extent. When the difference scores for the two approaches were compared, there was no statistically significant difference between them for any of the seven measures; therefore, neither treatment condition appeared more effective than the other. Although statistically significant differences were not found between the two groups, a visual analysis of mean difference scores revealed that the music-assisted PMR and imagery condition resulted in greater mean differences from pretest to posttest than the improvisation-assisted desensitization condition across all seven measures. This result may be due to the fact that all participants in the music-assisted PMR and imagery condition followed the procedure easily, while two of the 15 participants in the improvisation-assisted desensitization group had difficulty improvising.
Ranking metrics in gene set enrichment analysis: do they matter?
Zyla, Joanna; Marczyk, Michal; Weiner, January; Polanska, Joanna
2017-05-12
There exist many methods for describing the complex relation between changes of gene expression in molecular pathways or gene ontologies under different experimental conditions. Among them, Gene Set Enrichment Analysis seems to be one of the most commonly used (over 10,000 citations). An important parameter, which can affect the final result, is the choice of the metric used to rank genes; applying a default ranking metric may lead to poor results. In this work, 28 benchmark data sets were used to evaluate the sensitivity and false positive rate of gene set analysis for 16 different ranking metrics, including new proposals. Furthermore, the robustness of the chosen methods to sample size was tested. Using the k-means clustering algorithm, a group of four metrics with the highest performance in terms of overall sensitivity, overall false positive rate and computational load was established: the absolute value of the Moderated Welch Test statistic, the Minimum Significant Difference, the absolute value of the Signal-to-Noise ratio, and the Baumgartner-Weiss-Schindler test statistic. For false positive rate estimation, all selected ranking metrics were robust with respect to sample size. For sensitivity, the absolute value of the Moderated Welch Test statistic and the absolute value of the Signal-to-Noise ratio gave stable results, while the Baumgartner-Weiss-Schindler test and the Minimum Significant Difference showed better results for larger sample sizes. Finally, the Gene Set Enrichment Analysis method with all tested ranking metrics was parallelised and implemented in MATLAB, and is available at https://github.com/ZAEDPolSl/MrGSEA . Choosing a ranking metric in Gene Set Enrichment Analysis has a critical impact on the results of pathway enrichment analysis. The absolute value of the Moderated Welch Test has the best overall sensitivity and the Minimum Significant Difference has the best overall specificity of gene set analysis. When the number of non-normally distributed genes is high, using the Baumgartner-Weiss-Schindler test statistic gives better outcomes; it also finds more enriched pathways than the other tested metrics, which may lead to new biological discoveries.
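As one concrete example of a ranking metric from the list above, the absolute signal-to-noise ratio can be sketched as follows; the array layout is an assumption for illustration.

```python
import numpy as np

def signal_to_noise(expr, labels):
    """Per-gene |signal-to-noise| ranking metric, (mean1 - mean0)/(sd1 + sd0).

    expr: (n_genes, n_samples) expression matrix; labels: 0/1 per sample."""
    g0, g1 = expr[:, labels == 0], expr[:, labels == 1]
    s2n = (g1.mean(axis=1) - g0.mean(axis=1)) / (g1.std(axis=1, ddof=1)
                                                 + g0.std(axis=1, ddof=1))
    return np.abs(s2n)  # the cited study ranks by the absolute value
```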
Hauber, A Brett; González, Juan Marcos; Groothuis-Oudshoorn, Catharina G M; Prior, Thomas; Marshall, Deborah A; Cunningham, Charles; IJzerman, Maarten J; Bridges, John F P
2016-06-01
Conjoint analysis is a stated-preference survey method that can be used to elicit responses that reveal preferences, priorities, and the relative importance of individual features associated with health care interventions or services. Conjoint analysis methods, particularly discrete choice experiments (DCEs), have been increasingly used to quantify the preferences of patients, caregivers, physicians, and other stakeholders. Recent consensus-based guidance on good research practices, including two recent task force reports from the International Society for Pharmacoeconomics and Outcomes Research, has aided in improving the quality of conjoint analyses and DCEs in outcomes research. Nevertheless, uncertainty regarding good research practices for the statistical analysis of data from DCEs persists. There are multiple methods for analyzing DCE data, and understanding the characteristics and appropriate use of different analysis methods is critical to conducting a well-designed DCE study. This report will assist researchers in evaluating and selecting among alternative approaches to the statistical analysis of DCE data. We first present a simplistic DCE example and a simple method for using the resulting data. We then present a pedagogical example of a DCE and one of the most common approaches to analyzing data from such a question format: conditional logit. We then describe some common alternative methods for analyzing these data and the strengths and weaknesses of each alternative. We present the ESTIMATE checklist, which includes a list of questions to consider when justifying the choice of analysis method, describing the analysis, and interpreting the results. Copyright © 2016 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
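A bare-bones view of the conditional logit likelihood for DCE data, mentioned above as the most common analysis approach, is sketched below. Real analyses add respondent identifiers, covariates, and a numerical optimizer (e.g., scipy.optimize.minimize on the negative log-likelihood); all names here are illustrative.

```python
import numpy as np

def conditional_logit_loglik(beta, X, chosen):
    """Log-likelihood of a conditional (McFadden) logit for choice data.

    X: (n_tasks, n_alternatives, n_attributes) attribute levels;
    chosen: index of the selected alternative in each choice task."""
    v = X @ beta                              # utility of each alternative
    v -= v.max(axis=1, keepdims=True)         # shift for numerical stability
    logp = v - np.log(np.exp(v).sum(axis=1, keepdims=True))
    return logp[np.arange(len(chosen)), chosen].sum()
```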
Meta-analysis of the effects of prokinetic agents in patients with functional dyspepsia.
Hiyama, Toru; Yoshihara, Masaharu; Matsuo, Keitaro; Kusunoki, Hiroaki; Kamada, Tomoari; Ito, Masanori; Tanaka, Shinji; Nishi, Nobuo; Chayama, Kazuaki; Haruma, Ken
2007-03-01
Functional dyspepsia (FD) is often treated with prokinetic agents; however, the efficacy of prokinetic agents in patients with FD has recently been questioned. The aim of this study was to perform a meta-analysis of the effects of prokinetic agents in patients with FD. Studies of prokinetic agents, including metoclopramide, domperidone, trimebutine, cisapride, itopride and mosapride, used for the treatment of FD between 1951 and 2005 were identified, and twenty-seven studies were selected. The difference in the probability of response between the interventional drug and placebo was used as the summary statistic for the treatment effect. Meta-regression analysis was used to detect sources of heterogeneity. In total, 1844 subjects were assigned to an experimental arm, and 1591 subjects were assigned to a placebo arm. Publication bias was ruled out by funnel plot and statistical testing (P = 0.975). In the overall analysis, the summary statistic was 0.295 (95% confidence interval: 0.208-0.382, P < 0.001), indicating that the interventional drugs have a 30% excess probability of producing a response compared with placebo. The most significant source of heterogeneity was the year of publication (P < 0.001). The data clearly indicate that prokinetic agents are significantly more effective than placebo in the treatment of FD. Although FD is a chronic condition, efficacy was assessed over short periods; long-term randomized controlled trials are needed to confirm the effect.
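A generic pooling of per-study response-probability differences, the summary statistic used above, might be sketched as follows. The paper does not specify its weighting scheme here, so this inverse-variance fixed-effect version is an assumption for illustration.

```python
import numpy as np

def pooled_risk_difference(events_trt, n_trt, events_plc, n_plc):
    """Inverse-variance fixed-effect pooling of per-study differences in
    response probability (drug minus placebo). Inputs are per-study arrays."""
    p1, p0 = events_trt / n_trt, events_plc / n_plc
    rd = p1 - p0
    var = p1 * (1 - p1) / n_trt + p0 * (1 - p0) / n_plc
    w = 1.0 / var
    pooled = np.sum(w * rd) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)  # estimate, 95% CI
```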
GPUs for statistical data analysis in HEP: a performance study of GooFit on GPUs vs. RooFit on CPUs
NASA Astrophysics Data System (ADS)
Pompili, Alexis; Di Florio, Adriano; CMS Collaboration
2016-10-01
In order to test the computing capabilities of GPUs with respect to traditional CPU cores, a high-statistics toy Monte Carlo technique has been implemented in both the ROOT/RooFit and GooFit frameworks with the purpose of estimating the statistical significance of the structure observed by CMS close to the kinematical boundary of the J/ψφ invariant mass in the three-body decay B+ → J/ψφK+. GooFit is an open data analysis tool under development that interfaces ROOT/RooFit to the CUDA platform on nVidia GPUs. The optimized GooFit application running on GPUs hosted by servers in the Bari Tier2 provides striking speed-ups with respect to the RooFit application parallelised on multiple CPUs by means of the PROOF-Lite tool. The considerable resulting speed-up, obtained by comparing concurrent GooFit processes (enabled by the CUDA Multi Process Service) with a RooFit/PROOF-Lite process using multiple CPU workers, is presented and discussed in detail. By means of GooFit it has also been possible to explore the behaviour of a likelihood-ratio test statistic in different situations, in which the Wilks theorem may apply or may not apply because its regularity conditions are not satisfied.
NASA Astrophysics Data System (ADS)
Mirbaha, Babak; Saffarzadeh, Mahmoud; AmirHossein Beheshty, Seyed; Aniran, MirMoosa; Yazdani, Mirbahador; Shirini, Bahram
2017-10-01
Analysis of vehicle speed under different weather conditions and traffic characteristics is very effective for traffic planning. Since weather conditions and traffic characteristics vary from day to day, prediction of the average speed can be useful in traffic management plans. In this study, traffic and weather data for a two-lane highway located in the northwest of Iran were selected for analysis. After merging the traffic and weather data, a linear regression model was calibrated for speed prediction using the STATA 12.1 statistical and data analysis software. Vehicle flow, percentage of heavy vehicles, vehicle flow in the opposing lane, percentage of heavy vehicles in the opposing lane, rainfall (mm), snowfall, and maximum daily wind speed above 13 m/s were found to be significant variables in the model. The results showed that vehicle flow and heavy-vehicle percentage acquired positive coefficients, indicating that as these variables increase, the average vehicle speed in every weather condition also increases. Vehicle flow in the opposing lane, percentage of heavy vehicles in the opposing lane, rainfall (mm), snowfall, and maximum daily wind speed above 13 m/s acquired negative coefficients, indicating that as these variables increase, the average vehicle speed decreases.
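A minimal sketch of this kind of speed model in Python (statsmodels) is shown below; the data file and column names are placeholders mirroring the abstract's significant variables, not the authors' dataset (which was analyzed in STATA).

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical daily records for the two-lane highway; one row per day.
df = pd.read_csv("two_lane_daily.csv")

# OLS with the abstract's significant predictors (placeholder column names).
model = smf.ols("mean_speed ~ flow + heavy_pct + flow_opp + heavy_pct_opp"
                " + rain_mm + snow + wind_over_13ms", data=df).fit()
print(model.params)  # signs should match the reported coefficients
```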
Lukášová, Ivana; Muselík, Jan; Franc, Aleš; Goněc, Roman; Mika, Filip; Vetchý, David
2017-11-15
Warfarin is an intensively discussed drug with a narrow therapeutic range. There have been cases of bleeding attributed to varying content or altered quality of the active substance. Factor analysis is useful for finding suitable technological parameters leading to high content uniformity of tablets containing a low amount of active substance. The composition of the tabletting blend and the technological procedure were set with respect to factor analysis of previously published results. The correctness of the set parameters was checked by manufacturing and evaluating tablets containing 1-10 mg of warfarin sodium. The robustness of the suggested technology was checked using a "worst case scenario" and statistical evaluation of European Pharmacopoeia (EP) content uniformity limits with respect to the Bergum division and the process capability index (Cpk). To evaluate the quality of the active substance and tablets, a dissolution method was developed (water; EP apparatus II; 25 rpm), allowing for statistical comparison of dissolution profiles. The results obtained prove the suitability of factor analysis for optimizing the composition with respect to previously manufactured batches, and thus the use of meta-analysis under industrial conditions is feasible. Copyright © 2017 Elsevier B.V. All rights reserved.
[The main directions of reforming the service of medical statistics in Ukraine].
Golubchykov, Mykhailo V; Orlova, Nataliia M; Bielikova, Inna V
2018-01-01
Introduction: Implementation of new methods of information support for managerial decision-making should ensure effective health system reform and create conditions for improving the quality of operational management, reasonable planning of medical care and increasing the efficiency of the use of system resources. Reforming the Medical Statistics Service of Ukraine should be considered only in the context of the reform of the entire health system. The aim: This work is an analysis of the current situation and a justification of the main directions of reform of the Medical Statistics Service of Ukraine. Material and methods: A range of methods is used in this work: content analysis, bibliosemantic analysis, and a systematic approach. The information base of the research comprised WHO strategic and program documents and data of the Medical Statistics Center of the Ministry of Health of Ukraine. Review: The Medical Statistics Service of Ukraine has a complete and effective structure, headed by the State Institution "Medical Statistics Center of the Ministry of Health of Ukraine." This institution reports on behalf of the Ministry of Health of Ukraine to the State Statistical Service of Ukraine, the WHO European Office and other international organizations. An analysis of the current situation showed that achieving this goal requires: improving the system of statistical indicators for an adequate assessment of the performance of health institutions, including in the economic aspect; creating a well-developed medical-statistical base for the administrative territories; changing the existing technologies for forming information resources; strengthening the material and technical base of the structural units of the Medical Statistics Service; improving the system of training and retraining of personnel for the medical statistics service; and developing international cooperation in the methodology and practice of medical statistics, including the implementation of internationally accepted methods for collecting, processing, analyzing and disseminating medical-statistical information, as well as creating a medical-statistical service that is adapted to the specifics of market relations in health care and is flexible and sensitive to changes in international methodologies and standards. Conclusions: The data of medical statistics are the basis for managerial decisions by managers at all levels of health care. Reform of the Medical Statistics Service of Ukraine should be considered only in the context of the reform of the entire health system. The main directions of the reform of the medical statistics service in Ukraine are: the introduction of information technologies, improved training of personnel for the service, improved material and technical equipment, and the maximum reuse of the data obtained, which requires the unification of primary data and of the system of indicators. The most difficult area is the formation of information funds and the introduction of modern information technologies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mayer, B. P.; Valdez, C. A.; DeHope, A. J.
Critical to many modern forensic investigations is the chemical attribution of the origin of an illegal drug. This process relies heavily on the identification of compounds indicative of its clandestine or commercial production. The results of these studies can yield detailed information on the method of manufacture, the sophistication of the synthesis operation, the starting material source, and the final product. In the present work, chemical attribution signatures (CAS) associated with the synthesis of the analgesic 3-methylfentanyl, N-(3-methyl-1-phenethylpiperidin-4-yl)-N-phenylpropanamide, were investigated. Six synthesis methods were studied in an effort to identify and classify route-specific signatures. These methods were chosen to minimize the use of scheduled precursors, complicated laboratory equipment, the number of overall steps, and demanding reaction conditions. Using gas and liquid chromatography combined with mass spectrometric methods (GC-QTOF and LC-QTOF) in conjunction with inductively coupled plasma mass spectrometry (ICP-MS), over 240 distinct compounds and elements were monitored. As seen in our previous work on CAS of fentanyl synthesis, the complexity of the resulting data matrix necessitated the use of multivariate statistical analysis. Using partial least squares discriminant analysis (PLS-DA), 62 statistically significant, route-specific CAS were identified. Statistical classification models using a variety of machine learning techniques were then developed with the ability to predict the method of 3-methylfentanyl synthesis from three blind crude samples generated by synthetic chemists without prior experience with these methods.
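PLS-DA, as used above, is commonly implemented as PLS regression onto one-hot class labels; a minimal scikit-learn sketch follows. The feature matrix, the number of components, and the signature-screening step (e.g., VIP scores) are assumptions for illustration, not the authors' workflow.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

def plsda_fit(X, classes, n_components=2):
    """PLS-DA via PLS regression onto one-hot synthesis-route labels.

    X: hypothetical (n_samples, n_analytes) signal matrix;
    classes: route label per crude sample."""
    labels, y_idx = np.unique(classes, return_inverse=True)
    Y = np.eye(len(labels))[y_idx]                 # one-hot encoding
    pls = PLSRegression(n_components=n_components).fit(X, Y)
    predicted = labels[np.argmax(pls.predict(X), axis=1)]
    return pls, predicted
```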
A powerful score-based test statistic for detecting gene-gene co-association.
Xu, Jing; Yuan, Zhongshang; Ji, Jiadong; Zhang, Xiaoshuai; Li, Hongkai; Wu, Xuesen; Xue, Fuzhong; Liu, Yanxun
2016-01-29
The genetic variants identified by genome-wide association studies (GWAS) can only account for a small proportion of the total heritability of complex disease. The existence of gene-gene joint effects, which contain the main effects and their co-association, is one possible explanation for the "missing heritability" problem. Gene-gene co-association refers to the extent to which the joint effects of two genes differ from the main effects, not only due to the traditional interaction under a nearly independent condition but also due to the correlation between genes. Generally, genes tend to work collaboratively within a specific pathway or network contributing to the disease, and specific disease-associated loci will often be highly correlated (e.g. single nucleotide polymorphisms (SNPs) in linkage disequilibrium). Therefore, we propose a novel score-based statistic (SBS) as a gene-based method for detecting gene-gene co-association. Various simulations illustrate that, under different sample sizes, marginal effects of causal SNPs and co-association levels, the proposed SBS performs better than other existing methods, including single-SNP-based and principal component analysis (PCA)-based logistic regression models, the statistics based on canonical correlations (CCU), kernel canonical correlation analysis (KCCU), partial least squares path modeling (PLSPM) and the delta-square (δ²) statistic. Real data analysis of rheumatoid arthritis (RA) further confirmed its advantages in practice. SBS is a powerful and efficient gene-based method for detecting gene-gene co-association.
NASA Astrophysics Data System (ADS)
Jiang, H.; Lin, T.
2017-12-01
Rain-fed corn production systems are subject to sub-seasonal variations of precipitation and temperature during the growing season. Because each growth phase has its own inherent physiological processes, plants require different optimal environmental conditions during each phase. However, this temporal heterogeneity in response to climate variability over the crop lifecycle is often simplified to constant responses in large-scale statistical modeling analyses. To capture the time-variant growing requirements in large-scale statistical analysis, we develop and compare statistical models at various spatial and temporal resolutions to quantify the relationship between corn yield and weather factors for 12 corn belt states from 1981 to 2016. The study compares three spatial resolutions (county, agricultural district, and state scale) and three temporal resolutions (crop growth phase, monthly, and growing season) to characterize the effects of spatial and temporal variability. Our results show that the agricultural district model with growth-phase temporal resolution can explain 52% of the variation in corn yield caused by temperature and precipitation variability. It provides a practical model structure balancing the overfitting problem of the county-specific model against the weak explanatory power of the state-specific model. In the US corn belt, precipitation has a positive impact on corn yield throughout the growing season except for the vegetative stage, while sensitivity to extreme heat is highest from the silking to dough phases. The results also show that the northern counties of the corn belt are less affected by extreme heat but are more vulnerable to water deficiency.
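As a rough illustration of the kind of model compared above, the R sketch below fits a district-level yield regression with growth-phase weather covariates. All column names (prec_veg, prec_silk, edd_silk) and values are invented for the example.

```r
# Yield-weather panel regression at district/growth-phase resolution (toy data).
set.seed(1)
d <- data.frame(
  yield     = rnorm(360, 150, 20),
  district  = factor(rep(1:36, each = 10)),
  year      = rep(1981:1990, times = 36),
  prec_veg  = runif(360),   # precipitation, vegetative stage (hypothetical)
  prec_silk = runif(360),   # precipitation, silking-dough (hypothetical)
  edd_silk  = runif(360)    # extreme-heat degree days, silking-dough (hypothetical)
)
m <- lm(yield ~ prec_veg + prec_silk + edd_silk + district + factor(year), data = d)
summary(m)$r.squared        # share of yield variance explained by the model
```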
Bogaert, Kenny A; Manoharan-Basil, Sheeba S; Perez, Emilie; Levine, Raphael D; Remacle, Francoise; Remacle, Claire
2018-01-01
The usual cultivation mode of the green microalga Chlamydomonas is liquid medium and light. However, the microalga can also be grown on agar plates and in darkness. Our aim is to analyze and compare gene expression of cells cultivated in these different conditions. For that purpose, RNA-seq data were obtained from Chlamydomonas samples from two different labs grown in four environmental conditions (agar@light, agar@dark, liquid@light, liquid@dark). The RNA-seq data are analyzed by surprisal analysis, which allows the simultaneous meta-analysis of all the samples. First, we identify a balance state, which defines a state where the expression levels are similar in all the samples irrespective of their growth conditions or lab origin. In addition, our analysis identifies the additional constraints needed to quantify the deviation with respect to the balance state. The first constraint differentiates the agar samples from the liquid ones; the second constraint, the dark samples from the light ones. The two constraints are of almost equal importance. Pathways involved in stress responses are found in the agar phenotype, while the liquid phenotype comprises ATP and NADH production pathways. Remodeling of membranes is suggested in the dark phenotype, while photosynthetic pathways characterize the light phenotype. The same trends are also present when performing purely statistical analyses such as K-means clustering and differentially expressed gene analysis.
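Surprisal analysis is commonly computed as a singular value decomposition of the log-expression matrix, with the leading pattern playing the role of the balance state and subsequent patterns the constraints. A minimal R sketch on simulated counts (the four condition labels follow the abstract; the numbers are invented):

```r
# Surprisal-analysis sketch: SVD of ln(expression); simulated data only.
set.seed(1)
expr <- matrix(rexp(500 * 4) + 1, nrow = 500,
               dimnames = list(NULL, c("agar_light", "agar_dark",
                                       "liquid_light", "liquid_dark")))
s <- svd(log(expr))
lambda <- s$v %*% diag(s$d)          # per-sample weight of each transcription pattern
rownames(lambda) <- colnames(expr)
round(lambda[, 1:3], 2)              # col 1 ~ balance state; cols 2-3 ~ constraints
```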
The large sample size fallacy.
Lantz, Björn
2013-06-01
Significance in the statistical sense has little to do with significance in the common practical sense. Statistical significance is a necessary but not a sufficient condition for practical significance. Hence, results that are extremely statistically significant may be highly nonsignificant in practice. The degree of practical significance is generally determined by the size of the observed effect, not the p-value. The results of studies based on large samples are often characterized by extreme statistical significance despite small or even trivial effect sizes. Interpreting such results as significant in practice without further analysis is referred to as the large sample size fallacy in this article. The aim of this article is to explore the relevance of the large sample size fallacy in contemporary nursing research. Relatively few nursing articles display explicit measures of observed effect sizes or include a qualitative discussion of observed effect sizes. Statistical significance is often treated as an end in itself. Effect sizes should generally be calculated and presented along with p-values for statistically significant results, and observed effect sizes should be discussed qualitatively through direct and explicit comparisons with the effects in related literature. © 2012 Nordic College of Caring Science.
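The point generalizes to a two-line demonstration: with a large enough n, a trivial difference becomes "statistically significant" while the effect size stays negligible. A hedged R sketch with simulated data:

```r
# Large-sample fallacy: tiny effect, huge n, small p-value (simulated data).
set.seed(42)
n <- 50000
a <- rnorm(n); b <- rnorm(n, mean = 0.03)          # trivial true difference
t.test(a, b)$p.value                               # typically < .05
(mean(b) - mean(a)) / sqrt((var(a) + var(b)) / 2)  # Cohen's d ~ 0.03: negligible
```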
NASA Astrophysics Data System (ADS)
Hendikawati, P.; Arifudin, R.; Zahid, M. Z.
2018-03-01
This study aims to design an Android statistics data analysis application that can be accessed through mobile devices, making it easier for users to access. The application covers various topics in basic statistics along with a parametric statistical data analysis module. The output of this application system is a parametric statistical data analysis that can be used by students, lecturers, and other users who need the results of statistical calculations quickly and in an easily understood form. The Android application is developed using the Java programming language. The server-side programming language is PHP with the CodeIgniter framework, and the database used is MySQL. The system development methodology is the Waterfall methodology, with the stages of analysis, design, coding, testing, implementation, and system maintenance. This statistical data analysis application is expected to support statistics lectures and make it easier for students to carry out statistical analysis on mobile devices.
Effects of the water level on the flow topology over the Bolund island
NASA Astrophysics Data System (ADS)
Cuerva-Tejero, A.; Yeow, T. S.; Gallego-Castillo, C.; Lopez-Garcia, O.
2014-06-01
We have analyzed the influence of the actual height of Bolund island above water level on different full-scale statistics of the velocity field over the peninsula. Our analysis is focused on the database of 10-minute statistics provided by Risø-DTU for the Bolund Blind Experiment. We have considered 10-minute periods with near-neutral atmospheric conditions, mean wind speed values in the interval [5,20] m/s, and westerly wind directions. As expected, statistics such as speed-up, normalized increase of turbulent kinetic energy, and probability of recirculating flow show a large dependence on the emerged height of the island for the locations close to the escarpment. For the published ensemble mean values of speed-up and normalized increase of turbulent kinetic energy at these locations, we propose that some amount of uncertainty could be explained as a deterministic dependence of the flow field statistics upon the actual height of the Bolund island above sea level.
NASA Astrophysics Data System (ADS)
Yan, Rui; Parrot, Michel; Pinçon, Jean-Louis
2017-12-01
In this paper, we present the result of a statistical study performed on the ionospheric ion density variations above areas of seismic activity. The ion density was observed by the low altitude satellite DEMETER between 2004 and 2010. In the statistical analysis a superposed epoch method is used where the observed ionospheric ion density close to the epicenters both in space and in time is compared to background values recorded at the same location and in the same conditions. Data associated with aftershocks have been carefully removed from the database to prevent spurious effects on the statistics. It is shown that, during nighttime, anomalous ionospheric perturbations related to earthquakes with magnitudes larger than 5 are evidenced. At the time of these perturbations the background ion fluctuation departs from a normal distribution. They occur up to 200 km from the epicenters and mainly 5 days before the earthquakes. As expected, an ion density perturbation occurring just after the earthquakes and close to the epicenters is also evidenced.
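A superposed epoch analysis of this kind can be sketched in a few lines of R: windows of the ion-density series are stacked around event times and compared with the series background. The series, event times, and window below are simulated placeholders.

```r
# Superposed-epoch sketch: average anomaly around event times (simulated data).
set.seed(1)
density <- rnorm(5000, 100, 5)                   # hypothetical daily ion density
events  <- sample(200:4800, 40)                  # hypothetical earthquake days
win     <- -15:5                                 # days relative to each event
epochs  <- t(sapply(events, function(e) density[e + win]))
colMeans(epochs) - mean(density)                 # mean anomaly per epoch day
```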
Statistical modeling of optical attenuation measurements in continental fog conditions
NASA Astrophysics Data System (ADS)
Khan, Muhammad Saeed; Amin, Muhammad; Awan, Muhammad Saleem; Minhas, Abid Ali; Saleem, Jawad; Khan, Rahimdad
2017-03-01
Free-space optics is an innovative technology that uses the atmosphere as a propagation medium to provide higher data rates. These links are heavily affected by the atmospheric channel, mainly because of fog and clouds that scatter and even block the modulated beam of light from reaching the receiver end, hence imposing severe attenuation. A comprehensive statistical study of fog effects and a deep physical understanding of the fog phenomena are very important for suggesting improvements (reliability and efficiency) in such communication systems. In this regard, 6 months of real-time measured fog attenuation data are considered and statistically investigated. A detailed statistical analysis of each fog event in that period is presented; the best probability density functions are selected on the basis of the Akaike information criterion, while the estimates of unknown parameters are computed by the maximum likelihood estimation technique. The results show that most fog attenuation events follow a normal mixture distribution and some follow the Weibull distribution.
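The selection step can be sketched in R with MASS::fitdistr for the maximum likelihood fits and AIC for comparison; the attenuation sample below is simulated, and a normal-mixture fit (as selected in the study) would additionally need an EM routine such as the one in the mclust package.

```r
# MLE fits and AIC comparison for fog attenuation samples (simulated data).
library(MASS)
set.seed(3)
att <- rweibull(500, shape = 1.8, scale = 30)    # stand-in attenuation values, dB/km
fw <- fitdistr(att, "weibull")
fl <- fitdistr(att, "lognormal")
c(weibull = AIC(fw), lognormal = AIC(fl))        # smaller AIC preferred
```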
NASA Astrophysics Data System (ADS)
Huang, Haiping
2017-05-01
Revealing hidden features in unlabeled data is called unsupervised feature learning, which plays an important role in pretraining a deep neural network. Here we provide a statistical mechanics analysis of the unsupervised learning in a restricted Boltzmann machine with binary synapses. A message passing equation to infer the hidden feature is derived, and furthermore, variants of this equation are analyzed. A statistical analysis by replica theory describes the thermodynamic properties of the model. Our analysis confirms an entropy crisis preceding the non-convergence of the message passing equation, suggesting a discontinuous phase transition as a key characteristic of the restricted Boltzmann machine. Continuous phase transition is also confirmed depending on the embedded feature strength in the data. The mean-field result under the replica symmetric assumption agrees with that obtained by running message passing algorithms on single instances of finite sizes. Interestingly, in an approximate Hopfield model, the entropy crisis is absent, and a continuous phase transition is observed instead. We also develop an iterative equation to infer the hyper-parameter (temperature) hidden in the data, which in physics corresponds to iteratively imposing Nishimori condition. Our study provides insights towards understanding the thermodynamic properties of the restricted Boltzmann machine learning, and moreover important theoretical basis to build simplified deep networks.
Analysis of Loss-of-Offsite-Power Events 1997-2015
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Nancy Ellen; Schroeder, John Alton
2016-07-01
Loss of offsite power (LOOP) can have a major negative impact on a power plant's ability to achieve and maintain safe shutdown conditions. LOOP event frequencies and the times required for subsequent restoration of offsite power are important inputs to plant probabilistic risk assessments. This report presents a statistical and engineering analysis of LOOP frequencies and durations at U.S. commercial nuclear power plants. The data used in this study are based on operating experience during calendar years 1997 through 2015. LOOP events during critical operation that do not result in a reactor trip are not included. Frequencies and durations were determined for four event categories: plant-centered, switchyard-centered, grid-related, and weather-related. Emergency diesel generator reliability is also considered (failure to start, failure to load and run, and failure to run more than 1 hour). There is an adverse trend in LOOP durations. The previously reported adverse trend in LOOP frequency was not statistically significant for 2006-2015. Grid-related LOOPs happen predominantly in the summer. Switchyard-centered LOOPs happen predominantly in winter and spring. Plant-centered and weather-related LOOPs do not show statistically significant seasonality. The engineering analysis of LOOP data shows that human errors have been much less frequent since 1997 than in the 1986-1996 time period.
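Trend testing on annual event counts of this kind is often done with Poisson regression; a minimal R sketch with invented counts (not the report's data):

```r
# Trend test for annual LOOP counts via Poisson regression (invented counts).
set.seed(1)
yr <- 1997:2015
n_loop <- rpois(length(yr), lambda = 3)          # hypothetical events per year
m <- glm(n_loop ~ yr, family = poisson)
summary(m)$coefficients["yr", ]                  # slope and its p-value
```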
Nearfield Summary and Statistical Analysis of the Second AIAA Sonic Boom Prediction Workshop
NASA Technical Reports Server (NTRS)
Park, Michael A.; Nemec, Marian
2017-01-01
A summary is provided for the Second AIAA Sonic Boom Workshop held 8-9 January 2017 in conjunction with AIAA SciTech 2017. The workshop used three required models of increasing complexity: an axisymmetric body, a wing body, and a complete configuration with flow-through nacelle. An optional complete configuration with propulsion boundary conditions is also provided. These models are designed with similar nearfield signatures to isolate geometry and shock/expansion interaction effects. Eleven international participant groups submitted nearfield signatures with forces, pitching moment, and iterative convergence norms. Statistics and grid convergence of these nearfield signatures are presented. These submissions are propagated to the ground, and noise levels are computed, which allows the grid convergence and the statistical distribution of each noise level to be examined. While progress since the first workshop is documented, improvements to the analysis methods for a possible subsequent workshop are suggested. The complete configuration with flow-through nacelle showed the most dramatic improvement between the two workshops. The current workshop cases are more relevant to vehicles with lower loudness and have the potential for lower annoyance than the first workshop cases. The models for this workshop, with quieter ground noise levels than the first workshop, exposed weaknesses in analysis, particularly in convective discretization.
Kim, Min-Uk; Moon, Kyong Whan; Sohn, Jong-Ryeul; Byeon, Sang-Hoon
2018-05-18
We studied sensitive weather variables for consequence analysis, in the case of chemical leaks on the user side of offsite consequence analysis (OCA) tools. We used OCA tools Korea Offsite Risk Assessment (KORA) and Areal Location of Hazardous Atmospheres (ALOHA) in South Korea and the United States, respectively. The chemicals used for this analysis were 28% ammonia (NH₃), 35% hydrogen chloride (HCl), 50% hydrofluoric acid (HF), and 69% nitric acid (HNO₃). The accident scenarios were based on leakage accidents in storage tanks. The weather variables were air temperature, wind speed, humidity, and atmospheric stability. Sensitivity analysis was performed using the Statistical Package for the Social Sciences (SPSS) program for dummy regression analysis. Sensitivity analysis showed that impact distance was not sensitive to humidity. Impact distance was most sensitive to atmospheric stability, and was also more sensitive to air temperature than wind speed, according to both the KORA and ALOHA tools. Moreover, the weather variables were more sensitive in rural conditions than in urban conditions, with the ALOHA tool being more influenced by weather variables than the KORA tool. Therefore, if using the ALOHA tool instead of the KORA tool in rural conditions, users should be careful not to cause any differences in impact distance due to input errors of weather variables, with the most sensitive one being atmospheric stability.
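The dummy-variable regression behind such a sensitivity ranking looks roughly like the following R sketch, where atmospheric stability enters as a factor (its classes become dummy variables); the impact distances and predictors are invented.

```r
# Dummy regression for sensitivity of impact distance to weather (toy data).
set.seed(5)
d <- data.frame(dist  = rnorm(120, 500, 50),
                temp  = runif(120, -5, 35),
                wind  = runif(120, 1, 10),
                humid = runif(120, 20, 90),
                stab  = factor(sample(LETTERS[1:6], 120, TRUE)))  # Pasquill A-F
m <- lm(dist ~ temp + wind + humid + stab, data = d)
anova(m)    # variance attributed to each term; stability typically dominates
```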
ERIC Educational Resources Information Center
Martuza, Victor R.; Engel, John D.
Results from classical power analysis (Brewer, 1972) suggest that a researcher should not set a=p (when p is less than a) in a posteriori fashion when a study yields statistically significant results because of a resulting decrease in power. The purpose of the present report is to use Bayesian theory in examining the validity of this…
George L. Farnsworth; James D. Nichols; John R. Sauer; Steven G. Fancy; Kenneth H. Pollock; Susan A. Shriner; Theodore R. Simons
2005-01-01
Point counts are a standard sampling procedure for many bird species, but lingering concerns still exist about the quality of information produced from the method. It is well known that variation in observer ability and environmental conditions can influence the detection probability of birds in point counts, but many biologists have been reluctant to abandon point...
Michael Arbaugh; Larry Bednar
1996-01-01
The sampling methods used to monitor ozone injury to ponderosa and Jeffrey pines depend on the objectives of the study, geographic and genetic composition of the forest, and the source and composition of air pollutant emissions. By using a standardized sampling methodology, it may be possible to compare conditions within local areas more accurately, and to apply the...
Optimization of technological equipment used in the laser-radiation hardening of instruments
NASA Astrophysics Data System (ADS)
Tverdokhlebov, G. N.; Maznichenko, S. A.
Results of a statistical analysis of an instrument intended for laser hardening are presented. The kinematics of the positioning and fastening of an instrument for uniform laser-pulse treatment is analyzed. The results are used to devise an automatic device and the procedure for laser treatment under optimized conditions of various rotary cutting instruments, such as milling cutters, drills, and counterbores.
Working conditions, socioeconomic factors and low birth weight: path analysis.
Mahmoodi, Zohreh; Karimlou, Masoud; Sajjadi, Homeira; Dejman, Masoumeh; Vameghi, Meroe; Dolatian, Mahrokh
2013-09-01
In recent years, with socioeconomic changes in society, the presence of women in the workplace has become inevitable. Differences in working conditions, especially for pregnant women, can have adverse consequences such as low birth weight. This study was conducted with the aim of modeling the relationship between working conditions, socioeconomic factors, and birth weight. The study used a case-control design. The control group consisted of 500 women with normal-weight babies, and the case group of 250 women with low-weight babies, from selected hospitals in Tehran. Data were collected using a researcher-made questionnaire on mothers' lifestyle during pregnancy, designed with a social-determinants-of-health approach. This questionnaire investigated women's occupational lifestyle in terms of working conditions, activities, and job satisfaction. Data were analyzed with SPSS-16 and LISREL 8.8 software using statistical path analysis. The final path model fitted well (CFI = 1, RMSEA = 0.00) and showed that among direct paths, working condition (β = -0.032); among indirect paths, household income (β = -0.42); and in the overall effect, unemployed spouse (β = -0.1828) had the greatest effects on low birth weight. Negative coefficients indicate a decreasing effect on birth weight. Based on the path analysis model, working conditions and socioeconomic status directly and indirectly influence birth weight. Thus, in addition to treatment and health care (the biological aspect), special attention must also be paid to mothers' socioeconomic factors.
The educational value of consumer-targeted prescription drug print advertising.
Bell, R A; Wilkes, M S; Kravitz, R L
2000-12-01
The case for direct-to-consumer (DTC) prescription drug advertising has often been based on the argument that such promotions can educate the public about medical conditions and associated treatments. Our content analysis of DTC advertising assessed the extent to which such educational efforts have been attempted. We collected advertisements appearing in 18 popular magazines from 1989 through 1998. Two coders independently evaluated 320 advertisements encompassing 101 drug brands to determine if information appeared about specific aspects of the medical conditions for which the drug was promoted and about the treatment (mean kappa reliability=0.91). We employed basic descriptive statistics using the advertisement as the unit of analysis and cross-tabulations using the brand as the unit of analysis. Virtually all the advertisements gave the name of the condition treated by the promoted drug, and a majority provided information about the symptoms of that condition. However, few reported details about the condition's precursors or its prevalence; attempts to clarify misconceptions about the condition were also rare. The advertisements seldom provided information about the drug's mechanism of action, its success rate, treatment duration, alternative treatments, and behavioral changes that could enhance the health of affected patients. Informative advertisements were identified, but most of the promotions provided only a minimal amount of information. Strategies for improving the educational value of DTC advertisements are considered.
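Cohen's kappa, the reliability statistic quoted above, has a short closed form; a base-R sketch with invented presence/absence codes from two coders:

```r
# Cohen's kappa for two coders on one binary content category (invented codes).
tab <- table(coder1 = c(1,1,0,1,0,1,1,0,0,1),
             coder2 = c(1,1,0,1,0,1,0,0,0,1))
po <- sum(diag(tab)) / sum(tab)                      # observed agreement
pe <- sum(rowSums(tab) * colSums(tab)) / sum(tab)^2  # chance-expected agreement
(po - pe) / (1 - pe)                                 # kappa
```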
Kleikers, Pamela W M; Hooijmans, Carlijn; Göb, Eva; Langhauser, Friederike; Rewell, Sarah S J; Radermacher, Kim; Ritskes-Hoitinga, Merel; Howells, David W; Kleinschnitz, Christoph; Schmidt, Harald H H W
2015-08-27
Biomedical research suffers from a dramatically poor translational success. For example, in ischemic stroke, a condition with a high medical need, over a thousand experimental drug targets were unsuccessful. Here, we adopt methods from clinical research for a late-stage pre-clinical meta-analysis (MA) and randomized confirmatory trial (pRCT) approach. A profound body of literature suggests NOX2 to be a major therapeutic target in stroke. Systematic review and MA of all available NOX2(-/y) studies revealed a positive publication bias and lack of statistical power to detect a relevant reduction in infarct size. A fully powered multi-center pRCT rejects NOX2 as a target to improve neurofunctional outcomes or achieve a translationally relevant infarct size reduction. Thus stringent statistical thresholds, reporting negative data and a MA-pRCT approach can ensure biomedical data validity and overcome risks of bias.
Pinheiro, Rubiane C; Soares, Cleide M F; de Castro, Heizir F; Moraes, Flavio F; Zanin, Gisella M
2008-03-01
The conditions for maximization of the enzymatic activity of lipase entrapped in sol-gel matrix were determined for different vegetable oils using an experimental design. The effects of pH, temperature, and biocatalyst loading on lipase activity were verified using a central composite experimental design leading to a set of 13 assays and the surface response analysis. For canola oil and entrapped lipase, statistical analyses showed significant effects for pH and temperature and also the interactions between pH and temperature and temperature and biocatalyst loading. For the olive oil and entrapped lipase, it was verified that the pH was the only variable statistically significant. This study demonstrated that response surface analysis is a methodology appropriate for the maximization of the percentage of hydrolysis, as a function of pH, temperature, and lipase loading.
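Response-surface analysis of a design like this reduces to fitting a second-order polynomial in the coded factors; the R sketch below uses a full factorial in coded levels rather than the study's 13-run central composite design, and all responses are invented.

```r
# Second-order response-surface fit for % hydrolysis (toy coded design).
set.seed(9)
d <- expand.grid(pH = c(-1, 0, 1), temp = c(-1, 0, 1), load = c(-1, 0, 1))
d$hydrolysis <- 60 + 5*d$pH + 8*d$temp - 3*d$pH*d$temp + rnorm(nrow(d), 0, 2)
m <- lm(hydrolysis ~ (pH + temp + load)^2 + I(pH^2) + I(temp^2) + I(load^2),
        data = d)
summary(m)   # significant linear/quadratic/interaction terms locate the optimum
```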
Stupák, Ivan; Pavloková, Sylvie; Vysloužil, Jakub; Dohnal, Jiří; Čulen, Martin
2017-11-23
Biorelevant dissolution instruments represent an important tool for pharmaceutical research and development. These instruments are designed to simulate the dissolution of drug formulations in conditions most closely mimicking the gastrointestinal tract. In this work, we focused on the optimization of dissolution compartments/vessels for an updated version of the biorelevant dissolution apparatus, Golem v2. We designed eight compartments of uniform size but different inner geometry. The dissolution performance of the compartments was tested using immediate-release caffeine tablets and evaluated by standard statistical methods and principal component analysis. Based on two phases of dissolution testing (using 250 and 100 mL of dissolution medium), we selected two compartment types yielding the highest measurement reproducibility. We also confirmed a statistically significant effect of agitation rate and dissolution volume on the extent of drug dissolved and on measurement reproducibility.
Statistical properties of the radiation belt seed population
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boyd, A. J.; Spence, H. E.; Huang, C. -L.
Here, we present a statistical analysis of phase space density data from the first 26 months of the Van Allen Probes mission. In particular, we investigate the relationship between the tens and hundreds of keV seed electrons and the >1 MeV core radiation belt electron population. Using a cross-correlation analysis, we find that the seed and core populations are well correlated with a coefficient of ≈0.73 at a time lag of 10-15 h. We present evidence of a seed population threshold that is necessary for subsequent acceleration. The depth of penetration of the seed population determines the inner boundary of the acceleration process. However, we show that an enhanced seed population alone is not enough to produce acceleration at the higher energies, implying that the seed population of hundreds of keV electrons is only one of several conditions required for MeV electron radiation belt acceleration.
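The lagged cross-correlation at the heart of this analysis maps directly onto R's ccf(); in the sketch below the core series is constructed to lag a simulated seed series by 12 steps, and the peak lag is recovered (its sign follows R's convention that ccf(x, y) estimates cor(x[t+k], y[t])).

```r
# Lagged cross-correlation between seed and core time series (simulated).
set.seed(2)
seed_e <- as.numeric(arima.sim(list(ar = 0.9), 1000))         # stand-in seed PSD, hourly
core_e <- c(rep(0, 12), seed_e[1:988]) + rnorm(1000, 0, 0.5)  # core lags by 12 h
cc <- ccf(seed_e, core_e, lag.max = 48, plot = FALSE)
cc$lag[which.max(cc$acf)]        # ~ -12: seed leads core by about 12 hours
```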
NASA Astrophysics Data System (ADS)
Su, Xing; Meng, Xingmin; Ye, Weilin; Wu, Weijiang; Liu, Xingrong; Wei, Wanhong
2018-03-01
Tianshui City is one of the mountainous cities threatened by severe geo-hazards in Gansu Province, China. Statistical probability models have been widely used in analyzing and evaluating geo-hazards such as landslides. In this research, three approaches (the Certainty Factor Method, the Weight of Evidence Method, and the Information Quantity Method) were adopted to quantitatively analyze the relationship between the causative factors and the landslides. The source data used in this study include the SRTM DEM and local geological maps at a scale of 1:200,000. Twelve causative factors (altitude, slope, aspect, curvature, plan curvature, profile curvature, roughness, relief amplitude, distance to rivers, distance to faults, distance to roads, and stratum lithology) were selected for correlation analysis after thorough investigation of the geological conditions and historical landslides. The results indicate that the outcomes of the three models are fairly consistent.
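Of the three models, the Weight of Evidence Method is the easiest to state compactly: for each factor class it contrasts the log odds of that class among landslide cells against non-landslide cells. A base-R sketch with invented cell counts:

```r
# Weight of Evidence for one binary factor class (invented cell counts).
n_slide_in  <- 120; n_slide_out  <- 380    # landslide cells inside/outside class
n_stable_in <- 900; n_stable_out <- 8600   # stable cells inside/outside class
w_plus  <- log((n_slide_in  / (n_slide_in + n_slide_out)) /
               (n_stable_in / (n_stable_in + n_stable_out)))
w_minus <- log((n_slide_out / (n_slide_in + n_slide_out)) /
               (n_stable_out / (n_stable_in + n_stable_out)))
c(Wplus = w_plus, Wminus = w_minus, contrast = w_plus - w_minus)
```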
Rolling-Element Fatigue Testing and Data Analysis - A Tutorial
NASA Technical Reports Server (NTRS)
Vlcek, Brian L.; Zaretsky, Erwin V.
2011-01-01
In order to rank bearing materials, lubricants and other design variables using rolling-element bench type fatigue testing of bearing components and full-scale rolling-element bearing tests, the investigator needs to be cognizant of the variables that affect rolling-element fatigue life and be able to maintain and control them within an acceptable experimental tolerance. Once these variables are controlled, the number of tests and the test conditions must be specified to assure reasonable statistical certainty of the final results. There is a reasonable correlation between the results from elemental test rigs with those results obtained with full-scale bearings. Using the statistical methods of W. Weibull and L. Johnson, the minimum number of tests required can be determined. This paper brings together and discusses the technical aspects of rolling-element fatigue testing and data analysis as well as making recommendations to assure quality and reliable testing of rolling-element specimens and full-scale rolling-element bearings.
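The Weibull treatment referred to above boils down to fitting a two-parameter Weibull to observed lives and reading off percentile lives such as L10; an R sketch with invented fatigue lives:

```r
# Two-parameter Weibull fit and L10 life for fatigue data (invented lives).
library(MASS)
life <- c(12, 18, 23, 31, 36, 44, 52, 65, 80, 110)   # millions of stress cycles
fit <- fitdistr(life, "weibull")
qweibull(0.10, shape = fit$estimate["shape"],
         scale = fit$estimate["scale"])              # L10: 10% failure life
```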
Mallette, Jennifer R; Casale, John F; Jordan, James; Morello, David R; Beyer, Paul M
2016-03-23
Previously, geo-sourcing to five major coca growing regions within South America was accomplished. However, the expansion of coca cultivation throughout South America made sub-regional origin determinations increasingly difficult. The former methodology was recently enhanced with additional stable isotope analyses (²H and ¹⁸O) to fully characterize cocaine due to the varying environmental conditions in which the coca was grown. An improved data analysis method was implemented with the combination of machine learning and multivariate statistical analysis methods to provide further partitioning between growing regions. Here, we show how the combination of trace cocaine alkaloids, stable isotopes, and multivariate statistical analyses can be used to classify illicit cocaine as originating from one of 19 growing regions within South America. The data obtained through this approach can be used to describe current coca cultivation and production trends, highlight trafficking routes, as well as identify new coca growing regions.
Analysis of responses of cold pressor tests on pilots and executives
NASA Technical Reports Server (NTRS)
Swaroop, R.
1977-01-01
Statistical analyses were performed to study the relationship between cold pressor test responses and certain medical attributes of a group of 81 pilots and a group of 466 executives. The important results of this study were as follows: There was a significant relationship between a subject's cold pressor test response and his profession (that is, pilot or executive). The executives' diastolic cold pressor test responses were significantly related to their medical conditions, and their families' medical conditions. Significant relationships were observed between executives' diastolic and systolic cold pressor test responses and their history of tranquilizer and cardiac drug use.
NASA Technical Reports Server (NTRS)
Malila, W. A.; Cicone, R. C.; Gleason, J. M.
1976-01-01
Simulated scanner system data values generated in support of LACIE (Large Area Crop Inventory Experiment) research and development efforts are presented. Synthetic inband (LANDSAT) wheat radiances and radiance components were computed and are presented for various wheat canopy and atmospheric conditions and scanner view geometries. Values include: (1) inband bidirectional reflectances for seven stages of wheat crop growth; (2) inband atmospheric features; and (3) inband radiances corresponding to the various combinations of wheat canopy and atmospheric conditions. Analyses of these data values are presented in the main report.
NASA Astrophysics Data System (ADS)
Ul'yanov, A. S.; Lyapina, A. M.; Ulianova, O. V.; Fedorova, V. A.; Uianov, S. S.
2011-04-01
Specific statistical characteristics of biospeckles, emerging under the diffraction of coherent beams on the bacterial colonies, are studied. The dependence of the fractal dimensions of biospeckles on the conditions of both illumination and growth of the colonies is studied theoretically and experimentally. Particular attention is paid to the fractal properties of biospeckles, emerging under the scattering of light by the colonies of the vaccinal strain of the plague microbe. The possibility in principle to classify the colonies of Yersinia pestis EV NIIEG using the fractal dimension analysis is demonstrated.
NASA Astrophysics Data System (ADS)
Holden, Todd; Marchese, P.; Tremberger, G., Jr.; Cheung, E.; Subramaniam, R.; Sullivan, R.; Schneider, P.; Flamholz, A.; Lieberman, D.; Cheung, T.
2008-08-01
We have characterized function related DNA sequences of various organisms using informatics techniques, including fractal dimension calculation, nucleotide and multi-nucleotide statistics, and sequence fluctuation analysis. Our analysis shows trends which differentiate extremophile from non-extremophile organisms, which could be reproduced in extraterrestrial life. Among the systems studied are radiation repair genes, genes involved in thermal shocks, and genes involved in drug resistance. We also evaluate sequence level changes that have occurred during short term evolution (several thousand generations) under extreme conditions.
NASA Technical Reports Server (NTRS)
Simon, William E.; Li, Ku-Yen; Yaws, Carl L.; Mei, Harry T.; Nguyen, Vinh D.; Chu, Hsing-Wei
1994-01-01
A methyl acetate reactor was developed to perform a subscale kinetic investigation in the design and optimization of a full-scale metabolic simulator for long term testing of life support systems. Other tasks in support of the closed ecological life support system test program included: (1) heating, ventilation and air conditioning analysis of a variable pressure growth chamber, (2) experimental design for statistical analysis of plant crops, (3) resource recovery for closed life support systems, and (4) development of data acquisition software for automating an environmental growth chamber.
System level modeling and component level control of fuel cells
NASA Astrophysics Data System (ADS)
Xue, Xingjian
This dissertation investigates the fuel cell systems and the related technologies in three aspects: (1) system-level dynamic modeling of both PEM fuel cell (PEMFC) and solid oxide fuel cell (SOFC); (2) condition monitoring scheme development of PEM fuel cell system using model-based statistical method; and (3) strategy and algorithm development of precision control with potential application in energy systems. The dissertation first presents a system level dynamic modeling strategy for PEM fuel cells. It is well known that water plays a critical role in PEM fuel cell operations. It makes the membrane function appropriately and improves the durability. The low temperature operating conditions, however, impose modeling difficulties in characterizing the liquid-vapor two phase change phenomenon, which becomes even more complex under dynamic operating conditions. This dissertation proposes an innovative method to characterize this phenomenon, and builds a comprehensive model for PEM fuel cell at the system level. The model features the complete characterization of multi-physics dynamic coupling effects with the inclusion of dynamic phase change. The model is validated using Ballard stack experimental result from open literature. The system behavior and the internal coupling effects are also investigated using this model under various operating conditions. Anode-supported tubular SOFC is also investigated in the dissertation. While the Nernst potential plays a central role in characterizing the electrochemical performance, the traditional Nernst equation may lead to incorrect analysis results under dynamic operating conditions due to the current reverse flow phenomenon. This dissertation presents a systematic study in this regard to incorporate a modified Nernst potential expression and the heat/mass transfer into the analysis. The model is used to investigate the limitations and optimal results of various operating conditions; it can also be utilized to perform the optimal design of tubular SOFC. With the system-level dynamic model as a basis, a framework for the robust, online monitoring of PEM fuel cell is developed in the dissertation. The monitoring scheme employs the Hotelling T2 based statistical scheme to handle the measurement noise and system uncertainties and identifies the fault conditions through a series of self-checking and conformal testing. A statistical sampling strategy is also utilized to improve the computation efficiency. Fuel/gas flow control is the fundamental operation for fuel cell energy systems. In the final part of the dissertation, a high-precision and robust tracking control scheme using piezoelectric actuator circuit with direct hysteresis compensation is developed. The key characteristic of the developed control algorithm includes the nonlinear continuous control action with the adaptive boundary layer strategy.
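The Hotelling T² scheme mentioned for condition monitoring compares each new multivariate operating point with a healthy-baseline mean and covariance; a minimal R sketch with simulated sensor vectors (the chi-square limit is the usual large-sample approximation):

```r
# Hotelling T^2 condition monitoring sketch (simulated 3-sensor baseline).
set.seed(4)
healthy <- matrix(rnorm(200 * 3), ncol = 3)          # baseline observations
mu <- colMeans(healthy); Sinv <- solve(cov(healthy))
t2 <- function(x) drop(t(x - mu) %*% Sinv %*% (x - mu))
limit <- qchisq(0.99, df = 3)                        # approximate control limit
t2(c(0.1, -0.2, 0.3)) > limit                        # FALSE: normal condition
t2(c(4, 4, 4)) > limit                               # TRUE: flagged as fault
```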
MAVTgsa: An R Package for Gene Set (Enrichment) Analysis
Chien, Chih-Yi; Chang, Ching-Wei; Tsai, Chen-An; ...
2014-01-01
Gene set analysis methods aim to determine whether an a priori defined set of genes shows statistically significant differences in expression on either categorical or continuous outcomes. Although many methods for gene set analysis have been proposed, a systematic analysis tool for identification of different types of gene set significance modules has not been developed previously. This work presents an R package, called MAVTgsa, which includes three different methods for integrated gene set enrichment analysis. (1) The one-sided OLS (ordinary least squares) test detects coordinated changes of genes in a gene set in one direction, either up- or downregulation. (2) The two-sided MANOVA (multivariate analysis of variance) detects changes in both directions for studying two or more experimental conditions. (3) A random-forests-based procedure identifies gene sets that can accurately predict samples from different experimental conditions or that are associated with continuous phenotypes. MAVTgsa computes the P values and FDR (false discovery rate) q-values for all gene sets in the study. Furthermore, MAVTgsa provides several visualization outputs to support and interpret the enrichment results. This package is available online.
Csányi, V; Gervai, J
1985-01-01
Passive dark avoidance conditioning and effects of the presence and absence of a fish-like dummy on the training process were studied in four inbred strains of paradise fish. Strain differences were found in the shuttle activity during habituation trials, and in the sensitivity to the mild electric shock punishment. The presence or absence of the dummy in the punished dark side of the shuttle box had a genotype-dependent effect on the measures taken during the conditioning process. The statistical analysis of the learning curves revealed differences in the way the strains varied in the different environments, i.e. genotype--environment interaction components of variances were identified. The results are discussed in the light of previous investigations and their implication in further genetic analysis.
Zbilut, Joseph P.; Colosimo, Alfredo; Conti, Filippo; Colafranceschi, Mauro; Manetti, Cesare; Valerio, MariaCristina; Webber, Charles L.; Giuliani, Alessandro
2003-01-01
The problem of protein folding vs. aggregation was investigated in acylphosphatase and the amyloid protein Aβ(1–40) by means of nonlinear signal analysis of their chain hydrophobicity. Numerical descriptors of recurrence patterns provided the basis for statistical evaluation of folding/aggregation distinctive features. Static and dynamic approaches were used to elucidate conditions coincident with folding vs. aggregation using comparisons with known protein secondary structure classifications, site-directed mutagenesis studies of acylphosphatase, and molecular dynamics simulations of the amyloid protein Aβ(1–40). The results suggest that a feature derived from principal component space, characterized by the smoothness of singular, deterministic hydrophobicity patches, plays a significant role in the conditions governing protein aggregation.
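The recurrence descriptors used here derive from a thresholded distance matrix of the hydrophobicity series; the R sketch below computes the simplest of them, the recurrence rate, on a random stand-in sequence (embedding dimension 1 for brevity, whereas recurrence quantification analysis normally embeds the series first).

```r
# Recurrence-rate sketch on a hydrophobicity-like series (random stand-in).
set.seed(6)
hyd <- rnorm(100)                       # residue hydrophobicity sequence
D <- as.matrix(dist(hyd))               # pairwise distances
R <- D < 0.5 * sd(hyd)                  # recurrence matrix at radius eps
mean(R[upper.tri(R)])                   # recurrence rate (%REC / 100)
```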
Effect of non-normality on test statistics for one-way independent groups designs.
Cribbie, Robert A; Fiksenbaum, Lisa; Keselman, H J; Wilcox, Rand R
2012-02-01
The data obtained from one-way independent groups designs is typically non-normal in form and rarely equally variable across treatment populations (i.e., population variances are heterogeneous). Consequently, the classical test statistic that is used to assess statistical significance (i.e., the analysis of variance F test) typically provides invalid results (e.g., too many Type I errors, reduced power). For this reason, there has been considerable interest in finding a test statistic that is appropriate under conditions of non-normality and variance heterogeneity. Previously recommended procedures for analysing such data include the James test, the Welch test applied either to the usual least squares estimators of central tendency and variability, or the Welch test with robust estimators (i.e., trimmed means and Winsorized variances). A new statistic proposed by Krishnamoorthy, Lu, and Mathew, intended to deal with heterogeneous variances, though not non-normality, uses a parametric bootstrap procedure. In their investigation of the parametric bootstrap test, the authors examined its operating characteristics under limited conditions and did not compare it to the Welch test based on robust estimators. Thus, we investigated how the parametric bootstrap procedure and a modified parametric bootstrap procedure based on trimmed means perform relative to previously recommended procedures when data are non-normal and heterogeneous. The results indicated that the tests based on trimmed means offer the best Type I error control and power when variances are unequal and at least some of the distribution shapes are non-normal. © 2011 The British Psychological Society.
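In R, the classical F test and the Welch test are both available through oneway.test, and a parametric bootstrap of the Welch statistic takes only a few more lines; the sketch below uses simulated heteroscedastic groups, not the paper's simulation design.

```r
# Welch test vs classical F, plus a small parametric bootstrap (simulated data).
set.seed(8)
g <- factor(rep(1:3, each = 20))
y <- c(rnorm(20, 0, 1), rnorm(20, 0, 3), rnorm(20, 0.5, 6))
oneway.test(y ~ g, var.equal = TRUE)$p.value     # classical ANOVA F
oneway.test(y ~ g, var.equal = FALSE)$p.value    # Welch test
obs <- oneway.test(y ~ g, var.equal = FALSE)$statistic
boot <- replicate(2000, {
  # resample each group under H0: common mean, group-specific variances
  yb <- unlist(lapply(split(y, g), function(v) rnorm(length(v), mean(y), sd(v))))
  oneway.test(yb ~ g, var.equal = FALSE)$statistic
})
mean(boot >= obs)                                # parametric bootstrap p-value
```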
Brunelli, Elvira; Talarico, Erminia; Corapi, Barbara; Perrotta, Ida; Tripepi, Sandro
2008-10-01
We analysed the morphology and ultrastructure of the gill apparatus of the ornate wrasse, Thalassoma pavo, under normal conditions and after exposure to a sublethal concentration of sodium lauryl sulphate (3.5 mg/l, one-third of the 96-h LC99 value). To identify the biochemical mechanisms affected by this pollutant, we evaluated and compared the localisation of Na+/K+-ATPase in normal and experimental conditions. Immunocytochemical analysis revealed that this enzyme was active in the chloride cells (CCs), which were distributed in clusters in the interlamellar region of the filament. Ultrastructural analysis revealed conspicuous alterations of the epithelium after 96 and 192 h of exposure to sodium lauryl sulphate: structural features of the surface cells were lost, the appearance of intercellular lacunae changed, and cellular degeneration occurred. Statistical analysis comparing the number and dimensions of CCs in normal conditions and after 96 h of exposure showed that the CC area decreased after exposure to the detergent.
Network Analysis of Rodent Transcriptomes in Spaceflight
NASA Technical Reports Server (NTRS)
Ramachandran, Maya; Fogle, Homer; Costes, Sylvain
2017-01-01
Network analysis methods leverage prior knowledge of cellular systems and the statistical and conceptual relationships between analyte measurements to determine gene connectivity. Correlation and conditional metrics are used to infer a network topology and provide a systems-level context for cellular responses. Integration across multiple experimental conditions and omics domains can reveal the regulatory mechanisms that underlie gene expression. GeneLab has assembled rich multi-omic (transcriptomics, proteomics, epigenomics, and epitranscriptomics) datasets for multiple murine tissues from the Rodent Research 1 (RR-1) experiment. RR-1 assesses the impact of 37 days of spaceflight on gene expression across a variety of tissue types, such as adrenal glands, quadriceps, gastrocnemius, tibialis anterior, extensor digitorum longus, soleus, eye, and kidney. Network analysis is particularly useful for RR-1 -omics datasets because it reinforces subtle relationships that may be overlooked in isolated analyses and subdues confounding factors. Our objective is to use network analysis to determine potential target nodes for therapeutic intervention and identify similarities with existing disease models. Multiple network algorithms are used for a higher confidence consensus.
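A correlation-based version of such a network can be sketched quickly: threshold the absolute gene-gene correlation matrix into an adjacency matrix and inspect node degree; the expression matrix below is simulated, not RR-1 data.

```r
# Correlation-network sketch over genes x samples (simulated expression).
set.seed(10)
expr <- matrix(rnorm(50 * 12), nrow = 50)   # 50 genes, 12 samples
adj <- abs(cor(t(expr))) > 0.8              # adjacency by correlation threshold
diag(adj) <- FALSE
rowSums(adj)                                # node degree: candidate hub genes
```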
Strer, Maximilian; Svoboda, Nikolai; Herrmann, Antje
2018-01-01
Understanding the abundance of adverse environmental conditions (e.g., frost, drought, and heat) during critical crop growth stages, which is assumed to be altered by climate change, is crucial for an accurate risk assessment for cropping systems. While a lengthening of the vegetation period may be beneficial, higher frequencies of heat or frost events and drought spells are generally regarded as harmful. The objective of the present study was to quantify shifts in maize and wheat phenology and the occurrence of adverse environmental conditions during critical growth stages for four regions located in the North German Plain. First, a statistical analysis of phenological development was conducted based on recent data (1981-2010). Next, these data were used to calibrate the DSSAT-CERES wheat and maize models, which were then used to run three climate projections representing the maximum, intermediate and minimum courses of climate development within the RCP 8.5 continuum during the years 2021-2050. By means of model simulation runs and statistical analysis, the climate data were evaluated for the abundance of adverse environmental conditions during critical development stages, i.e., the stages of early crop development, anthesis, sowing, and harvest. Proxies for adverse environmental conditions included thresholds of low and high temperatures as well as soil moisture. The comparison of the baseline climate and future climate projections showed a significant increase in the abundance of adverse environmental conditions during critical growth stages in the future. The lengthening of the vegetation period in spring did not compensate for the increased abundance of high temperatures, e.g., during anthesis. The results of this study indicate the need to develop adaptation strategies, such as implementing changes in cropping calendars. An increase in frost risk during early development, however, reveals the limited feasibility of early sowing as a mitigation strategy. In addition, the abundance of low soil water contents that hamper important production processes such as sowing and harvest was found to increase locally.
Surface Landing Site Weather Analysis for Constellation Program
NASA Technical Reports Server (NTRS)
Altino, Karen M.; Burns, K. Lee
2008-01-01
Weather information is an important asset for NASA's Constellation Program in developing the next-generation space transportation system to fly to the International Space Station, the Moon and, eventually, Mars. Weather conditions can affect vehicle safety and performance during multiple mission phases ranging from pre-launch ground processing to landing and recovery operations, including all potential abort scenarios. Meteorological analysis is an important contributor, not only to the development and verification of system design requirements but also to mission planning and active ground operations. Of particular interest are the surface atmospheric conditions at both nominal and abort landing sites for the manned Orion capsule. Weather parameters such as wind, rain, and fog all play critical roles in the safe landing of the vehicle and subsequent crew and vehicle recovery. The Marshall Space Flight Center Natural Environments Branch has been tasked by the Constellation Program with defining the natural environments at potential landing zones. Climatological time series of operational surface weather observations are used to calculate probabilities of occurrence for various sets of hypothetical vehicle constraint thresholds. Data are available for numerous geographical locations, such that statistical analysis can be performed for single sites as well as multiple-site network configurations. Results provide statistical descriptions of how often certain weather conditions are observed at the site(s) and the percentage of time that specified criteria thresholds are matched or exceeded. Outputs are tabulated by month and hour of day to show both seasonal and diurnal variation. This paper describes the methodology used for data collection and quality control, details the types of analyses performed, and provides a sample of the results that can be obtained.
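The exceedance climatology described here is essentially a grouped mean of a constraint-violation indicator; a base-R sketch with simulated observations and a hypothetical 10 m/s wind constraint:

```r
# Percentage of observations exceeding a wind constraint, by month (simulated).
set.seed(11)
obs <- data.frame(month = sample(1:12, 5000, TRUE),
                  hour  = sample(0:23, 5000, TRUE),
                  wind  = rweibull(5000, 2, 6))            # stand-in wind speeds, m/s
limit <- 10                                                # hypothetical constraint
round(100 * tapply(obs$wind > limit, obs$month, mean), 1)  # % exceedance per month
```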
Miller, Nathan D; Durham Brooks, Tessa L; Assadi, Amir H; Spalding, Edgar P
2010-10-01
Gene disruption frequently produces no phenotype in the model plant Arabidopsis thaliana, complicating studies of gene function. Functional redundancy between gene family members is one common explanation but inadequate detection methods could also be responsible. Here, newly developed methods for automated capture and processing of time series of images, followed by computational analysis employing modified linear discriminant analysis (LDA) and wavelet-based differentiation, were employed in a study of mutants lacking the Glutamate Receptor-Like 3.3 gene. Root gravitropism was selected as the process to study with high spatiotemporal resolution because the ligand-gated Ca(2+)-permeable channel encoded by GLR3.3 may contribute to the ion fluxes associated with gravity signal transduction in roots. Time series of root tip angles were collected from wild type and two different glr3.3 mutants across a grid of seed-size and seedling-age conditions previously found to be important to gravitropism. Statistical tests of average responses detected no significant difference between populations, but LDA separated both mutant alleles from the wild type. After projecting the data onto LDA solution vectors, glr3.3 mutants displayed greater population variance than the wild type in all four conditions. In three conditions the projection means also differed significantly between mutant and wild type. Wavelet analysis of the raw response curves showed that the LDA-detected phenotypes related to an early deceleration and subsequent slower-bending phase in glr3.3 mutants. These statistically significant, heritable, computation-based phenotypes generated insight into functions of GLR3.3 in gravitropism. The methods could be generally applicable to the study of phenotypes and therefore gene function.
Scannapieco, Frank A; Ho, Alex W; DiTolla, Maris; Chen, Casey; Dentino, Andrew R
2004-03-01
To determine if the prevalence of respiratory disease among dental students and dental residents varies with their exposure to the clinical dental environment. A detailed questionnaire was administered to 817 students at 3 dental schools. The questionnaire sought information concerning demographic characteristics, school year, exposure to the dental environment and dental procedures, and history of respiratory disease. The data obtained were subjected to bivariate and multiple logistic regression analysis. Respondents reported experiencing the following respiratory conditions during the previous year: asthma (26 cases), bronchitis (11 cases), chronic lung disease (6 cases), pneumonia (5 cases) and streptococcal pharyngitis (50 cases). Bivariate statistical analyses indicated no significant associations between the prevalence of any of the respiratory conditions and year in dental school, except for asthma, for which there was a significantly higher prevalence at 1 school compared to the other 2 schools. When all cases of respiratory disease were combined as a composite variable and subjected to multivariate logistic regression analysis controlling for age, sex, race, dental school, smoking history and alcohol consumption, no statistically significant association was observed between respiratory condition and year in dental school or exposure to the dental environment as a dental patient. No association was found between the prevalence of respiratory disease and a student's year in dental school or previous exposure to the dental environment as a patient. These results suggest that exposure to the dental environment does not increase the risk for respiratory infection in healthy dental health care workers.
In Situ and In Vitro Effects of Two Bleaching Treatments on Human Enamel Hardness.
Henn-Donassollo, Sandrina; Fabris, Cristiane; Gagiolla, Morgana; Kerber, Ícaro; Caetano, Vinícius; Carboni, Vitor; Salas, Mabel Miluska Suca; Donassollo, Tiago Aurélio; Demarco, Flávio Fernando
2016-01-01
The aim of this study was to evaluate in vitro and in situ the effects of two bleaching treatments on human enamel surface microhardness. Sixty enamel slabs from recently extracted thirty molars were used. The specimens were polished with sandpapers under water-cooling. The enamel samples were randomly divided in four groups, treated with 10% hydrogen peroxide (HP) or Whitening Strips (WS) containing 10% hydrogen peroxide and using two conditions: in vitro or in situ model. For in situ condition, six volunteers wore an intra-oral appliance containing enamel slabs, while for in vitro condition the specimens were kept in deionized water after the bleaching protocols. The bleaching treatments were applied one-hour daily for 14 days. Similar amounts of bleaching agents were used in both conditions. Before and after bleaching treatments, microhardness was measured. Statistical analysis (ANOVA and Tukey test) showed that in the in situ condition there was no statistically significant microhardness reduction in the bleached enamel (p>0.05). Significant decrease in hardness was observed for enamel slabs bleached with both treatments in the in vitro condition (p<0.05). Regarding the bleaching agents, in situ results showed no difference between HP and WS, while in vitro WS produced the lowest hardness value. It could be concluded that there was no deleterious effect on enamel produced by any of the bleaching protocols used in the in situ model. The reduction of hardness was only observed in vitro.
Ihira, Hikaru; Makizako, Hyuma; Mizumoto, Atsushi; Makino, Keitarou; Matsuyama, Kiyoji; Furuna, Taketo
2016-01-01
In dual-task situations, postural control is closely associated with attentional cost. Previous studies have reported age-related differences between attentional cost and postural control, but little is known about this association in conditions involving a one-legged standing posture. The purpose of this study was to determine age-related differences in postural control and attentional cost while performing tasks at various difficulty levels in a one-legged standing posture. In total, 29 healthy older adults aged 64 to 78 years [15 males, 14 females, mean (SD) = 71.0 (3.8) years] and 29 healthy young adults aged 20 to 26 years [14 males, 15 females, mean (SD) = 22.5 (1.5) years] participated in this study. We measured reaction time, trunk accelerations, and lower limb muscle activity under 3 different one-legged standing conditions: on a firm surface, on a soft surface with a urethane mat, and on a softer, more unstable surface with 2 piled urethane mats. Reaction time, as an indication of attentional cost, was measured by pressing a handheld button as quickly as possible in response to an auditory stimulus. A 2-way repeated-measures analysis of variance was performed to examine the differences between the 3 task conditions and the 2 age groups for each outcome. Trunk accelerations showed a statistically significant group-by-condition interaction in the anteroposterior (F = 9.1, P < .05), mediolateral (F = 9.9, P < .05), and vertical (F = 9.3, P < .05) directions. Muscle activity did not show a statistically significant group-by-condition interaction, but there was a significant main effect of condition in the tibialis anterior muscle (F = 33.1, P < .01) and medial gastrocnemius muscle (F = 14.7, P < .01) in young adults and in the tibialis anterior muscle (F = 24.8, P < .01) and medial gastrocnemius muscle (F = 10.8, P < .01) in older adults. In addition, there was a statistically significant group-by-condition interaction in reaction time (F = 8.2, P < .05). The study results confirmed that reaction times in older adults are more prolonged than in young adults under the same challenging postural control conditions.
The Condition of Education, 1990. Volume 2: Postsecondary Education.
ERIC Educational Resources Information Center
Alsalam, Nabeel, Ed.; Rogers, Gayle Thompson, Ed.
The National Center for Education Statistics' annual statistical report on the condition of education in the United States is presented in two volumes for 1990. This volume covers postsecondary education, while the first volume addresses elementary and secondary education. Condition of education indicators (CEIs)--key data that measure the health…
Thompson, Cheryl Bagley
2009-01-01
This 13th article of the Basics of Research series is first in a short series on statistical analysis. These articles will discuss creating your statistical analysis plan, levels of measurement, descriptive statistics, probability theory, inferential statistics, and general considerations for interpretation of the results of a statistical analysis.
Statistical learning of action: the role of conditional probability.
Meyer, Meredith; Baldwin, Dare
2011-12-01
Identification of distinct units within a continuous flow of human action is fundamental to action processing. Such segmentation may rest in part on statistical learning. In a series of four experiments, we examined what types of statistics people can use to segment a continuous stream involving many brief, goal-directed action elements. The results of Experiment 1 showed no evidence for sensitivity to conditional probability, whereas Experiment 2 displayed learning based on joint probability. In Experiment 3, we demonstrated that additional exposure to the input failed to engender sensitivity to conditional probability. However, the results of Experiment 4 showed that a subset of adults-namely, those more successful at identifying actions that had been seen more frequently than comparison sequences-were also successful at learning conditional-probability statistics. These experiments help to clarify the mechanisms subserving processing of intentional action, and they highlight important differences from, as well as similarities to, prior studies of statistical learning in other domains, including language.
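The contrast the experiments turn on, joint versus conditional (transitional) probability of adjacent elements, can be made concrete with a small sketch; the toy sequence below is ours, not the authors' stimuli.

```python
# Minimal illustration (not the authors' code) of the two statistics
# contrasted above: joint probability P(A, B) versus conditional/transitional
# probability P(B | A) for adjacent elements in a continuous stream.
from collections import Counter

sequence = list("abxabyabzcbx" * 10)       # toy action-element stream
pairs = list(zip(sequence, sequence[1:]))

pair_counts = Counter(pairs)
first_counts = Counter(sequence[:-1])
total_pairs = len(pairs)

def joint_probability(a, b):
    return pair_counts[(a, b)] / total_pairs

def conditional_probability(a, b):         # transitional probability
    return pair_counts[(a, b)] / first_counts[a]

print(joint_probability("a", "b"), conditional_probability("a", "b"))
```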
Inferring Small Scale Dynamics from Aircraft Measurements of Tracers
NASA Technical Reports Server (NTRS)
Sparling, L. C.; Einaudi, Franco (Technical Monitor)
2000-01-01
The millions of ER-2 and DC-8 aircraft measurements of long-lived tracers in the Upper Troposphere/Lower Stratosphere (UT/LS) hold enormous potential as a source of statistical information about subgrid scale dynamics. Extracting this information however can be extremely difficult because the measurements are made along a 1-D transect through fields that are highly anisotropic in all three dimensions. Some of the challenges and limitations posed by both the instrumentation and platform are illustrated within the context of the problem of using the data to obtain an estimate of the dissipation scale. This presentation will also include some tutorial remarks about the conditional and two-point statistics used in the analysis.
Di Lorenzo, Rosaria; Baraldi, Sara; Ferrara, Maria; Mimmi, Stefano; Rigatelli, Marco
2012-04-01
To analyze physical restraint use in an Italian acute psychiatric ward, where mechanical restraint by belt is highly discouraged but allowed. Data were retrospectively collected from medical and nursing charts, from January 1, 2005, to December 31, 2008. The physical restraint rate and the relationships between restraints and selected variables were statistically analyzed. Restraints were statistically significantly more frequent in compulsory or voluntary admissions of patients with an altered state of consciousness, at night, to control aggressive behavior, and in patients with "Schizophrenia and other Psychotic Disorders" during the first 72 hr of hospitalization. Analysis of the clinical and organizational factors conditioning restraint may help limit its use. © 2011 Wiley Periodicals, Inc.
Souto, R Seoane; Martín-Rodero, A; Yeyati, A Levy
2016-12-23
We analyze the quantum quench dynamics in the formation of a phase-biased superconducting nanojunction. We find that in the absence of an external relaxation mechanism and for very general conditions the system gets trapped in a metastable state, corresponding to a nonequilibrium population of the Andreev bound states. The use of the time-dependent full counting statistics analysis allows us to extract information on the asymptotic population of even and odd many-body states, demonstrating that a universal behavior, dependent only on the Andreev state energy, is reached in the quantum point contact limit. These results shed light on recent experimental observations on quasiparticle trapping in superconducting atomic contacts.
The extraction and integration framework: a two-process account of statistical learning.
Thiessen, Erik D; Kronstein, Alexandra T; Hufnagle, Daniel G
2013-07-01
The term statistical learning in infancy research originally referred to sensitivity to transitional probabilities. Subsequent research has demonstrated that statistical learning contributes to infant development in a wide array of domains. The range of statistical learning phenomena necessitates a broader view of the processes underlying statistical learning. Learners are sensitive to a much wider range of statistical information than the conditional relations indexed by transitional probabilities, including distributional and cue-based statistics. We propose a novel framework that unifies learning about all of these kinds of statistical structure. From our perspective, learning about conditional relations outputs discrete representations (such as words). Integration across these discrete representations yields sensitivity to cues and distributional information. To achieve sensitivity to all of these kinds of statistical structure, our framework combines processes that extract segments of the input with processes that compare across these extracted items. In this framework, the items extracted from the input serve as exemplars in long-term memory. The similarity structure of those exemplars in long-term memory leads to the discovery of cues and categorical structure, which guides subsequent extraction. The extraction and integration framework provides a way to explain sensitivity to both conditional statistical structure (such as transitional probabilities) and distributional statistical structure (such as item frequency and variability), and also a framework for thinking about how these different aspects of statistical learning influence each other. 2013 APA, all rights reserved
The Plasma Sheet as Natural Symmetry Plane for Dipolarization Fronts in the Earth's Magnetotail
NASA Astrophysics Data System (ADS)
Frühauff, D.; Glassmeier, K.-H.
2017-11-01
In this work, observations from the multispacecraft Time History of Events and Macroscale Interactions during Substorms (THEMIS) mission are used for a statistical investigation of dipolarization fronts in the near-Earth plasma sheet of the magnetotail. Using very stringent criteria, 460 events are detected in almost 10 years of mission data. Minimum variance analysis is used to determine the normal directions of the phase fronts, providing evidence for the existence of a natural symmetry of these phenomena, given by the neutral sheet of the magnetotail. This finding enables the definition of a local coordinate system based on the Tsyganenko model, reflecting the intrinsic orientation of the neutral sheet and, therefore, of the dipolarization fronts. In this way, the comparison of events with very different background conditions is improved. Through this study, the statistical results of Liu, Angelopoulos, Runov, et al. (2013) are both confirmed and extended. In a case study, knowledge of this plane of symmetry helps to explain the concave curvature of dipolarization fronts in the XZ plane through the phase propagation speeds of magnetoacoustic waves. A second case study is presented to determine the central current system of a passing dipolarization front through a constellation of three spacecraft. With this information, a statistical analysis of spacecraft observations above and below the neutral sheet provides further evidence for the neutral sheet as the symmetry plane and for the central current system. Furthermore, it is shown that the signatures of dipolarization fronts are, under certain conditions, closely related to those of flux ropes, indicating a possible relationship between these two transient phenomena.
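The minimum variance analysis step mentioned above has a compact numerical form: eigen-decompose the field covariance matrix and take the eigenvector of the smallest eigenvalue as the normal estimate. A hedged sketch on synthetic field data:

```python
# Sketch of minimum variance analysis (MVA) for estimating front normals:
# the eigenvector of the smallest eigenvalue of the magnetic-field covariance
# matrix approximates the normal direction. Field data here are synthetic.
import numpy as np

rng = np.random.default_rng(1)
B = rng.normal(size=(500, 3)) * np.array([5.0, 2.0, 0.3])   # toy field series, nT

M = np.cov(B, rowvar=False)            # 3x3 magnetic variance matrix
eigvals, eigvecs = np.linalg.eigh(M)   # eigenvalues in ascending order
normal = eigvecs[:, 0]                 # minimum-variance direction
quality = eigvals[1] / eigvals[0]      # intermediate-to-minimum eigenvalue ratio

print("normal ~", normal.round(3), "eigenvalue ratio ~", round(quality, 1))
```

A large intermediate-to-minimum eigenvalue ratio is the usual check that the normal is well determined.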
NASA Technical Reports Server (NTRS)
Cull, R. C.; Eltimsahy, A. H.
1983-01-01
The present investigation is concerned with the formulation of energy management strategies for stand-alone photovoltaic (PV) systems, taking into account a basic control algorithm for a possible predictive (and adaptive) controller. The control system controls the flow of energy in the system according to the amount of energy available, and predicts the appropriate control set-points based on the energy (insolation) available by using an appropriate system model. Aspects of adaptation to the conditions of the system are also considered. Attention is given to a statistical analysis technique, the analysis inputs, the analysis procedure, and details regarding the basic control algorithm.
Scenario based optimization of a container vessel with respect to its projected operating conditions
NASA Astrophysics Data System (ADS)
Wagner, Jonas; Binkowski, Eva; Bronsart, Robert
2014-06-01
In this paper the scenario-based optimization of the bulbous bow of the KRISO Container Ship (KCS) is presented. The optimization of the parametrically modeled vessel is based on a statistically developed operational profile, generated from noon-to-noon reports of a comparable 3600 TEU container vessel, and on development functions representing the growth of the global economy during the vessel's service time. To account for uncertainties, statistical fluctuations are added. An analysis of these data leads to a number of most probable operating conditions (OCs) the vessel will encounter in the future. According to their respective likelihoods, an objective function for the evaluation of the optimal design variant of the vessel is derived and implemented within the parametric optimization workbench FRIENDSHIP Framework. The evaluation is carried out with respect to the vessel's calculated effective power, computed using a potential flow code. The evaluation shows that the use of scenarios within the optimization process has a strong influence on the hull form.
Rauen, Michelle Soares; Moreira, Emília Addison Machado; Calvo, Maria Cristina Marino; Lobo, Adriana Soares
2006-07-01
The objective of this study was to identify the relationship between the oral condition and nutritional status of all institutionalized elderly people in Florianópolis, Brazil. Of the population of 232 institutionalized individuals, the sample consisted of 187 elderly people. In the oral evaluation, the criterion used was the number of functional units present in the oral cavity, classifying the participants as those with highly compromised dentition (48%) and those with less-compromised dentition (52%). Diagnosis of nutritional status was carried out according to body mass index, with an observed prevalence of 14% thin, 45% eutrophic, 28% overweight, and 13% obese. Statistical analysis of the variables studied was carried out by means of χ² association tests. There was a statistically significant association between highly compromised dentition and thinness (P=0.007) and between less-compromised dentition and overweight, including obesity (P=0.014). It was concluded that compromised dentition could contribute to a tendency toward inadequate nutritional status.
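A minimal version of the χ² association test used above (dentition category versus nutritional status) looks like this; the contingency table is invented for illustration.

```python
# Toy chi-squared association test in the spirit of the study above;
# the counts are fabricated, not the study's data.
from scipy.stats import chi2_contingency

# rows: highly vs. less compromised dentition
# columns: thin, eutrophic, overweight (incl. obese)
table = [[18, 40, 32],
         [ 8, 44, 45]]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, dof={dof}, p={p:.4f}")
```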
NASA Astrophysics Data System (ADS)
Yuksel, Kivanc; Chang, Xin; Skarbek, Władysław
2017-08-01
A novel smile recognition algorithm is presented, based on the extraction of 68 facial salient points (fp68) using an ensemble of regression trees. The smile detector exploits a linear Support Vector Machine model. It is trained with a few hundred exemplar images by the SVM algorithm working in a 136-dimensional space. Strict statistical data analysis shows that such a geometric detector strongly depends on the geometry of the mouth-opening area, measured by triangulation of the outer lip contour. To this end, two Bayesian detectors were developed and compared with the SVM detector. The first uses the mouth area in the 2D image, while the second refers to the mouth area in a 3D animated face model. The 3D modeling is based on the Candide-3 model and is performed in real time along with the three smile detectors and statistics estimators. The mouth-area/Bayesian detectors exhibit high correlation with the fp68/SVM detector, in the range [0.8, 1.0], depending mainly on lighting conditions and individual features, with an advantage for the 3D technique, especially in hard lighting conditions.
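A hedged sketch of the fp68/SVM idea, a linear SVM over 136 flattened landmark coordinates, is shown below; the landmark features and labels are random stand-ins rather than real fp68 output.

```python
# Sketch of a linear-SVM smile detector over flattened 68-point landmark
# coordinates (136 features). A real detector would use dlib-style fp68
# landmarks; here the features are random and labels follow a toy rule.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(2)
X = rng.normal(size=(400, 136))            # 400 faces x (68 points * 2 coords)
y = (X[:, 0] + X[:, 1] > 0).astype(int)    # toy label rule: 1 = smile

clf = make_pipeline(StandardScaler(), LinearSVC(C=1.0))
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))
```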
The Sternberg Task as a Workload Metric in Flight Handling Qualities Research
NASA Technical Reports Server (NTRS)
Hemingway, J. C.
1984-01-01
The objective of this research was to determine whether the Sternberg item-recognition task, employed as a secondary task measure of spare mental capacity for flight handling qualities (FHQ) simulation research, could help to differentiate between different flight-control conditions. FHQ evaluations were conducted on the Vertical Motion Simulator at Ames Research Center to investigate different primary flight-control configurations, and selected stability and control augmentation levels for helicopters engaged in low-level flight regimes. The Sternberg task was superimposed upon the primary flight-control task in a balanced experimental design. The results of parametric statistical analysis of Sternberg secondary task data failed to support the continued use of this task as a measure of pilot workload. In addition to the secondary task, subjects provided Cooper-Harper pilot ratings (CHPR) and responded to a workload questionnaire. The CHPR data also failed to provide reliable statistical discrimination between FHQ treatment conditions; some insight into the behavior of the secondary task was gained from the workload questionnaire data.
The analysis of influence of individual and environmental factors on 2-wheeled users' injuries.
Marković, Nenad; Pešić, Dalibor R; Antić, Boris; Vujanić, Milan
2016-08-17
Powered 2-wheeled motor vehicles (PTWs) are one of the most vulnerable categories of road users. Bearing that fact in mind, we researched the effects of individual and environmental factors on the severity and type of injuries of PTW users. The aim was to recognize the circumstances that cause these accidents and to take preventive actions that would improve the level of road safety for PTWs. For the period from 2001 to 2010, an analysis of 139 road accidents involving PTWs was made by the Faculty of Transport and Traffic Engineering in Belgrade. The effects of both individual factors (age, gender, etc.) and environmental factors (place of accident, time of day, etc.) on the cause of accidents and on the severity and type of injuries of PTW users are reported in this article. Analyses of these effects were conducted using logistic regression, chi-square tests, and Pearson's correlation. Factors such as category of road user, pavement conditions, place of accident, age, and time of day have a statistically significant effect on PTW injuries, whereas other factors (gender; road type, that is, straight or curved) do not. The article also describes the interdependence of the occurrence of particular injuries at certain speeds. The results show that when PTW users died of a head injury, these injuries were usually concurrent with chest injuries, injuries to internal organs, and limb injuries. Individual factors showed a high degree of influence on the occurrence of accidents involving 2-wheelers (PTWs/bicycles), but without a statistically significant relation. Establishing the existence of such relationships enables identifying and defining the factors that have an impact on the occurrence of traffic accidents involving bicyclists or PTWs. Such a link between individual factors and the occurrence of accidents makes it possible for system managers to take appropriate actions aimed at certain categories of 2-wheelers in order to reduce casualties in a particular area. The analysis showed that most road factors do not have a statistically significant effect on either category of 2-wheeler; namely, the logistic regression analysis showed a statistically significant effect of the place of accident on the occurrence of accidents involving bicyclists.
Zhao, Jianping; Avula, Bharathi; Chan, Michael; Clément, Céline; Kreuzer, Michael; Khan, Ikhlas A
2012-01-01
To gain insight into the effects of color type, cultivation history, and growing site on compositional alterations of maca (Lepidium meyenii Walpers) hypocotyls, NMR profiling combined with chemometric analysis was applied to investigate the metabolite variability in different maca accessions. Maca hypocotyls of different colors (yellow, pink, violet, and lead-colored) cultivated at different geographic sites and in different areas were examined for differences in metabolite expression. Differentiation of the maca accessions grown under the different cultivation conditions was determined by principal component analyses (PCAs) performed on datasets derived from their ¹H NMR spectra. A total of 16 metabolites were identified by NMR analysis, and the changes in metabolite levels in relation to the color types and growing conditions of maca hypocotyls were evaluated using univariate statistical analysis. In addition, the changes in the correlation pattern among the metabolites identified in the maca accessions planted at the two different sites were examined. The results from both multivariate and univariate analysis indicated that the planting site was the major determining factor with regard to metabolite variation in maca hypocotyls, while the color of the maca accession seems to be of minor importance in this respect. © Georg Thieme Verlag KG Stuttgart · New York.
Streamwise evolution of statistical events and the triple correlation in a model wind turbine array
NASA Astrophysics Data System (ADS)
Viestenz, Kyle; Cal, Raúl Bayoán
2013-11-01
Hot-wire anemometry data, obtained from a wind tunnel experiment containing a 3 × 3 wind turbine array, are used to conditionally average the Reynolds stresses. Nine profiles at the centerline behind the array are analyzed to characterize the turbulent velocity statistics of the wake flow. Quadrant analysis yields the statistical events occurring in the wake of the wind farm, where quadrants 2 and 4 produce ejections and sweeps, respectively. The balance between these quadrants is expressed via the ΔS0 parameter, which attains a maximum value at the bottom tip and changes sign near the top tip of the rotor. These are then associated with the triple correlation term present in the turbulent kinetic energy equation of the fluctuations. The development of these various quantities is assessed in light of wake remediation and energy transport, and possesses significance for closure models. National Science Foundation: ECCS-1032647.
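The quadrant analysis named above can be sketched in a few lines: classify (u', v') samples into quadrants and compare the ejection (Q2) and sweep (Q4) contributions to the Reynolds shear stress. The data and the ΔS0 sign convention below are our assumptions.

```python
# Illustrative quadrant analysis (not the authors' code): classify velocity
# fluctuations (u', v') into quadrants and compare ejection (Q2) and sweep
# (Q4) contributions to the Reynolds shear stress.
import numpy as np

rng = np.random.default_rng(3)
u = rng.normal(size=20000)                          # streamwise fluctuation u'
v = -0.3 * u + rng.normal(scale=0.9, size=u.size)   # wall-normal fluctuation v'

uv = u * v
q2 = (u < 0) & (v > 0)                              # ejections
q4 = (u > 0) & (v < 0)                              # sweeps

S2 = uv[q2].sum() / uv.sum()                        # fractional Q2 contribution
S4 = uv[q4].sum() / uv.sum()                        # fractional Q4 contribution
print("ejection fraction:", round(S2, 2), "sweep fraction:", round(S4, 2),
      "delta S0 ~", round(S2 - S4, 2))              # sign convention assumed
```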
The log-periodic-AR(1)-GARCH(1,1) model for financial crashes
NASA Astrophysics Data System (ADS)
Gazola, L.; Fernandes, C.; Pizzinga, A.; Riera, R.
2008-02-01
This paper responds to recent calls within the econophysics literature for more rigorous statistical methodology. To this end, we consider an econometric approach to investigate the outcomes of the log-periodic model of price movements, which has been widely used to forecast financial crashes. In order to accomplish reliable statistical inference for the unknown parameters, we incorporate an autoregressive dynamic and a conditional heteroskedasticity structure in the error term of the original model, yielding the log-periodic-AR(1)-GARCH(1,1) model. Both the original and the extended models are fitted to financial indices of the U.S. market, namely the S&P500 and the NASDAQ. Our analysis reveals two main points: (i) the log-periodic-AR(1)-GARCH(1,1) model has residuals with better statistical properties, and (ii) the estimation of the parameter concerning the time of the financial crash is improved.
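A sketch of the two-step idea, fitting a log-periodic trend and then an AR(1)-GARCH(1,1) error model, is given below using scipy and the arch package; the series is simulated from the log-periodic form itself so the fit converges, and the parameterisation follows the usual Sornette-style notation rather than the paper's exact specification.

```python
# Two-step sketch: fit a log-periodic trend to a (simulated) log-price series,
# then model the residuals with AR(1)-GARCH(1,1). Parameter names and values
# are illustrative assumptions, not the paper's estimates.
import numpy as np
from scipy.optimize import curve_fit
from arch import arch_model

def log_periodic(t, A, B, tc, m, C, omega, phi):
    dt = tc - t
    return A + B * dt**m * (1.0 + C * np.cos(omega * np.log(dt) + phi))

t = np.linspace(0, 900, 900)
true = (5.0, -0.5, 1000.0, 0.5, 0.1, 8.0, 1.0)
rng = np.random.default_rng(4)
y = log_periodic(t, *true) + 0.02 * rng.normal(size=t.size)

p0 = (5.0, -0.4, 980.0, 0.5, 0.1, 8.0, 0.5)          # starting guesses
bounds = ([-10, -5, 905, 0.1, -1, 1, -3.2],
          [ 10,  5, 1200, 0.9, 1, 20, 3.2])           # keeps tc > max(t)
params, _ = curve_fit(log_periodic, t, y, p0=p0, bounds=bounds)
print("estimated crash time tc ~", round(params[2], 1))

resid = y - log_periodic(t, *params)
# scale residuals up to avoid numerical issues in GARCH estimation
am = arch_model(100 * resid, mean="AR", lags=1, vol="GARCH", p=1, q=1)
print(am.fit(disp="off").params)
```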
A Statistical Framework for the Functional Analysis of Metagenomes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sharon, Itai; Pati, Amrita; Markowitz, Victor
2008-10-01
Metagenomic studies consider the genetic makeup of microbial communities as a whole, rather than their individual member organisms. The functional and metabolic potential of microbial communities can be analyzed by comparing the relative abundance of gene families in their collective genomic sequences (metagenome) under different conditions. Such comparisons require accurate estimation of gene family frequencies. The authors present a statistical framework for assessing these frequencies based on the Lander-Waterman theory developed originally for Whole Genome Shotgun (WGS) sequencing projects. They also provide a novel method for assessing the reliability of the estimations, which can be used for removing seemingly unreliable measurements. They tested their method on a wide range of datasets, including simulated genomes and real WGS data from sequencing projects of whole genomes. Results suggest that their framework corrects inherent biases in accepted methods and provides a good approximation to the true statistics of gene families in WGS projects.
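One way to picture the estimation problem is to treat the read count for a gene family as approximately Poisson, in the Lander-Waterman spirit, and attach an exact confidence interval whose width flags unreliable estimates; the counts below are invented.

```python
# Toy illustration of the estimation problem above: a gene family's read
# count is treated as approximately Poisson, and an exact (Garwood) interval
# is attached to the estimated frequency. Counts are invented.
from scipy.stats import chi2

def family_frequency(hits, total_reads, alpha=0.05):
    """Estimate gene-family frequency with an exact Poisson 95% CI."""
    f_hat = hits / total_reads
    lo = chi2.ppf(alpha / 2, 2 * hits) / 2 / total_reads if hits > 0 else 0.0
    hi = chi2.ppf(1 - alpha / 2, 2 * (hits + 1)) / 2 / total_reads
    return f_hat, (lo, hi)

for condition, (hits, reads) in {"site A": (42, 100000), "site B": (7, 80000)}.items():
    f, ci = family_frequency(hits, reads)
    print(condition, f, ci)   # a wide CI at low counts flags an unreliable estimate
```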
Batch Statistical Process Monitoring Approach to a Cocrystallization Process.
Sarraguça, Mafalda C; Ribeiro, Paulo R S; Dos Santos, Adenilson O; Lopes, João A
2015-12-01
Cocrystals are defined as crystalline structures composed of two or more compounds that are solid at room temperature, held together by noncovalent bonds. Their main advantages are increased solubility, bioavailability, permeability, and stability, while retaining the bioactivity of the active pharmaceutical ingredient. The cocrystallization of furosemide and nicotinamide by solvent evaporation was monitored on-line using near-infrared spectroscopy (NIRS) as a process analytical technology tool. The near-infrared spectra were analyzed using principal component analysis. Batch statistical process monitoring was used to create control charts to track the process trajectory and define control limits. Normal and non-normal operating-condition batches were performed and monitored with NIRS. The use of NIRS associated with batch statistical process models allowed the detection of abnormal variations in critical process parameters, such as the amount of solvent or the amounts of the initial components present in the cocrystallization. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.
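A minimal sketch of the batch-monitoring pattern described above: fit PCA to normal-operating-condition data, set a Hotelling T² control limit, and flag batches exceeding it. The data are random surrogates, not NIRS spectra.

```python
# PCA-based control chart sketch: a Hotelling T^2 limit is built from
# normal-operating-condition (NOC) data; the fault is injected along the
# first principal component so it is visible in the scores.
import numpy as np
from scipy.stats import f as f_dist
from sklearn.decomposition import PCA

rng = np.random.default_rng(5)
noc = rng.normal(size=(60, 20))            # 60 NOC batches x 20 variables

k = 3
pca = PCA(n_components=k).fit(noc)
lam = pca.explained_variance_              # score variance per component

n = noc.shape[0]
limit = k * (n - 1) * (n + 1) / (n * (n - k)) * f_dist.ppf(0.99, k, n - k)

def hotelling_t2(x):
    t = pca.transform(x.reshape(1, -1))[0]
    return float(np.sum(t**2 / lam))

normal_batch = rng.normal(size=20)
faulty_batch = rng.normal(size=20) + 8.0 * pca.components_[0]  # drift along PC1
for name, x in [("normal", normal_batch), ("faulty", faulty_batch)]:
    t2 = hotelling_t2(x)
    print(f"{name}: T2={t2:.1f} (limit {limit:.1f})",
          "ALARM" if t2 > limit else "ok")
```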
Generalising the logistic map through the q-product
NASA Astrophysics Data System (ADS)
Pessoa, R. W. S.; Borges, E. P.
2011-03-01
We investigate a generalisation of the logistic map as x_{n+1} = 1 - a x_n ⊗_{q_map} x_n (-1 ≤ x_n ≤ 1, 0 < a ≤ 2), where ⊗_q stands for a generalisation of the ordinary product, known as the q-product [Borges, E.P., Physica A 340, 95 (2004)]. The usual product, and consequently the usual logistic map, is recovered in the limit q → 1. The tent map is also a particular case, for q_map → ∞. The generalisation of this (and other) algebraic operators has been widely used within the nonextensive statistical mechanics context (see C. Tsallis, Introduction to Nonextensive Statistical Mechanics, Springer, NY, 2009). We focus the analysis on q_map > 1 at the edge of chaos, particularly at the first critical point a_c, which depends on the value of q_map. Bifurcation diagrams, sensitivity to initial conditions, fractal dimension, and the rate of entropy growth are evaluated at a_c(q_map), and connections with nonextensive statistical mechanics are explored.
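The map is easy to reproduce numerically. The sketch below implements the q-product of Borges (2004), x ⊗_q y = [x^(1-q) + y^(1-q) - 1]^(1/(1-q)) for positive arguments (0 when the bracket is non-positive), and applies it to |x_n|; using the absolute value to handle negative iterates is our assumption.

```python
# Runnable sketch of the generalized logistic map described above, built on
# Borges' q-product. Applying the q-product to |x_n| is our assumption for
# handling negative iterates.
import numpy as np

def q_product(x, y, q):
    """Borges' q-product for non-negative arguments (q > 1 assumed here)."""
    if q == 1.0:
        return x * y                   # ordinary product recovered at q -> 1
    if x == 0.0 or y == 0.0:
        return 0.0                     # limiting value for q > 1
    base = x**(1.0 - q) + y**(1.0 - q) - 1.0
    return base**(1.0 / (1.0 - q)) if base > 0.0 else 0.0

def q_logistic_orbit(a, q, x0=0.1, n=2000):
    xs = [x0]
    for _ in range(n):
        x = xs[-1]
        xs.append(1.0 - a * q_product(abs(x), abs(x), q))
    return np.asarray(xs)

orbit = q_logistic_orbit(a=1.7, q=1.5)
print(orbit[-10:])                     # tail of the trajectory
```

Sweeping a over (0, 2] and plotting the orbit tails reproduces a bifurcation diagram for the chosen q_map.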
NASA Astrophysics Data System (ADS)
Brizzi, S.; Sandri, L.; Funiciello, F.; Corbi, F.; Piromallo, C.; Heuret, A.
2018-03-01
The observed maximum magnitude of subduction megathrust earthquakes is highly variable worldwide. One key question is which conditions, if any, favor the occurrence of giant earthquakes (Mw ≥ 8.5). Here we carry out a multivariate statistical study in order to investigate the factors affecting the maximum magnitude of subduction megathrust earthquakes. We find that the trench-parallel extent of subduction zones and the thickness of trench sediments provide the largest discriminating capability between subduction zones that have experienced giant earthquakes and those having significantly lower maximum magnitude. Monte Carlo simulations show that the observed spatial distribution of giant earthquakes cannot be explained by pure chance to a statistically significant level. We suggest that the combination of a long subduction zone with thick trench sediments likely promotes a great lateral rupture propagation, characteristic of almost all giant earthquakes.
Ibáñez, Sergio J.; García, Javier; Feu, Sebastian; Lorenzo, Alberto; Sampaio, Jaime
2009-01-01
The aim of the present study was to identify the game-related statistics that discriminated between basketball winning and losing teams in each of the three consecutive games played in a condensed tournament format. The data were obtained from the Spanish Basketball Federation and included game-related statistics from the Under-20 league (2005-2006 and 2006-2007 seasons). A total of 223 games were analyzed with the following game-related statistics: two- and three-point field goals (made and missed), free throws (made and missed), offensive and defensive rebounds, assists, steals, turnovers, blocks (made and received), fouls committed, ball possessions, and offensive rating. Results showed that winning teams in this competition had better values in all game-related statistics, with the exceptions of three-point field goals made, free throws missed, and turnovers (p ≥ 0.05). A main effect of game number was only identified in turnovers, with a statistically significant decrease between the second and third game. No interaction was found in the analyzed variables. A discriminant analysis identified the two-point field goals made, the defensive rebounds, and the assists as discriminators between winning and losing teams in all three games. In addition to these, only the three-point field goals made contributed to discriminating between teams in game three, suggesting a moderate effect of fatigue. Coaches may benefit from being aware of this variation in game-determinant statistics and from using offensive and defensive strategies in the third game that exploit or mask three-point field-goal performance. Key points: Overall team performances across the three consecutive games were very similar, not confirming an accumulated fatigue effect. The results for three-point field goals in the third game suggested that winning teams were able to shoot better from longer distances; this could reflect their higher conditioning status and/or the losing teams' low conditioning in defense. PMID:24150011
NASA Astrophysics Data System (ADS)
Lalonde, S. V.; Smith, D. S.; Owttrim, G. W.; Konhauser, K. O.
2008-03-01
Significant efforts have been made to elucidate the chemical properties of bacterial surfaces for the purpose of refining surface complexation models that can account for their metal-sorptive behavior under diverse conditions. However, the influence of culturing conditions on the surface chemical parameters modeled from potentiometric titrations of bacterial surfaces has received little attention. While culture age and metabolic pathway have been considered as factors potentially influencing cell surface reactivity, statistical treatments have been incomplete and variability has remained unconfirmed. In this study, we employ potentiometric titrations to evaluate variations in bacterial surface ligand distributions using live cells of the sheathless cyanobacterium Anabaena sp. strain PCC 7120, grown under a variety of batch culture conditions. We evaluate the ability of a single set of modeled parameters, describing acid-base surface properties averaged over all culture conditions tested, to accurately account for the ligand distributions modeled for each individual culture condition. In addition to considering growth phase, we assess the role of the various assimilatory nitrogen metabolisms available to this organism as potential determinants of surface reactivity. We observe statistically significant variability in site distribution between the majority of conditions assessed. Employing post hoc Tukey-Kramer analysis for all possible pair-wise condition comparisons, we conclude that the average parameters are inadequate for an accurate chemical description of this cyanobacterial surface. For this Gram-negative bacterium in batch culture, ligand distributions were influenced to a greater extent by the nitrogen assimilation pathway than by growth phase.
Jumbri, Khairulazhar; Al-Haniff Rozy, Mohd Fahruddin; Ashari, Siti Efliza; Mohamad, Rosfarizan; Basri, Mahiran; Fard Masoumi, Hamid Reza
2015-01-01
Kojic acid is widely used to inhibit the browning effect of tyrosinase in the cosmetic and food industries. In this work, synthesis of kojic monooleate ester (KMO) was carried out by lipase-catalysed esterification of kojic acid and oleic acid in a solvent-free system. Response surface methodology (RSM) based on a central composite rotatable design (CCRD) was used to optimise the most important reaction variables, namely enzyme amount, reaction temperature, substrate molar ratio, and reaction time, with immobilised lipase from Candida antarctica (Novozym 435) as the biocatalyst. The RSM data indicated that reaction temperature was less significant than the other factors for the production of the KMO ester. By using this statistical analysis, a quadratic model was developed to correlate the preparation variables to the response (reaction yield). The optimum conditions for the enzymatic synthesis of KMO were as follows: an enzyme amount of 2.0 wt%, a reaction temperature of 83.69°C, a substrate molar ratio of 1:2.37 (mmole kojic acid:oleic acid), and a reaction time of 300.0 min. Under these conditions, the actual yield obtained was 42.09%, which compares well with the maximum predicted value of 44.46%. Under the optimal conditions, Novozym 435 could be reused for 5 cycles while maintaining a KMO yield of at least 40%. The results demonstrated that statistical analysis using RSM can be used efficiently to optimise the production of the KMO ester. Moreover, the optimum conditions obtained can be applied to scale up the process and minimise cost.
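A generic response-surface sketch in the spirit of the CCRD optimisation above: fit a full quadratic model to the four factors versus yield and locate the stationary maximum. The data are fabricated and the variable ranges merely echo the abstract.

```python
# Response-surface sketch: quadratic model of yield over four factors
# (enzyme amount, temperature, molar ratio, time), then numerical search for
# the optimum. All data are fabricated for illustration.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures
from scipy.optimize import minimize

rng = np.random.default_rng(6)
X = rng.uniform([0.5, 60, 1, 60], [3.0, 90, 3, 360], size=(30, 4))
true_opt = np.array([2.0, 84.0, 2.4, 300.0])            # assumed "true" optimum
y = 44 - ((X - true_opt) / [1, 10, 1, 120])**2 @ np.ones(4) \
    + rng.normal(0, 0.5, 30)

poly = PolynomialFeatures(degree=2, include_bias=False)
model = LinearRegression().fit(poly.fit_transform(X), y)

def neg_yield(x):
    return -model.predict(poly.transform(x.reshape(1, -1)))[0]

res = minimize(neg_yield, X.mean(axis=0),
               bounds=[(0.5, 3.0), (60, 90), (1, 3), (60, 360)])
print("optimum conditions ~", res.x.round(2),
      "predicted yield ~", round(-res.fun, 1))
```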
Lally, Richard D.; Galbally, Paul; Moreira, António S.; Spink, John; Ryan, David; Germaine, Kieran J.; Dowling, David N.
2017-01-01
Plant-associated bacteria with plant growth promotion (PGP) properties have been proposed for use as environmentally friendly biofertilizers for sustainable agriculture; however, analysis of their efficacy in the field is often limited. In this study, greenhouse and field trials were carried out using individual endophytic Pseudomonas fluorescens strains, the well characterized rhizospheric P. fluorescens F113, and an endophytic microbial consortium of 10 different strains. These bacteria had been previously characterized with respect to their PGP properties in vitro and had been shown to harbor a range of traits associated with PGP, including siderophore production, 1-aminocyclopropane-1-carboxylic acid (ACC) deaminase activity, and inorganic phosphate solubilization. In greenhouse experiments, individual strains tagged with gfp and Kmr were applied to Brassica napus as a seed coat; they effectively colonized the rhizosphere and root of B. napus, and in addition they produced a significant increase in plant biomass compared with the non-inoculated control. In the field experiment, the bacteria (individual and consortium) were spray-inoculated onto winter oilseed rape B. napus var. Compass, which was grown under standard North Western European agronomic conditions. Analysis of the data provides evidence that the application of live bacterial biofertilizers can enhance aspects of crop development in B. napus at field scale. The field data demonstrated statistically significant increases in crop height, stem/leaf biomass, and pod biomass, particularly in the case of the consortium-inoculated treatment. However, although seed and oil yield were increased in the field in response to inoculation, these data were not statistically significant under the experimental conditions tested. Future field trials will investigate the effectiveness of the inoculants under different agronomic conditions. PMID:29312422
Network meta-analysis: a technique to gather evidence from direct and indirect comparisons
2017-01-01
Systematic reviews and pairwise meta-analyses of randomized controlled trials, at the intersection of clinical medicine, epidemiology, and statistics, are positioned at the top of the evidence-based practice hierarchy. They are important tools for drug approval, for the formulation of clinical protocols and guidelines, and for decision-making. However, this traditional technique yields only part of the information that clinicians, patients, and policy-makers need to make informed decisions, since it usually compares only two interventions at a time. For most clinical conditions, many interventions are available on the market, and few of them have been compared in head-to-head studies. This scenario precludes conclusions from being drawn about the full profile (e.g., efficacy and safety) of all interventions. The recent development and introduction of a new technique, usually referred to as network meta-analysis, indirect meta-analysis, or multiple or mixed treatment comparisons, has allowed the estimation of metrics for all possible comparisons in the same model, simultaneously gathering direct and indirect evidence. Over the last years this statistical tool has matured as a technique, with models available for all types of raw data, producing different pooled effect measures, using both frequentist and Bayesian frameworks, with different software packages. However, the conduct, reporting, and interpretation of network meta-analysis still pose multiple challenges that should be carefully considered, especially because this technique inherits all assumptions from pairwise meta-analysis but with increased complexity. Thus, we aim to provide a basic explanation of how a network meta-analysis is conducted, highlighting its risks and benefits for evidence-based practice, including information on the evolution of the statistical methods, their assumptions, and the steps for performing the analysis. PMID:28503228
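The core indirect-comparison step that network meta-analysis generalises can be shown in a few lines (the Bucher method); the effect estimates below are invented.

```python
# Worked toy example of the Bucher indirect comparison: given trial effects
# for B vs. A and C vs. A, estimate C vs. B. Numbers are invented.
import math

d_AB, se_AB = -0.30, 0.10   # log odds ratio, drug B vs. A
d_AC, se_AC = -0.55, 0.12   # log odds ratio, drug C vs. A

d_BC = d_AC - d_AB          # indirect estimate, C vs. B
se_BC = math.sqrt(se_AB**2 + se_AC**2)
lo, hi = d_BC - 1.96 * se_BC, d_BC + 1.96 * se_BC
print(f"indirect logOR C vs B = {d_BC:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```

Note that the variances add, which is why indirect evidence is weaker than a head-to-head trial of the same size.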
Statistical analysis of low frequency vibrations in variable speed wind turbines
NASA Astrophysics Data System (ADS)
Escaler, X.; Mebarki, T.
2013-12-01
The spectral content of the low-frequency vibrations in the band from 0 to 10 Hz measured in full-scale wind turbines has been statistically analyzed as a function of the whole range of steady operating conditions. Attention has been given to the amplitudes of the vibration peaks and their dependence on rotating speed and power output. Two different wind turbine models of 800 and 2000 kW have been compared. For each model, a sample of units located in the same wind farm and operating during a representative period of time has been considered. A condition monitoring system installed in each wind turbine was used to register the axial acceleration on the gearbox casing between the intermediate- and high-speed shafts. The average frequency spectrum permitted identification of the vibration signature and the position of the first tower natural frequency in both models. The evolution of the vibration amplitudes at the rotor rotating frequency and its multiples has shown that the tower response is amplified by resonance conditions in one of the models. It is therefore concluded that continuous measurement and control of low-frequency vibrations is required to protect the turbines against harmful vibrations of this nature.
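The spectral step described above reduces, in essence, to a low-frequency power spectral density estimate; below is a sketch with Welch's method on a synthetic gearbox acceleration signal (sampling rate, tones, and rotor frequency all invented).

```python
# Welch PSD sketch for the 0-10 Hz band of a synthetic acceleration signal;
# the dominant peak is read off and compared with a toy rotor frequency.
import numpy as np
from scipy.signal import welch

fs = 100.0                                   # Hz, assumed sampling rate
t = np.arange(0, 600, 1 / fs)
rotor_f = 0.3                                # Hz, toy rotor rotating frequency
accel = (0.5 * np.sin(2 * np.pi * rotor_f * t)
         + 0.2 * np.sin(2 * np.pi * 3 * rotor_f * t)
         + 0.1 * np.random.default_rng(7).normal(size=t.size))

f, pxx = welch(accel, fs=fs, nperseg=4096)   # ~0.024 Hz resolution
band = f <= 10.0
peak = f[band][np.argmax(pxx[band])]
print(f"dominant low-frequency peak ~ {peak:.2f} Hz "
      f"({peak / rotor_f:.1f}x rotor frequency)")
```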
CAVASSIM, Rodrigo; LEITE, Fábio Renato Manzolli; ZANDIM, Daniela Leal; DANTAS, Andrea Abi Rached; RACHED, Ricardo Samih Georges Abi; SAMPAIO, José Eduardo Cezar
2012-01-01
Objective: The aim of this study was to establish the parameters of concentration, time, and mode of application of citric acid and sodium citrate for root conditioning. Material and Methods: A total of 495 samples were obtained and equally distributed among 11 groups (5 for testing different concentrations of citric acid, 5 for testing different concentrations of sodium citrate, and 1 control group). After laboratory processing, the samples were analyzed under scanning electron microscopy. A previously calibrated, blinded examiner evaluated micrographs of the samples. Non-parametric statistical analysis was performed on the data obtained. Results: Brushing with 25% citric acid for 3 min promoted greater exposure of collagen fibers in comparison with brushing with 1% citric acid for 1 min and with its topical application at 1% for 3 min. Sodium citrate exposed collagen fibers in only a few samples. Conclusion: Despite the lack of statistical significance, better results for collagen exposure were obtained with brushing application of 25% citric acid for 3 min than with the other application parameters. Sodium citrate produced collagen exposure in only a few samples, so it is not indicated for root conditioning. PMID:22858707
Wu, Wei-Jie; Ahn, Byung-Yong
2014-01-01
Response surface methodology (RSM) was used to determine the optimum vitamin D2 synthesis conditions in oyster mushrooms (Pleurotus ostreatus). Ultraviolet B (UV-B) was selected as the most efficient irradiation source in a preliminary experiment, and three independent variables were examined: ambient temperature (25-45°C), exposure time (40-120 min), and irradiation intensity (0.6-1.2 W/m²). The statistical analysis indicated that, over the range studied, irradiation intensity was the most critical factor affecting vitamin D2 synthesis in oyster mushrooms. Under optimal conditions (ambient temperature of 28.16°C, UV-B intensity of 1.14 W/m², and exposure time of 94.28 min), the experimental vitamin D2 content of 239.67 µg/g (dry weight) was in very good agreement with the predicted value of 245.49 µg/g, which verified the practicability of this strategy. Compared to fresh mushrooms, lyophilized mushroom powder can synthesize a remarkably higher level of vitamin D2 (498.10 µg/g) within a much shorter UV-B exposure time (10 min), and thus should receive attention from the food processing industry.
Improved silicon nitride for advanced heat engines
NASA Technical Reports Server (NTRS)
Yeh, Hun C.; Fang, Ho T.
1987-01-01
The technology base required to fabricate silicon nitride components with the strength, reliability, and reproducibility necessary for actual heat engine applications is presented. Task 2 was set up to develop test bars with a high Weibull slope and greater high-temperature strength, and to conduct an initial net-shape component fabrication evaluation. Screening experiments were performed in Task 7 on advanced materials and processing for input to Task 2. The technical efforts performed in the second year of a 5-yr program are covered. The first iteration of Task 2 was completed as planned. Two half-replicated, fractional factorial (2⁵), statistically designed matrix experiments were conducted. These experiments identified Denka 9FW Si3N4 as an alternate raw material to GTE SN502 Si3N4 for subsequent process evaluation. A detailed statistical analysis was conducted to correlate processing conditions with as-processed test bar properties. One processing condition produced a material with a 97 ksi average room-temperature MOR (100 percent of goal) and a Weibull slope of 13.2 (83 percent of goal); another condition produced 86 ksi (6 percent over baseline) room-temperature strength with a Weibull slope of 20 (125 percent of goal).
Dennett, Amy M; Taylor, Nicholas F
2015-01-01
To determine the effectiveness of computer-based electronic devices that provide feedback in improving mobility and balance and reducing falls. Randomized controlled trials were searched from the earliest available date to August 2013. Standardized mean differences were used to complete meta-analyses, with statistical heterogeneity described by the I-squared statistic. The GRADE approach was used to summarize the level of evidence for each completed meta-analysis. Risk of bias for individual trials was assessed with the Physiotherapy Evidence Database (PEDro) scale. Thirty trials were included. There was high-quality evidence that computerized devices can improve dynamic balance in people with a neurological condition compared with no therapy. There was low-to-moderate-quality evidence that computerized devices have no significant effect on mobility, falls efficacy, or falls risk in community-dwelling older adults and people with a neurological condition compared with physiotherapy. There is high-quality evidence that computerized devices providing feedback may be useful in improving balance in people with neurological conditions compared with no therapy, but there is a lack of evidence supporting more meaningful changes in mobility and falls risk.
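The two quantities named above, standardized mean differences and the I-squared statistic, can be computed from summary data as follows; the per-trial numbers are invented, and Cohen's d is used as the SMD without the small-sample (Hedges) correction.

```python
# Toy computation of a standardized mean difference (Cohen's d) per trial and
# the I-squared heterogeneity statistic from a fixed-effect pooling.
import numpy as np

# per-trial summaries: (mean1, sd1, n1, mean2, sd2, n2), invented
trials = np.array([[22.0, 5.0, 30, 19.5, 5.5, 32],
                   [15.0, 4.0, 25, 13.0, 4.2, 24],
                   [30.0, 6.0, 40, 29.0, 6.1, 41]])

m1, s1, n1, m2, s2, n2 = trials.T
sp = np.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
d = (m1 - m2) / sp                                  # SMD (Cohen's d) per trial
var_d = (n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2))

w = 1 / var_d                                       # fixed-effect weights
pooled = np.sum(w * d) / np.sum(w)
Q = np.sum(w * (d - pooled)**2)                     # Cochran's Q
I2 = max(0.0, (Q - (len(d) - 1)) / Q) * 100 if Q > 0 else 0.0
print(f"pooled SMD = {pooled:.2f}, I^2 = {I2:.0f}%")
```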
NASA Astrophysics Data System (ADS)
Ng, C. S.; Bhattacharjee, A.
1996-08-01
A sufficient condition is obtained for the development of a finite-time singularity in a highly symmetric Euler flow, first proposed by Kida [J. Phys. Soc. Jpn. 54, 2132 (1985)] and recently simulated by Boratav and Pelz [Phys. Fluids 6, 2757 (1994)]. It is shown that if the second-order spatial derivative of the pressure (p_xx) is positive following a Lagrangian element (on the x axis), then a finite-time singularity must occur. Under some assumptions, this Lagrangian sufficient condition can be reduced to an Eulerian sufficient condition which requires that the fourth-order spatial derivative of the pressure (p_xxxx) at the origin be positive for all times leading up to the singularity. Analytical as well as direct numerical evaluation over a large ensemble of initial conditions demonstrates that for fixed total energy, p_xxxx is predominantly positive, with the average value growing with the number of modes.
Data of ERPs and spectral alpha power when attention is engaged on visual or verbal/auditory imagery
Villena-González, Mario; López, Vladimir; Rodríguez, Eugenio
2016-01-01
This article provides data from the statistical analysis of event-related brain potentials (ERPs) and spectral power from 20 participants during three attentional conditions. Specifically, the P1, N1, and P300 amplitudes of the ERP were compared when a participant's attention was oriented to an external task, to visual imagery, and to inner speech. The spectral power of the alpha band was also compared across these three attentional conditions. These data relate to the research article in which sensory processing of external information was compared during these three conditions, entitled "Orienting attention to visual or verbal/auditory imagery differentially impairs the processing of visual stimuli" (Villena-Gonzalez et al., 2016) [1]. PMID:27077090
Kennedy, Andrea; Semple, Lisa; Alderson, Kerri; Bouskill, Vanessa; Karasevich, Janice; Riske, Brenda; van Gunst, Sheri
Children who are living with chronic conditions may be supported in self-care through enjoyable active learning and family social processes. This research focused on the development and evaluation of "Don't Push Your Luck!", an educational board game designed to inspire family discussion about chronic conditions and help affected children learn about self-care choices and consequences. Mixed-method research was conducted with families from one outpatient Cystic Fibrosis Clinic and four Hemophilia Treatment Centres in Canada and the United States (N=72). In phase I, the board game prototype and questionnaires were refined with affected boys, siblings, and parents living with hemophilia (n=11), compared with families living with cystic fibrosis (n=11). In phase II, the final board game was evaluated with families living with hemophilia (n=50). Data collection included pre- and post-game questionnaires on decision-making and the Haemo-QoL Index©, along with post-game enjoyment ratings. Analysis included descriptive statistics, inferential (non-parametric) statistics, and qualitative themes. Findings revealed this game was an enjoyable and effective resource to engage families in self-care discussions. Key themes included communication, being involved, knowing, decisions and consequences, and being connected. Qualitative and quantitative findings aligned. Statistical significance suggests the game enhanced family engagement to support decision-making skills, as parents identified that the game helped them talk about important topics, and children gained insight regarding family supports and self-care responsibility. This board game was an effective, developmentally appropriate family resource to facilitate engagement and conversation about everyday life experiences in preparation for self-care. There is promising potential to extend this educational family board game intervention to a greater range of school-age children and families living with chronic conditions. Copyright © 2017 Elsevier Inc. All rights reserved.
Fall 2014 SEI Research Review Probabilistic Analysis of Time Sensitive Systems
2014-10-28
Osmosis SMC Tool: Osmosis is a tool for Statistical Model Checking (SMC) with Semantic Importance Sampling. • The input model is written in a subset of C; ASSERT() statements in the model indicate conditions that must hold. • Input probability distributions are defined by the user. • Osmosis returns results based on either a target relative error or a set number of simulations. (http://dreal.cs.cmu.edu/)
Evaluation program for secondary spacecraft cells
NASA Technical Reports Server (NTRS)
Christy, D. E.; Harkness, J. D.
1973-01-01
A life cycle test of secondary electric batteries for spacecraft applications was conducted. A sample of nickel-cadmium batteries was subjected to general performance tests to determine the limits of their actual capabilities. Weaknesses discovered in cell design are reported and aid research and development efforts toward improving the reliability of spacecraft batteries. A statistical analysis of life cycle prediction and cause of failure versus test conditions is provided.
Comparative Statistical Analysis of Auroral Models
2012-03-22
was willing to add this project to her extremely busy schedule. Lastly, I must also express my sincere appreciation for the rest of the faculty and...models have been extensively used for estimating GPS and other communication satellite disturbances ( Newell et al., 2010a). The auroral oval...models predict changes in the auroral oval in response to various geomagnetic conditions. In 2010, Newell et al. conducted a comparative study of
ERIC Educational Resources Information Center
Liao, Pei-shan
2009-01-01
This study explores the consistency between objective indicators and subjective perceptions of quality of life in a ranking of survey data for cities and counties in Taiwan. Data used for analysis included the Statistical Yearbook of Hsiens and Municipalities and the Survey on Living Conditions of Citizens in Taiwan, both given for the year 2000.…
The Condition of Education, 1990. Volume 1: Elementary and Secondary Education.
ERIC Educational Resources Information Center
Ogle, Laurence T., Ed.; Alsalam, Nabeel, Ed.
This is the first of two volumes of the National Center for Education Statistics' annual statistical report on the condition of education in the United States for 1990. This volume addresses elementary and secondary education, while the second volume covers postsecondary education (PE). Condition of education indicators (CEIs)--key data that…
NASA Astrophysics Data System (ADS)
Yao, Yuchen; Bao, Jie; Skyllas-Kazacos, Maria; Welch, Barry J.; Akhmetov, Sergey
2018-04-01
Individual anode current signals in aluminum reduction cells provide localized cell conditions in the vicinity of each anode, which contain more information than the conventionally measured cell voltage and line current. One common use of this measurement is to identify process faults that can cause significant changes in the anode current signals. While this method is simple and direct, it ignores the interactions between anode currents and other important process variables. This paper presents an approach that applies multivariate statistical analysis techniques to individual anode currents and other process operating data, for the detection and diagnosis of local process abnormalities in aluminum reduction cells. Specifically, since the Hall-Héroult process is time-varying with its process variables dynamically and nonlinearly correlated, dynamic kernel principal component analysis with moving windows is used. The cell is discretized into a number of subsystems, with each subsystem representing one anode and cell conditions in its vicinity. The fault associated with each subsystem is identified based on multivariate statistical control charts. The results show that the proposed approach is able to not only effectively pinpoint the problematic areas in the cell, but also assess the effect of the fault on different parts of the cell.
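The paper's monitoring scheme combines dynamic (time-lagged) inputs, kernel PCA, and moving windows. As a simplified runnable stand-in, the sketch below keeps the time-lag augmentation and moving-window refitting but uses linear PCA with a T² chart; the kernel extension would replace the PCA step. All data, window sizes, and the injected fault are illustrative.

```python
# Moving-window dynamic PCA monitoring sketch (linear PCA stands in for the
# paper's kernel variant). Data are synthetic "anode current" channels driven
# by latent factors; a fault is injected along the first latent direction.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(8)
T, m = 600, 12
F = rng.normal(size=(T, 3))                  # latent cell-state factors
L = rng.normal(size=(3, m))                  # loadings onto anode currents
X = F @ L + 0.5 * rng.normal(size=(T, m))
X[520:] += 5.0 * L[0]                        # fault along first latent direction

def lagged(data, lags=2):
    """Augment each row with `lags` past rows (dynamic PCA input)."""
    rows = [data[lags - j: len(data) - j] for j in range(lags + 1)]
    return np.hstack(rows)

window, k, step = 200, 4, 40
Z = lagged(X, lags=2)
for t in range(window, len(Z), step):
    train = Z[t - window:t]
    mu, sd = train.mean(0), train.std(0) + 1e-9
    pca = PCA(n_components=k).fit((train - mu) / sd)
    lam = pca.explained_variance_
    t2_new = np.sum(pca.transform(((Z[t] - mu) / sd).reshape(1, -1))[0]**2 / lam)
    t2_ref = np.percentile(
        np.sum(pca.transform((train - mu) / sd)**2 / lam, axis=1), 99)
    print(f"sample {t}: T2={t2_new:.1f}", "ALARM" if t2_new > t2_ref else "ok")
```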
Web-GIS-based SARS epidemic situation visualization
NASA Astrophysics Data System (ADS)
Lu, Xiaolin
2004-03-01
In order to research, statistically analyze, and broadcast information on the SARS epidemic situation according to its spatial position, this paper proposes a unified global visualization information platform for the SARS epidemic situation based on Web-GIS and scientific visualization technology. To set up the unified global visual information platform, the architecture of a Web-GIS-based interoperable information system is adopted, enabling the public to visually report SARS virus information to health care centers using web visualization technology. A GIS Java applet is used to visualize the relationship between spatial graphical data and virus distribution, and other web-based graphics such as curves, bars, maps, and multi-dimensional figures are used to visualize the relationship of SARS virus trends with time, patient numbers, or locations. The platform is designed to display SARS information in real time, to simulate the real epidemic situation visually, and to offer analysis tools for health departments and policy-making government departments to support decision-making in preventing the spread of the SARS virus. It could be used to analyze the virus situation through a visualized graphical interface, isolate the areas of virus sources, and control the virus situation within the shortest time. It could be applied in the visualization field of SARS prevention systems for SARS information broadcasting, data management, statistical analysis, and decision support.
Enzinger, Ewald; Morrison, Geoffrey Stewart
2017-08-01
In a 2012 case in New South Wales, Australia, the identity of a speaker on several audio recordings was in question. Forensic voice comparison testimony was presented based on an auditory-acoustic-phonetic-spectrographic analysis. No empirical demonstration of the validity and reliability of the analytical methodology was presented. Unlike the admissibility standards in some other jurisdictions (e.g., US Federal Rule of Evidence 702 and the Daubert criteria, or England & Wales Criminal Practice Directions 19A), Australia's Unified Evidence Acts do not require demonstration of the validity and reliability of analytical methods and their implementation before testimony based upon them is presented in court. The present paper reports on empirical tests of the performance of an acoustic-phonetic-statistical forensic voice comparison system which exploited the same features as were the focus of the auditory-acoustic-phonetic-spectrographic analysis in the case, i.e., second-formant (F2) trajectories in /o/ tokens and mean fundamental frequency (f0). The tests were conducted under conditions similar to those in the case. The performance of the acoustic-phonetic-statistical system was very poor compared to that of an automatic system. Copyright © 2017 Elsevier B.V. All rights reserved.
Ferrara, Pietro; Scancarello, Marta; Khazrai, Yeganeh M; Romani, Lorenza; Cutrona, Costanza; DE Gara, Laura; Bona, Gianni
2016-10-12
The nutritional status of foster children, the quality of daily menus in group homes and the food security inside these organizations have been poorly studied, and this study aims to investigate them. A sample of 125 children, ranging in age from 0 to 17 years, across seven group homes (group A) was compared with 121 children from the general population (group B). BMI percentiles were used to evaluate nutritional status, and the mean percentiles of both groups were compared through statistical analysis. Both the nutrient and the caloric daily distributions in each organization were obtained using the 24-hour recall method. A specific questionnaire was administered to evaluate food security. Analysis of mean BMI-for-age (or weight-for-length) percentiles revealed no statistically significant differences between group A and group B. The average daily nutrient and calorie distribution in group homes proved to be nearly optimal, with the exception of a slight excess of protein and a slight deficiency of PUFAs. Moreover, a low intake of iron and calcium was revealed. All organizations obtained a "High Food Security" profile. The nutritional condition of foster children is no worse than that of children in the general population. Foster care provides the necessary conditions to support their growth.
Hybrid modeling as a QbD/PAT tool in process development: an industrial E. coli case study.
von Stosch, Moritz; Hamelink, Jan-Martijn; Oliveira, Rui
2016-05-01
Process understanding is emphasized in the process analytical technology initiative and the quality by design paradigm as essential for manufacturing biopharmaceutical products of consistently high quality. A typical approach to developing process understanding is to combine design of experiments with statistical data analysis. Hybrid semi-parametric modeling is investigated here as an alternative to purely statistical data analysis. The hybrid model framework provides the flexibility to select model complexity based on the available data and knowledge. A parametric dynamic bioreactor model is integrated with a nonparametric artificial neural network that describes biomass and product formation rates as a function of varied fed-batch fermentation conditions for high-cell-density heterologous protein production with E. coli. The model accurately describes biomass growth and product formation across variations in induction temperature, pH and feed rates. It indicates that the product expression rate is a function of early induction phase conditions and is negatively impacted as productivity increases, which could correspond to physiological changes caused by cytoplasmic product accumulation. Because the model is dynamic, rational process timing decisions can be made, and the impact of temporal variations in process parameters on product formation and process performance can be assessed, which is central to process understanding.
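A minimal sketch of the hybrid idea, under stated assumptions: the parametric part is a toy fed-batch biomass balance dX/dt = μX − (F/V)X, and the nonparametric part is a tiny feed-forward net standing in for the fitted rate model. All weights, units, and the `simulate` interface are hypothetical placeholders, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(1)
# Nonparametric part: an untrained net standing in for the fitted
# rate model mu(induction T, pH, feed rate); weights are placeholders.
W1, b1 = 0.1 * rng.normal(size=(3, 8)), np.zeros(8)
W2, b2 = 0.1 * rng.normal(size=(8, 1)), np.zeros(1)

def mu_nn(conditions):
    """Specific growth rate (1/h) predicted from scaled process conditions."""
    h = np.tanh(conditions @ W1 + b1)
    z = (h @ W2 + b2)[0]
    return 0.5 / (1.0 + np.exp(-z))          # bounded, positive rate

def simulate(X0, V0, feed, conditions, dt=0.1, n_steps=200):
    """Parametric part: fed-batch biomass balance dX/dt = mu*X - (F/V)*X."""
    X, V = X0, V0
    for _ in range(n_steps):
        mu = mu_nn(conditions)
        X += dt * (mu * X - (feed / V) * X)   # growth minus feed dilution
        V += dt * feed                        # volume increases with the feed
    return X, V

# Hypothetical run: scaled induction temperature, pH and feed-rate inputs
print(simulate(X0=5.0, V0=1.0, feed=0.02, conditions=np.array([0.3, 0.7, 0.2])))
```

In practice the network weights would be trained so that the simulated trajectories match the measured fed-batch data, which is what gives the hybrid structure its flexibility.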
Bodner, Todd E.
2017-01-01
Wilkinson and Task Force on Statistical Inference (1999) recommended that researchers include information on the practical magnitude of effects (e.g., using standardized effect sizes) to distinguish between the statistical and practical significance of research results. To date, however, researchers have not widely incorporated this recommendation into the interpretation and communication of the conditional effects and differences in conditional effects underlying statistical interactions involving a continuous moderator variable where at least one of the involved variables has an arbitrary metric. This article presents a descriptive approach to investigate two-way statistical interactions involving continuous moderator variables where the conditional effects underlying these interactions are expressed in standardized effect size metrics (i.e., standardized mean differences and semi-partial correlations). This approach permits researchers to evaluate and communicate the practical magnitude of particular conditional effects and differences in conditional effects using conventional and proposed guidelines, respectively, for the standardized effect size and therefore provides the researcher important supplementary information lacking under current approaches. The utility of this approach is demonstrated with two real data examples and important assumptions underlying the standardization process are highlighted. PMID:28484404
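The core computation underlying this approach can be sketched as follows: fit a moderated regression, evaluate the conditional (simple) slope of the focal predictor at chosen moderator values, and rescale it into SDs-of-y-per-SD-of-x units. The data and variable names are hypothetical, and this shows only the basic calculation, not Bodner's full procedure.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: outcome y, focal predictor x, continuous moderator z
rng = np.random.default_rng(1)
n = 300
df = pd.DataFrame({"x": rng.normal(size=n), "z": rng.normal(size=n)})
df["y"] = 0.3 * df.x + 0.2 * df.z + 0.25 * df.x * df.z + rng.normal(size=n)

fit = smf.ols("y ~ x * z", data=df).fit()
b_x, b_xz = fit.params["x"], fit.params["x:z"]

# Conditional (simple) slope of x at moderator values of -1 SD, mean, +1 SD
for z0 in (df.z.mean() - df.z.std(), df.z.mean(), df.z.mean() + df.z.std()):
    slope = b_x + b_xz * z0
    # Express in a standardized effect-size metric: SDs of y per SD of x
    std_effect = slope * df.x.std() / df.y.std()
    print(f"z = {z0:+.2f}: conditional slope = {slope:.3f}, "
          f"standardized = {std_effect:.3f}")
```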
Los Alamos National Laboratory W76 Pit Tube Lifetime Study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abeln, Terri G.
2012-04-25
A metallurgical study was requested as part of the Los Alamos National Laboratory (LANL) W76-1 life-extension program (LEP) involving a lifetime analysis of type 304 stainless steel pit tubes subject to repeat bending loads during assembly and disassembly operations at BWXT/Pantex. This initial test phase was completed during the calendar years of 2004-2006 and the report was not issued until additional recommended tests could be performed. These tests have not been funded to this date and therefore this report is considered final. Tubes were reportedly fabricated according to Rocky Flats specification P14548 - Seamless Type 304 VIM/VAR Stainless Steel Tubing. Tube diameter was specified as 0.125 inches and wall thickness as 0.028 inches. A heat treat condition is not specified and the hardness range specification can be characteristic of both 1/8 and 1/4 hard conditions. Properties of all tubes tested were within specification. Metallographic analysis could not conclusively determine a specified limit to the number of bends allowable. A statistical analysis suggests a range of 5-7 bends with a 99.95% confidence limit. See the 'Statistical Analysis' section of this report. The initial phase of this study involved two separate sets of test specimens. The first group was part of an investigation originating in the ESA-GTS [now Gas Transfer Systems (W-7) Group]. After the bend cycle test parameters were chosen (all three required bends subjected to the same amount of bend cycles) and the tubes bent, the investigation was transferred to Terri Abeln (Metallurgical Science and Engineering) for analysis. Subsequently, another limited quantity of tubes became available for testing and were cycled with the same bending fixture, but with different test parameters determined by T. Abeln.
ERIC Educational Resources Information Center
Hoeken, Hans; Hustinx, Lettica
2009-01-01
Under certain conditions, statistical evidence is more persuasive than anecdotal evidence in supporting a claim about the probability that a certain event will occur. In three experiments, it is shown that the type of argument is an important condition in this respect. If the evidence is part of an argument by generalization, statistical evidence…
Optimisation of warpage on plastic injection moulding part using response surface methodology (RSM)
NASA Astrophysics Data System (ADS)
Miza, A. T. N. A.; Shayfull, Z.; Nasir, S. M.; Fathullah, M.; Rashidi, M. M.
2017-09-01
Warpage is often encountered in the injection moulding of thin-shell parts and depends on the process conditions. Integrating finite element (FE) analysis, Moldflow analysis and response surface methodology (RSM) in a statistical design-of-experiments approach is one way to minimize the warpage values in x, y and z on the thin-shell plastic parts investigated here. The battery cover of a remote controller is a thin-shell plastic part produced by injection moulding. The optimum process parameters were determined so as to minimize warpage. Four parameters were considered in this study: packing pressure, cooling time, melt temperature and mould temperature. A two-level full factorial experimental design was built in Design-Expert for the RSM analysis to combine these parameters. Analysis of variance (ANOVA) applied to the FE results identified the process parameters that most influenced warpage. Using RSM, a predictive response surface model for the warpage data is presented.
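A small sketch of the RSM step under stated assumptions: a coded design in the four factors (a central composite design is used here, since axial and centre runs are needed to estimate squared terms), a stand-in response in place of the Moldflow warpage output, and a second-order polynomial surface searched on a grid for the minimum-warpage setting.

```python
import itertools
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

# Coded design: packing pressure, cooling time, melt T, mould T.
# Axial and centre runs are appended to the two-level factorial so the
# squared terms of the quadratic surface are estimable.
factorial = np.array(list(itertools.product([-1.0, 1.0], repeat=4)))
axial = np.vstack([a * row for row in np.eye(4) for a in (-1.0, 1.0)])
X = np.vstack([factorial, axial, np.zeros((3, 4))])

rng = np.random.default_rng(2)
# Stand-in for the Moldflow warpage output at each run (mm)
warpage = (0.5 - 0.10 * X[:, 0] + 0.05 * X[:, 1] + 0.02 * X[:, 0] * X[:, 2]
           + 0.03 * X[:, 3] ** 2 + rng.normal(scale=0.005, size=len(X)))

# Second-order response surface: linear, interaction and squared terms
poly = PolynomialFeatures(degree=2, include_bias=False)
model = LinearRegression().fit(poly.fit_transform(X), warpage)

# Search the fitted surface for the minimum-warpage setting
grid = np.array(list(itertools.product(np.linspace(-1, 1, 5), repeat=4)))
pred = model.predict(poly.transform(grid))
print("minimum predicted warpage:", pred.min().round(3), "at", grid[pred.argmin()])
```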
Lonsdorf, Tina B; Menz, Mareike M; Andreatta, Marta; Fullana, Miguel A; Golkar, Armita; Haaker, Jan; Heitland, Ivo; Hermann, Andrea; Kuhn, Manuel; Kruse, Onno; Meir Drexler, Shira; Meulders, Ann; Nees, Frauke; Pittig, Andre; Richter, Jan; Römer, Sonja; Shiban, Youssef; Schmitz, Anja; Straube, Benjamin; Vervliet, Bram; Wendt, Julia; Baas, Johanna M P; Merz, Christian J
2017-06-01
The so-called 'replicability crisis' has sparked methodological discussions in many areas of science in general, and in psychology in particular. This has led to recent endeavours to promote the transparency, rigour, and ultimately, replicability of research. Originating from this zeitgeist, the challenge to discuss critical issues on terminology, design, methods, and analysis considerations in fear conditioning research is taken up by this work, which involved representatives from fourteen of the major human fear conditioning laboratories in Europe. This compendium is intended to provide a basis for the development of a common procedural and terminology framework for the field of human fear conditioning. Whenever possible, we give general recommendations. When this is not feasible, we provide evidence-based guidance for methodological decisions on study design, outcome measures, and analyses. Importantly, this work is also intended to raise awareness and initiate discussions on crucial questions with respect to data collection, processing, statistical analyses, the impact of subtle procedural changes, and data reporting specifically tailored to the research on fear conditioning. Copyright © 2017 The Authors. Published by Elsevier Ltd.. All rights reserved.
Statistical principle and methodology in the NISAN system.
Asano, C
1979-01-01
The NISAN system is a new interactive statistical analysis program package constructed by an organization of Japanese statisticians. The package is widely applicable to both confirmatory and exploratory statistical analysis, and is designed to capture statistical wisdom and to help senior statisticians choose an optimal process of statistical analysis. PMID:540594
Hierl, L.A.; Loftin, C.S.; Longcore, J.R.; McAuley, D.G.; Urban, D.L.
2007-01-01
We assessed changes in the vegetative structure of 49 impoundments at Moosehorn National Wildlife Refuge (MNWR), Maine, USA, between 1984-1985 and 2002 with a multivariate, adaptive approach that may be useful in a variety of wetland and other habitat management situations. We used Mahalanobis Distance (MD) analysis to classify the refuge's wetlands as poor or good waterbird habitat based on five variables: percent emergent vegetation, percent shrub, percent open water, relative richness of vegetative types, and an interspersion-juxtaposition index that measures adjacency of vegetation patches. Mahalanobis Distance is a multivariate statistic that examines whether a particular data point is an outlier or a member of a data cluster while accounting for correlations among inputs. For each wetland, we used MD analysis to quantify a distance from a reference condition defined a priori by habitat conditions measured in MNWR wetlands used by waterbirds. Twenty-five wetlands declined in quality between the two periods, whereas 23 wetlands improved. We identified specific wetland characteristics that may be modified to improve habitat conditions for waterbirds. The MD analysis seems ideal for instituting an adaptive wetland management approach because metrics can be easily added or removed, ranges of target habitat conditions can be defined by field-collected data, and the analysis can identify priorities for single or multiple management objectives.
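The MD calculation itself is compact. The sketch below, with hypothetical habitat values for the five metrics, computes each wetland's distance from the reference condition defined by waterbird-used wetlands.

```python
import numpy as np

# Columns: % emergent vegetation, % shrub, % open water, richness, interspersion
rng = np.random.default_rng(3)
reference = rng.normal([40, 20, 30, 5, 60], [8, 5, 8, 1, 10], size=(30, 5))

mean = reference.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(reference, rowvar=False))

def md(x):
    """Mahalanobis distance of one wetland from the reference condition."""
    d = x - mean
    return float(np.sqrt(d @ cov_inv @ d))

wetland_1985 = np.array([15, 45, 10, 3, 30.0])
wetland_2002 = np.array([35, 25, 28, 5, 55.0])
print("1984-85 MD:", md(wetland_1985), "-> 2002 MD:", md(wetland_2002))
# A decreasing distance indicates the wetland moved toward good waterbird habitat
```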
Diagnostic value of creatine kinase activity in canine cerebrospinal fluid.
Ferreira, Alexandra
2016-10-01
This study aimed to determine whether creatine kinase (CK) activity in cerebrospinal fluid (CSF) has diagnostic value for various groups of neurological conditions or for different anatomical areas of the nervous system (NS). The age, breed, results of CSF analysis, and diagnosis of 578 canine patients presenting with various neurological conditions between January 2009 and February 2015 were retrospectively collected. The cases were divided according to anatomical areas of the nervous system, i.e., brain, spinal cord, and peripheral nervous system, and into groups according to the nature of the condition diagnosed: vascular, immune/inflammatory/infectious, traumatic, toxic, anomalous, metabolic, idiopathic, neoplastic, and degenerative. Statistical analysis showed that CSF-CK alone cannot be used as a diagnostic tool and that total proteins in the CSF and red blood cells (RBCs) do not have a significant relationship with the CSF-CK activity. CSF-CK did not have a diagnostic value for different disease groups or anatomical areas of the nervous system.
Contrera-Moreno, Luciana; de Andrade, Sonia Maria Oliveira; Motta-Castro, Ana Rita Coimbra; Pinto, Alexandra Maria Almeida Carvalho; Salas, Frederico Reis Pouso; Stief, Alcione Cavalheiros Faro
2012-01-01
Firefighters are exposed to a wide range of risks, among them biological risk. The objective was to analyze the working conditions of firefighters in the city of Campo Grande, MS, Brazil, focusing on conditions of exposure to biological material. Three hundred and seven (307) firefighters were interviewed for data collection and observed for ergonomic job analysis (AET); 63.5% of the firefighters had suffered some kind of job-related accident involving blood or body fluids. A statistically significant association was found between having suffered accidents at work and incomplete use of personal protective equipment (PPE). Regarding the biological risks in the AET, 57.1% of all patients attended presented blood or secretions, which corresponds on average to 16.0% of the total work time, based on a 24-h working day. Besides biological risks, other stress factors were identified: emergency and complexity of decisions, high responsibility for patients and the environment, and conflicts. Health promotion and accident prevention actions must be emphasized as measures to minimize these risks.
Integrating single-cell transcriptomic data across different conditions, technologies, and species.
Butler, Andrew; Hoffman, Paul; Smibert, Peter; Papalexi, Efthymia; Satija, Rahul
2018-06-01
Computational single-cell RNA-seq (scRNA-seq) methods have been successfully applied to experiments representing a single condition, technology, or species to discover and define cellular phenotypes. However, identifying subpopulations of cells that are present across multiple data sets remains challenging. Here, we introduce an analytical strategy for integrating scRNA-seq data sets based on common sources of variation, enabling the identification of shared populations across data sets and downstream comparative analysis. We apply this approach, implemented in our R toolkit Seurat (http://satijalab.org/seurat/), to align scRNA-seq data sets of peripheral blood mononuclear cells under resting and stimulated conditions, hematopoietic progenitors sequenced using two profiling technologies, and pancreatic cell 'atlases' generated from human and mouse islets. In each case, we learn distinct or transitional cell states jointly across data sets, while boosting statistical power through integrated analysis. Our approach facilitates general comparisons of scRNA-seq data sets, potentially deepening our understanding of how distinct cell states respond to perturbation, disease, and evolution.
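Conceptually, the shared-space step can be imitated with plain canonical correlation analysis, treating genes as observations and cells as variables so that the canonical weight vectors act as per-cell embeddings. This is only a loose analogue of Seurat's alignment procedure (which adds further steps, such as warping the aligned canonical vectors), and all matrices here are random placeholders.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

# Two hypothetical log-normalized expression matrices sharing the same genes:
# rows are genes, columns are cells from each condition/data set
rng = np.random.default_rng(4)
n_genes = 500
X = rng.normal(size=(n_genes, 80))    # e.g., resting PBMCs
Y = rng.normal(size=(n_genes, 120))   # e.g., stimulated PBMCs

# Treating genes as observations and cells as variables, CCA finds
# per-cell weight vectors that capture correlated structure across sets
cca = CCA(n_components=10, max_iter=2000).fit(X, Y)
cells_X = cca.x_weights_   # (80, 10) embedding of data set 1 cells
cells_Y = cca.y_weights_   # (120, 10) embedding of data set 2 cells

# Cells close together in this shared space are candidates for the same
# subpopulation; downstream clustering would run on the stacked embedding
shared = np.vstack([cells_X, cells_Y])
print(shared.shape)   # (200, 10)
```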
NASA Technical Reports Server (NTRS)
Volino, Ralph J.; Simon, Terrence W.
1995-01-01
Measurements from transitional, heated boundary layers along a concave-curved test wall are presented and discussed. A boundary layer subject to low free-stream turbulence intensity (FSTI), which contains stationary streamwise (Gortler) vortices, is documented. The low FSTI measurements are followed by measurements in boundary layers subject to high (initially 8%) free-stream turbulence intensity and moderate to strong streamwise acceleration. Conditions were chosen to simulate those present on the downstream half of the pressure side of a gas turbine airfoil. Mean flow characteristics as well as turbulence statistics, including the turbulent shear stress, turbulent heat flux, and turbulent Prandtl number, are documented. A technique called "octant analysis" is introduced and applied to several cases from the literature as well as to data from the present study. Spectral analysis was applied to describe the effects of turbulence scales of different sizes during transition. To the authors' knowledge, this is the first detailed documentation of boundary layer transition under such high free-stream turbulence conditions.
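Octant analysis itself is straightforward to sketch: classify each sample by the signs of the fluctuations (u′, v′, t′) into one of eight octants and accumulate conditional statistics per octant. The example below uses synthetic series standing in for probe data and reports each octant's occupancy and its contribution to −u′v′.

```python
import numpy as np

def octant_analysis(u, v, t):
    """Sort instantaneous fluctuations into the 8 sign-octants of (u', v', t')
    and return each octant's occupancy fraction and mean -u'v' contribution."""
    up, vp, tp = u - u.mean(), v - v.mean(), t - t.mean()
    octant = (up > 0).astype(int) * 4 + (vp > 0) * 2 + (tp > 0) * 1
    out = {}
    for k in range(8):
        m = octant == k
        out[k] = {"fraction": m.mean(),
                  "uv_contribution": -(up[m] * vp[m]).mean() if m.any() else 0.0}
    return out

# Hypothetical time series from a single probe location
rng = np.random.default_rng(5)
u, v, t = rng.normal(size=(3, 10000))
for k, s in octant_analysis(u, v, t).items():
    print(f"octant {k:03b}: {s['fraction']:.3f} of samples, "
          f"-u'v' contribution {s['uv_contribution']:+.4f}")
```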
Hahn, A; Hock, B
1999-01-01
Spore color mutants of the fungus Sordaria macrospora Auersw. were crossed under spaceflight conditions on the space shuttle to MIR mission S/MM 05 (STS-81). The arrangement of spores of different colors in the asci allowed conclusions on the influence of spaceflight conditions on sexual recombination in fungi. Experiments on a 1-g centrifuge in space and in parallel on the ground were used for controls. The samples were analyzed microscopically on their return to earth. Each fruiting body was assessed separately. Statistical analysis of the data showed a significant increase in gene recombination frequencies caused by the heavy ion particle stream in space radiation. The lack of gravity did not influence crossing-over frequencies. Hyphae of the flown samples were assessed for DNA strand breaks. No increase in damage was found compared with the ground samples. It was shown that S. macrospora is able to repair radiation-induced DNA strand breaks within hours.
John, S D
2007-04-01
In this paper the coherence of the precautionary principle as a guide to public health policy is considered. Two conditions that any account of the principle must meet are outlined, a condition of practicality and a condition of publicity. The principle is interpreted in terms of a tripartite division of the outcomes of action (good outcomes, normal bad outcomes and special bad outcomes). Such a division of outcomes can be justified on either "consequentialist" or "deontological" grounds. In the second half of the paper, it is argued that the precautionary principle is not necessarily opposed to risk-cost-benefit analysis, but, rather, should be interpreted as suggesting a lowering of our epistemic standards for assessing evidence that there is a link between some policy and "special bad" outcomes. This suggestion is defended against the claim that it mistakes the nature of statistical testing and against the charge that it is unscientific or antiscientific, and therefore irrational.
Liu, Xin; Huang, He; Yang, Yan-fang; Wu, He-zhen
2014-12-01
To support the selection of artificial planting bases and high-quality industrial raw material in Chongqing province, ecological suitability ranks for the total alkaloid content of Coptis Rhizoma were delineated. Based on investigation of the PCB and DEM data of Chongqing province, the relationship between the total alkaloid content of Coptis Rhizoma and topographical conditions was examined by statistical analysis, and geographic information system (GIS)-based assessment together with landscape ecological principles was applied to map ecologically suitable areas for Coptis Rhizoma in Chongqing. Slope, aspect and altitude are the main topographical factors affecting total alkaloid content, which is higher at lower altitudes, on shady slopes and on steeper slopes. The total alkaloid content is higher in the southern areas of Chongqing province and lower in the northeast. The terrain conditions of the southern region of Chongqing are thus the most suitable for the accumulation of total alkaloids in Coptis Rhizoma.
Breath Analysis as a Potential and Non-Invasive Frontier in Disease Diagnosis: An Overview
Pereira, Jorge; Porto-Figueira, Priscilla; Cavaco, Carina; Taunk, Khushman; Rapole, Srikanth; Dhakne, Rahul; Nagarajaram, Hampapathalu; Câmara, José S.
2015-01-01
Currently, a small number of diseases, particularly cardiovascular (CVDs), oncologic (ODs), neurodegenerative (NDDs) and chronic respiratory diseases, as well as diabetes, form a severe burden to most countries worldwide. Hence, there is an urgent need for efficient diagnostic tools, particularly those enabling reliable detection of diseases at their early stages, preferably using non-invasive approaches. Breath analysis is a non-invasive approach relying only on the characterisation of the volatile composition of the exhaled breath (EB), which in turn reflects the volatile composition of the bloodstream and airways and therefore the status and condition of the whole organism's metabolism. Advanced sampling procedures (solid-phase and needle-trap microextraction) coupled with modern analytical technologies (proton transfer reaction mass spectrometry, selected ion flow tube mass spectrometry, ion mobility spectrometry, e-noses, etc.) allow the characterisation of EB composition to an unprecedented level. However, a key challenge in EB analysis is the proper statistical analysis and interpretation of the large and heterogeneous datasets obtained from EB research. There is no standard statistical framework/protocol yet available in the literature that can be used for EB data analysis towards the discovery of biomarkers for use in a typical clinical setup. Nevertheless, EB analysis has immense potential for the development of biomarkers for early disease diagnosis. PMID:25584743
Siddiqi, Ariba; Arjunan, Sridhar P; Kumar, Dinesh K
2016-08-01
Age-associated changes in the surface electromyogram (sEMG) of the Tibialis Anterior (TA) muscle can be attributed to neuromuscular alterations that precede strength loss. We used our sEMG model of the Tibialis Anterior to interpret these age-related changes and compared it with experimental sEMG. Eighteen young (20-30 years) and 18 older (60-85 years) adults performed isometric dorsiflexion at 6 different percentage levels of maximum voluntary contraction (MVC), and their sEMG from the TA muscle was recorded. Six different age-related changes in the neuromuscular system were simulated using the sEMG model at the same MVCs as the experiment. The maximal power of the spectrum and the Gaussianity and linearity test statistics were computed from the simulated and experimental sEMG. A correlation analysis at α=0.05 was performed between the simulated and experimental age-related changes in the sEMG features. The results show that loss of motor units was distinguished by the Gaussianity and linearity test statistics, while the maximal power of the PSD distinguished between the muscular factors. The simulated condition of a 40% loss of motor units with the number of fast fibers halved correlated best with the age-related change observed in the experimental sEMG higher-order statistical features. This simulated aging condition corresponds with the moderate motor unit remodelling and negligible strength loss reported in the literature for cohorts aged 60-70 years.
The effects of spatially displaced visual feedback on remote manipulator performance
NASA Technical Reports Server (NTRS)
Smith, Randy L.; Stuart, Mark A.
1989-01-01
The effects of spatially displaced visual feedback on the operation of a camera-viewed remote manipulation task are analyzed. A remote manipulation task is performed by operators exposed to five viewing conditions: direct view of the work site; normal camera view; reversed camera view; inverted/reversed camera view; and inverted camera view. The task completion times are statistically analyzed with a repeated-measures analysis of variance, and a Newman-Keuls pairwise comparison test is administered to the data. The reversed camera view ranks third out of the four camera viewing conditions, while the normal camera viewing condition is found to be significantly slower than the direct viewing condition. It is shown that generalizations from direct manipulation studies to remote manipulation applications are quite useful, but they should be made cautiously.
Estimating individual benefits of medical or behavioral treatments in severely ill patients.
Diaz, Francisco J
2017-01-01
There is a need for statistical methods appropriate for the analysis of clinical trials from a personalized-medicine viewpoint as opposed to the common statistical practice that simply examines average treatment effects. This article proposes an approach to quantifying, reporting and analyzing individual benefits of medical or behavioral treatments to severely ill patients with chronic conditions, using data from clinical trials. The approach is a new development of a published framework for measuring the severity of a chronic disease and the benefits treatments provide to individuals, which utilizes regression models with random coefficients. Here, a patient is considered to be severely ill if the patient's basal severity is close to one. This allows the derivation of a very flexible family of probability distributions of individual benefits that depend on treatment duration and the covariates included in the regression model. Our approach may enrich the statistical analysis of clinical trials of severely ill patients because it allows investigating the probability distribution of individual benefits in the patient population and the variables that influence it, and we can also measure the benefits achieved in specific patients including new patients. We illustrate our approach using data from a clinical trial of the anti-depressant imipramine.
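A sketch of the random-coefficients machinery the framework builds on, using statsmodels' mixed linear model on hypothetical repeated severity scores; the per-patient random slope then yields a crude individual "benefit". The benefit formula shown is an illustration, not the paper's exact definition, and the random-effect label "week" assumes patsy's usual naming.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical trial: repeated severity scores (basal severity near one)
rng = np.random.default_rng(6)
patients, times = 40, 6
df = pd.DataFrame({"patient": np.repeat(np.arange(patients), times),
                   "week": np.tile(np.arange(times), patients)})
true_slopes = rng.normal(-0.05, 0.02, patients)      # patient-specific decline
df["severity"] = (0.9 + true_slopes[df["patient"].to_numpy()] * df["week"]
                  + rng.normal(0, 0.02, len(df)))

# Random-coefficient regression: common mean trajectory, per-patient deviations
model = sm.MixedLM.from_formula("severity ~ week", groups="patient",
                                re_formula="~week", data=df)
fit = model.fit()
print(fit.fe_params)

# A crude individual "benefit": each patient's own predicted severity drop
re = fit.random_effects
benefit = {p: -(fit.fe_params["week"] + re[p]["week"]) * (times - 1) for p in re}
print("patient 0 benefit over the trial:", round(benefit[0], 3))
```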
Is there a relationship between periodontal disease and causes of death? A cross sectional study.
Natto, Zuhair S; Aladmawy, Majdi; Alasqah, Mohammed; Papas, Athena
2015-01-01
The aim of this study was to evaluate whether there is any correlation between periodontal disease and mortality-contributing factors, such as cardiovascular disease and diabetes mellitus, in the elderly population. A dental evaluation was performed by a single examiner at Tufts University dental clinics for 284 patients. Periodontal assessments were performed by probing with a manual UNC-15 periodontal probe to measure pocket depth and clinical attachment level (CAL) at 6 sites. Causes of death were abstracted from death certificates. Statistical analysis involved ANOVA, chi-square and multivariate logistic regression analysis. The demographics of the sample indicated that most were female (except for diabetes mellitus), white, married, had completed 13 years of education and were 83 years old on average. CAL (continuous or dichotomous) and marital status attained statistical significance (p<0.05) in contingency table analysis (chi-square test for independence). Individuals with increased CAL were 2.16 times more likely (OR=2.16, 95% CI=1.47-3.17) to die due to CVD, and this effect persisted even after controlling for age, marital status, gender, race and years of education (OR=2.03, 95% CI=1.35-3.03). CAL (continuous or dichotomous) was much higher among those who died due to diabetes mellitus or outside the state of Massachusetts. However, these results were not statistically significant. The same pattern was observed with pocket depth (continuous or dichotomous), but these results were not statistically significant either. CAL seems to be more sensitive to chronic diseases than pocket depth. Among those conditions, cardiovascular disease has the strongest effect.
NASA Astrophysics Data System (ADS)
Santiago-Lona, Cynthia V.; Hernández-Montes, María del Socorro; Mendoza-Santoyo, Fernando; Esquivel-Tejeda, Jesús
2018-02-01
The study and quantification of the tympanic membrane (TM) displacements add important information to advance the knowledge about the hearing process. A comparative statistical analysis between two commonly used demodulation methods employed to recover the optical phase in digital holographic interferometry, namely the fast Fourier transform and phase-shifting interferometry, is presented as applied to study thin tissues such as the TM. The resulting experimental TM surface displacement data are used to contrast both methods through the analysis of variance and F tests. Data are gathered when the TMs are excited with continuous sound stimuli at levels 86, 89 and 93 dB SPL for the frequencies of 800, 1300 and 2500 Hz under the same experimental conditions. The statistical analysis shows repeatability in z-direction displacements with a standard deviation of 0.086, 0.098 and 0.080 μm using the Fourier method, and 0.080, 0.104 and 0.055 μm with the phase-shifting method at a 95% confidence level for all frequencies. The precision and accuracy are evaluated by means of the coefficient of variation; the results with the Fourier method are 0.06143, 0.06125, 0.06154 and 0.06154, 0.06118, 0.06111 with phase-shifting. The relative error between both methods is 7.143, 6.250 and 30.769%. On comparing the measured displacements, the results indicate that there is no statistically significant difference between both methods for frequencies at 800 and 1300 Hz; however, errors and other statistics increase at 2500 Hz.
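The two comparisons reported here map onto standard tests; a sketch with hypothetical repeat measurements, assuming a one-way ANOVA on the means and a two-sided F test on the variance ratio between the two demodulation methods:

```python
import numpy as np
from scipy import stats

# Hypothetical z-displacement repeat measurements (micrometres) at one frequency
fourier = np.array([0.85, 0.78, 0.92, 0.81, 0.88, 0.79, 0.90, 0.83])
phase_shift = np.array([0.82, 0.80, 0.87, 0.84, 0.79, 0.86, 0.81, 0.85])

# One-way ANOVA: do the two demodulation methods give different mean displacement?
F, p = stats.f_oneway(fourier, phase_shift)
print(f"ANOVA: F = {F:.3f}, p = {p:.3f}")

# F test on the variance ratio: is one method less repeatable than the other?
ratio = fourier.var(ddof=1) / phase_shift.var(ddof=1)
dfn = dfd = len(fourier) - 1
p_var = 2 * min(stats.f.cdf(ratio, dfn, dfd), stats.f.sf(ratio, dfn, dfd))
print(f"variance ratio = {ratio:.3f}, two-sided p = {p_var:.3f}")
```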
AGR-1 Thermocouple Data Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jeff Einerson
2012-05-01
This report documents an effort to analyze measured and simulated data obtained in the Advanced Gas Reactor (AGR) fuel irradiation test program conducted in the INL's Advanced Test Reactor (ATR) to support the Next Generation Nuclear Plant (NGNP) R&D program. The work follows up on a previous study (Pham and Einerson, 2010), in which statistical analysis methods were applied for AGR-1 thermocouple data qualification. The present work exercises the idea that, while recognizing uncertainties inherent in physics and thermal simulations of the AGR-1 test, results of the numerical simulations can be used in combination with the statistical analysis methods to further improve qualification of measured data. Additionally, the combined analysis of measured and simulation data can generate insights about simulation model uncertainty that can be useful for model improvement. This report also describes an experimental control procedure to maintain fuel target temperature in the future AGR tests using regression relationships that include simulation results. The report is organized into four chapters. Chapter 1 introduces the AGR Fuel Development and Qualification program, AGR-1 test configuration and test procedure, overview of AGR-1 measured data, and overview of physics and thermal simulation, including modeling assumptions and uncertainties. A brief summary of statistical analysis methods developed in (Pham and Einerson 2010) for AGR-1 measured data qualification within NGNP Data Management and Analysis System (NDMAS) is also included for completeness. Chapters 2-3 describe and discuss cases, in which the combined use of experimental and simulation data is realized. A set of issues associated with measurement and modeling uncertainties resulted from the combined analysis are identified. This includes demonstration that such a combined analysis led to important insights for reducing uncertainty in presentation of AGR-1 measured data (Chapter 2) and interpretation of simulation results (Chapter 3). The statistics-based simulation-aided experimental control procedure described for the future AGR tests is developed and demonstrated in Chapter 4. The procedure for controlling the target fuel temperature (capsule peak or average) is based on regression functions of thermocouple readings and other relevant parameters and accounting for possible changes in both physical and thermal conditions and in instrument performance.
CRACK GROWTH ANALYSIS OF SOLID OXIDE FUEL CELL ELECTROLYTES
DOE Office of Scientific and Technical Information (OSTI.GOV)
S. Bandopadhyay; N. Nagabhushana
2003-10-01
Defects and flaws control the structural and functional properties of ceramics. In determining the reliability and lifetime of ceramic structures it is very important to quantify the crack growth behavior of the ceramics. In addition, because of the high variability of the strength and the relatively low toughness of ceramics, a statistical design approach is necessary. The statistical nature of the strength of ceramics is currently well recognized, and is usually accounted for by utilizing Weibull or similar statistical distributions. Design tools such as CARES using a combination of strength measurements, stress analysis, and statistics are available and reasonably well developed. These design codes also incorporate material data such as elastic constants as well as flaw distributions and time-dependent properties. The fast fracture reliability for ceramics is often different from their time-dependent reliability. Further confounding the design complexity, the time-dependent reliability varies with the environment/temperature/stress combination. Therefore, it becomes important to be able to accurately determine the behavior of ceramics under simulated application conditions to provide a better prediction of the lifetime and reliability for a given component. In the present study, yttria-stabilized zirconia (YSZ) of 9.6 mol% yttria composition was procured in the form of tubes of length 100 mm. The composition is of interest for tubular electrolytes for solid oxide fuel cells. Rings cut from the tubes were characterized for microstructure, phase stability, mechanical strength (Weibull modulus) and fracture mechanisms. The strength at the operating condition of SOFCs (1000 C) decreased to 95 MPa, compared with a room temperature strength of 230 MPa; however, the Weibull modulus remained relatively unchanged. The slow crack growth (SCG) parameter, n = 17, evaluated at room temperature in air was representative of well-studied brittle materials. Based on the results, further work was planned to evaluate the strength degradation, modulus and failure in a more representative SOFC environment.
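Weibull strength statistics of the kind reported here can be estimated as follows; the strength values below are made up for illustration, and the location parameter is fixed at zero, as is conventional for ceramic strength data.

```python
import numpy as np
from scipy import stats

# Hypothetical ring-fracture strengths (MPa) at room temperature and at 1000 C
rt = np.array([212, 225, 241, 230, 219, 247, 228, 236, 222, 233.0])
hot = np.array([88, 95, 101, 92, 97, 90, 99, 94, 96, 93.0])

for label, s in [("25 C", rt), ("1000 C", hot)]:
    # Two-parameter Weibull: location fixed at zero for strength data
    m, loc, sigma0 = stats.weibull_min.fit(s, floc=0)
    print(f"{label}: Weibull modulus m = {m:.1f}, "
          f"characteristic strength = {sigma0:.0f} MPa")
```

A roughly unchanged modulus with a lower characteristic strength, as the study reports, means the scatter of the strength distribution is preserved even though the whole distribution shifts down at temperature.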
Statistical Analysis of Solar PV Power Frequency Spectrum for Optimal Employment of Building Loads
DOE Office of Scientific and Technical Information (OSTI.GOV)
Olama, Mohammed M; Sharma, Isha; Kuruganti, Teja
In this paper, a statistical analysis of the frequency spectrum of solar photovoltaic (PV) power output is conducted. This analysis quantifies the frequency content that can be used for purposes such as developing optimal employment of building loads and distributed energy resources. One year of solar PV power output data was collected and analyzed using one-second resolution to find ideal bounds and levels for the different frequency components. The annual, seasonal, and monthly statistics of the PV frequency content are computed and illustrated in boxplot format. To examine the compatibility of building loads for PV consumption, a spectral analysis of building loads such as Heating, Ventilation and Air-Conditioning (HVAC) units and water heaters was performed. This defined the bandwidth over which these devices can operate. Results show that nearly all of the PV output (about 98%) is contained within frequencies lower than 1 mHz (equivalent to ~15 min), which is compatible for consumption with local building loads such as HVAC units and water heaters. Medium frequencies in the range of ~15 min to ~1 min are likely to be suitable for consumption by fan equipment of variable air volume HVAC systems that have time constants in the range of few seconds to few minutes. This study indicates that most of the PV generation can be consumed by building loads with the help of proper control strategies, thereby reducing impact on the grid and the size of storage systems.
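The headline figure, the share of PV variability below 1 mHz, comes from a cumulative spectral fraction, which can be sketched like this on synthetic data (one day of 1-second samples standing in for the year-long record):

```python
import numpy as np

# Hypothetical one day of 1-second PV power samples: slow diurnal + fast noise
rng = np.random.default_rng(7)
t = np.arange(86400)                       # seconds
clear_sky = np.clip(np.sin(np.pi * (t - 21600) / 43200), 0, None)
power = clear_sky + 0.05 * rng.normal(size=t.size)

# One-sided power spectrum; sampling interval d = 1 s
spec = np.abs(np.fft.rfft(power - power.mean())) ** 2
freqs = np.fft.rfftfreq(power.size, d=1.0)

# Fraction of fluctuation power at frequencies below 1 mHz (~15 min and slower)
frac = spec[(freqs > 0) & (freqs < 1e-3)].sum() / spec[freqs > 0].sum()
print(f"fraction of PV variability below 1 mHz: {frac:.1%}")
```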
Zhang, Yingyu; Shao, Wei; Zhang, Mengjia; Li, Hejun; Yin, Shijiu; Xu, Yingjun
2016-07-01
Mining has historically been considered a naturally high-risk industry worldwide. In China, deaths caused by coal mine accidents exceed those from all other accidents combined. Statistics on 320 coal mine accidents in Shandong province show that all accidents contain indicators of "unsafe conditions of the rules and regulations," with a frequency of 1590, accounting for 74.3% of the total frequency of 2140. "Unsafe behaviors of the operator" is another important contributory factor, mainly comprising "operator error" and "venturing into dangerous places." A systems analysis approach using structural equation modeling (SEM) was applied to examine the interactions between the contributory factors of coal mine accidents. The analysis leads to three conclusions. (i) "Unsafe conditions of the rules and regulations" affect the "unsafe behaviors of the operator," "unsafe conditions of the equipment," and "unsafe conditions of the environment." (ii) The three influencing factors of coal mine accidents (in descending order of the frequency of effect relations) are "lack of safety education and training," "rules and regulations of safety production responsibility," and "rules and regulations of supervision and inspection." (iii) The three influenced factors of coal mine accidents (in descending frequency order) are "venturing into dangerous places," "poor workplace environment," and "operator error." Copyright © 2016 Elsevier Ltd. All rights reserved.
Forecasting volatility with neural regression: a contribution to model adequacy.
Refenes, A N; Holt, W T
2001-01-01
Neural nets' usefulness for forecasting is limited by problems of overfitting and the lack of rigorous procedures for model identification, selection and adequacy testing. This paper describes a methodology for neural model misspecification testing. We introduce a generalization of the Durbin-Watson statistic for neural regression and discuss the general issues of misspecification testing using residual analysis. We derive a generalized influence matrix for neural estimators which enables us to evaluate the distribution of the statistic. We deploy Monte Carlo simulation to compare the power of the test for neural and linear regressors. While residual testing is not a sufficient condition for model adequacy, it is nevertheless a necessary condition to demonstrate that the model is a good approximation to the data generating process, particularly as neural-network estimation procedures are susceptible to partial convergence. The work is also an important step toward developing rigorous procedures for neural model identification, selection and adequacy testing which have started to appear in the literature. We demonstrate its applicability in the nontrivial problem of forecasting implied volatility innovations using high-frequency stock index options. Each step of the model building process is validated using statistical tests to verify variable significance and model adequacy with the results confirming the presence of nonlinear relationships in implied volatility innovations.
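The classical statistic being generalized is easy to state; below is a sketch on synthetic residuals (statsmodels ships an equivalent `durbin_watson` helper in `statsmodels.stats.stattools`). Note the paper's contribution, evaluating the statistic's distribution for neural regressors via a generalized influence matrix, is not reproduced here.

```python
import numpy as np

def durbin_watson(residuals):
    """Classical DW statistic: ~2 means no first-order autocorrelation;
    values toward 0 or 4 indicate positive or negative autocorrelation."""
    e = np.asarray(residuals)
    return np.sum(np.diff(e) ** 2) / np.sum(e ** 2)

# Hypothetical residuals from a fitted volatility model
rng = np.random.default_rng(8)
white = rng.normal(size=500)
ar = np.empty(500)
ar[0] = white[0]
for i in range(1, 500):
    ar[i] = 0.6 * ar[i - 1] + white[i]     # positively autocorrelated residuals

print("white-noise DW:", round(durbin_watson(white), 2))   # approximately 2
print("AR(1) residual DW:", round(durbin_watson(ar), 2))   # well below 2
```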
A Comparison of Atmospheric Quantities Determined from Advanced WVR and Weather Analysis Data
NASA Astrophysics Data System (ADS)
Morabito, D.; Wu, L.; Slobin, S.
2017-05-01
Lower frequency bands used for deep space communications (e.g., 2.3 GHz and 8.4 GHz) are oversubscribed. Thus, NASA has become interested in using higher frequency bands (e.g., 26 GHz and 32 GHz) for telemetry, making use of the available wider bandwidth. However, these bands are more susceptible to atmospheric degradation. Currently, flight projects tend to be conservative in preparing their communications links by using worst-case or conservative assumptions, which result in nonoptimum data return. We previously explored the use of weather forecasting over different weather condition scenarios to determine more optimal values of atmospheric attenuation and atmospheric noise temperature for use in telecommunications link design. In this article, we present the results of a comparison of meteorological parameters (columnar water vapor and liquid water content) estimated from multifrequency Advanced Water Vapor Radiometer (AWVR) data with those estimated from weather analysis tools (FNL). We find that for the Deep Space Network's Goldstone and Madrid tracking sites, the statistics are in reasonable agreement between the two methods. We can then use the statistics of these quantities based on FNL runs to estimate statistics of atmospheric signal degradation for tracking sites that do not have the benefit of possessing multiyear WVR data sets, such as those of the NASA Near-Earth Network (NEN). The resulting statistics of atmospheric attenuation and atmospheric noise temperature increase can then be used in link budget calculations.
Yuan, Zhongshang; Liu, Hong; Zhang, Xiaoshuai; Li, Fangyu; Zhao, Jinghua; Zhang, Furen; Xue, Fuzhong
2013-01-01
Currently, the genetic variants identified by genome-wide association studies (GWAS) generally account for only a small proportion of the total heritability of complex disease. One crucial reason is the underutilization of gene-gene joint effects commonly encountered in GWAS, which include main effects and co-association. However, gene-gene co-association is often vaguely subsumed into the framework of gene-gene interaction. From a causal graph perspective, we elucidate in detail the concept and rationale of gene-gene co-association as well as its relationship with traditional gene-gene interaction, and propose two simple statistics based on Fisher's r-to-z transformation to detect it. Three series of simulations further highlight that gene-gene co-association refers to the extent to which the joint effects of two genes differ from the main effects, owing not only to traditional interaction under the nearly independent condition but also to correlation between the two genes. The proposed statistics are more powerful than logistic regression under various situations, are unaffected by linkage disequilibrium, and maintain an acceptable false positive rate as long as a reasonable GWAS data analysis roadmap is strictly followed. Furthermore, an application to gene pathway analysis associated with leprosy confirms in practice that the proposed concept of gene-gene co-association and the corresponding statistics are strongly in line with reality. PMID:23923021
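One plausible form of an r-to-z co-association statistic (the paper proposes two; this sketch shows only the generic comparison-of-correlations construction, with made-up numbers):

```python
import numpy as np
from scipy import stats

def co_association_z(r_case, n_case, r_ctrl, n_ctrl):
    """Compare the gene-gene correlation between cases and controls using
    Fisher's r-to-z transformation; returns the z statistic and p-value."""
    z1, z2 = np.arctanh(r_case), np.arctanh(r_ctrl)
    se = np.sqrt(1.0 / (n_case - 3) + 1.0 / (n_ctrl - 3))
    z = (z1 - z2) / se
    return z, 2 * stats.norm.sf(abs(z))

# Hypothetical correlations of two gene scores in 1000 cases / 1000 controls
z, p = co_association_z(r_case=0.18, n_case=1000, r_ctrl=0.02, n_ctrl=1000)
print(f"z = {z:.2f}, p = {p:.2g}")
```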
Das, Suchismita; Choudhury, Shamim Sultana
2016-01-01
The aim of this study was to assess the regional impacts of heavy metals (Mn, Fe, Mg, Ca, Cu, Zn, Cd, Cr, Pb and Ni) on water, sediment and a native, teleost fish species, Labeo angra, inhabiting a flood plain wetland of Barak River in Assam, India. Heavy metal concentrations in the water, sediments and fish were measured; bioaccumulation factor, metal pollution index as well as condition indices were calculated, to assess the pollution load and health status of the fish. Multivariate statistical analysis was used on wetland water and sediment heavy metals to ascertain the possible sources and seasonal variations of the pollutants. Results showed that most heavy metals in the wetland water and sediments exceeded the water (drinking and irrigation) and sediment quality guidelines, respectively. Seasonal variations were observed for geogenic heavy metals, Mn, Fe, Mg and Ca while no seasonal variations were observed for anthropogenic heavy metals, Cu, Cd, Cr, Pb and Ni. Multivariate statistical analysis showed that there was strong correlation between geogenic and anthropogenic heavy metals in water and sediment, both originating from the common anthropogenic sources. Accumulation of most of the metals in all the tissues was above the safe limits as recommended by the Food and Agriculture Organization. High bioaccumulation factors and metal pollution index for these metals in the different tissues revealed that metals were extensively bio-accumulated and bioconcentrated. Condition indices in fish from the wetland suggested metabolic abnormalities.
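The two summary indices are simple ratios; a sketch with hypothetical concentrations, assuming the common geometric-mean definition of the metal pollution index:

```python
import numpy as np

# Hypothetical mean concentrations (mg/kg for tissue, mg/L for water)
water = {"Cu": 0.08, "Cd": 0.01, "Pb": 0.05, "Ni": 0.06}
muscle = {"Cu": 4.2, "Cd": 0.4, "Pb": 2.1, "Ni": 2.8}

# Bioaccumulation factor: tissue concentration relative to water
baf = {m: muscle[m] / water[m] for m in water}

# Metal pollution index as commonly defined: geometric mean across metals
mpi = np.prod(list(muscle.values())) ** (1.0 / len(muscle))

print("BAF:", {m: round(v) for m, v in baf.items()})
print(f"MPI (muscle): {mpi:.2f}")
```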
Genomic analysis of regulatory network dynamics reveals large topological changes
NASA Astrophysics Data System (ADS)
Luscombe, Nicholas M.; Madan Babu, M.; Yu, Haiyuan; Snyder, Michael; Teichmann, Sarah A.; Gerstein, Mark
2004-09-01
Network analysis has been applied widely, providing a unifying language to describe disparate systems ranging from social interactions to power grids. It has recently been used in molecular biology, but so far the resulting networks have only been analysed statically. Here we present the dynamics of a biological network on a genomic scale, by integrating transcriptional regulatory information and gene-expression data for multiple conditions in Saccharomyces cerevisiae. We develop an approach for the statistical analysis of network dynamics, called SANDY, combining well-known global topological measures, local motifs and newly derived statistics. We uncover large changes in underlying network architecture that are unexpected given current viewpoints and random simulations. In response to diverse stimuli, transcription factors alter their interactions to varying degrees, thereby rewiring the network. A few transcription factors serve as permanent hubs, but most act transiently only during certain conditions. By studying sub-network structures, we show that environmental responses facilitate fast signal propagation (for example, with short regulatory cascades), whereas the cell cycle and sporulation direct temporal progression through multiple stages (for example, with highly inter-connected transcription factors). Indeed, to drive the latter processes forward, phase-specific transcription factors inter-regulate serially, and ubiquitously active transcription factors layer above them in a two-tiered hierarchy. We anticipate that many of the concepts presented here-particularly the large-scale topological changes and hub transience-will apply to other biological networks, including complex sub-systems in higher eukaryotes.
Simkó, Myrtill; Remondini, Daniel; Zeni, Olga; Scarfi, Maria Rosaria
2016-01-01
Possible hazardous effects of radiofrequency electromagnetic fields (RF-EMF) at low exposure levels are controversially discussed due to inconsistent study findings. Therefore, the main focus of the present study is to detect if any statistical association exists between RF-EMF and cellular responses, considering cell proliferation and apoptosis endpoints separately and with both combined as a group of “cellular life” to increase the statistical power of the analysis. We searched for publications regarding RF-EMF in vitro studies in the PubMed database for the period 1995–2014 and extracted the data to the relevant parameters, such as cell culture type, frequency, exposure duration, SAR, and five exposure-related quality criteria. These parameters were used for an association study with the experimental outcome in terms of the defined endpoints. We identified 104 published articles, from which 483 different experiments were extracted and analyzed. Cellular responses after exposure to RF-EMF were significantly associated to cell lines rather than to primary cells. No other experimental parameter was significantly associated with cellular responses. A highly significant negative association with exposure condition-quality and cellular responses was detected, showing that the more the quality criteria requirements were satisfied, the smaller the number of detected cellular responses. According to our knowledge, this is the first systematic analysis of specific RF-EMF bio-effects in association to exposure quality, highlighting the need for more stringent quality procedures for the exposure conditions. PMID:27420084
[EEG-correlates of pilots' functional condition in simulated flight dynamics].
Kiroy, V N; Aslanyan, E V; Bakhtin, O M; Minyaeva, N R; Lazurenko, D M
2015-01-01
The spectral characteristics of the EEG recorded from two professional pilots in a TU-154 aircraft simulator during flight dynamics, including takeoff, landing and horizontal flight (in particular under difficult conditions), were analyzed. The EEG was recorded continuously from 15 electrodes with a frequency band of 0.1-70 Hz. The recordings were evaluated using analysis of variance and discriminant analysis, and the statistical significance of the identified differences and of the main factors and their interactions was evaluated using Greenhouse-Geisser corrections. It was shown that the spectral characteristics of the EEG are highly informative features of the pilots' state, reflecting the different flight phases. The high reliability of the differences, including individual characteristics, indicates their non-random nature and the possibility of constructing a system for monitoring pilots' state during all phases of flight based on EEG features.
Small, J R
1993-01-01
This paper is a study into the effects of experimental error on the estimated values of flux control coefficients obtained using specific inhibitors. Two possible techniques for analysing the experimental data are compared: a simple extrapolation method (the so-called graph method) and a non-linear function fitting method. For these techniques, the sources of systematic errors are identified and the effects of systematic and random errors are quantified, using both statistical analysis and numerical computation. It is shown that the graph method is very sensitive to random errors and, under all conditions studied, that the fitting method, even under conditions where the assumptions underlying the fitted function do not hold, outperformed the graph method. Possible ways of designing experiments to minimize the effects of experimental errors are analysed and discussed. PMID:8257434
ASCS online fault detection and isolation based on an improved MPCA
NASA Astrophysics Data System (ADS)
Peng, Jianxin; Liu, Haiou; Hu, Yuhui; Xi, Junqiang; Chen, Huiyan
2014-09-01
Multi-way principal component analysis (MPCA) has received considerable attention and been widely used in process monitoring. A traditional MPCA algorithm unfolds multiple batches of historical data into a two-dimensional matrix and cuts the matrix along the time axis to form subspaces. However, low efficiency of the subspaces and difficult fault isolation are common disadvantages of the principal component model. This paper presents a new subspace construction method based on a kernel density estimation function that can effectively reduce the storage required for the subspace information. The MPCA model and the knowledge base are built on the new subspace. Fault detection and isolation with the squared prediction error (SPE) statistic and the Hotelling T2 statistic are then realized in process monitoring. When a fault occurs, fault isolation based on the SPE statistic is achieved by residual contribution analysis of the different variables. For fault isolation of a subspace based on the T2 statistic, the relationship between the statistical indicator and the state variables is constructed, and constraint conditions are presented to check the validity of fault isolation. To improve the robustness of fault isolation to unexpected disturbances, a statistical method is adopted to relate single subspaces to multiple subspaces and increase the rate of correct fault isolation. Finally, fault detection and isolation based on the improved MPCA is used to monitor the automatic shift control system (ASCS) to demonstrate the correctness and effectiveness of the algorithm. The research proposes a new subspace construction method that reduces the required storage capacity and improves the robustness of the principal component model, and it relates the state variables to the fault detection indicators for fault isolation.
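The SPE and T² monitoring statistics at the heart of this scheme, with residual contribution analysis for isolation, can be sketched on a plain (non-multiway, non-kernel) PCA model; the training data and the injected fault are synthetic:

```python
import numpy as np

def fit_pca(X, n_pc):
    """PCA on autoscaled normal-operation data via SVD."""
    mu, sd = X.mean(0), X.std(0)
    Z = (X - mu) / sd
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    P = Vt[:n_pc].T                       # loadings
    lam = (s[:n_pc] ** 2) / (len(X) - 1)  # score variances
    return mu, sd, P, lam

def monitor(x, mu, sd, P, lam):
    """SPE and T^2 for one new sample, plus per-variable SPE contributions."""
    z = (x - mu) / sd
    t = P.T @ z                           # scores in the PC subspace
    residual = z - P @ t                  # part not explained by the model
    spe = float(residual @ residual)
    T2 = float(np.sum(t ** 2 / lam))
    contrib = residual ** 2               # variables driving the fault
    return spe, T2, contrib

rng = np.random.default_rng(9)
X_normal = rng.normal(size=(500, 6))      # hypothetical ASCS training data
mu, sd, P, lam = fit_pca(X_normal, n_pc=3)
x_fault = rng.normal(size=6)
x_fault[2] += 4.0                         # inject a fault in variable 2
spe, T2, contrib = monitor(x_fault, mu, sd, P, lam)
print(f"SPE = {spe:.1f}, T2 = {T2:.1f}, top contributor = var {contrib.argmax()}")
```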
Statistical properties of several models of fractional random point processes
NASA Astrophysics Data System (ADS)
Bendjaballah, C.
2011-08-01
Statistical properties of several models of fractional random point processes have been analyzed from the counting and time interval statistics points of view. Based on the criterion of the reduced variance, it is seen that such processes exhibit nonclassical properties. The conditions for these processes to be treated as conditional Poisson processes are examined. Numerical simulations illustrate part of the theoretical calculations.
Direct evidence for a dual process model of deductive inference.
Markovits, Henry; Brunet, Marie-Laurence; Thompson, Valerie; Brisson, Janie
2013-07-01
In 2 experiments, we tested a strong version of a dual process theory of conditional inference (cf. Verschueren et al., 2005a, 2005b) that assumes that most reasoners have 2 strategies available, the choice of which is determined by situational variables, cognitive capacity, and metacognitive control. The statistical strategy evaluates inferences probabilistically, accepting those with high conditional probability. The counterexample strategy rejects inferences when a counterexample shows the inference to be invalid. To discriminate strategy use, we presented reasoners with conditional statements (if p, then q) and explicit statistical information about the relative frequency of the probability of p/q (50% vs. 90%). A statistical strategy would accept the more probable inferences more frequently, whereas the counterexample one would reject both. In Experiment 1, reasoners under time pressure used the statistical strategy more, but switched to the counterexample strategy when time constraints were removed; the former took less time than the latter. These data are consistent with the hypothesis that the statistical strategy is the default heuristic. Under a free-time condition, reasoners preferred the counterexample strategy and kept it when put under time pressure. Thus, it is not simply a lack of capacity that produces a statistical strategy; instead, it seems that time pressure disrupts the ability to make good metacognitive choices. In line with this conclusion, in a 2nd experiment, we measured reasoners' confidence in their performance; those under time pressure were less confident in the statistical than the counterexample strategy and more likely to switch strategies under free-time conditions. PsycINFO Database Record (c) 2013 APA, all rights reserved.
Bossier, Han; Seurinck, Ruth; Kühn, Simone; Banaschewski, Tobias; Barker, Gareth J.; Bokde, Arun L. W.; Martinot, Jean-Luc; Lemaitre, Herve; Paus, Tomáš; Millenet, Sabina; Moerkerke, Beatrijs
2018-01-01
Given the increasing amount of neuroimaging studies, there is a growing need to summarize published results. Coordinate-based meta-analyses use the locations of statistically significant local maxima with possibly the associated effect sizes to aggregate studies. In this paper, we investigate the influence of key characteristics of a coordinate-based meta-analysis on (1) the balance between false and true positives and (2) the activation reliability of the outcome from a coordinate-based meta-analysis. More particularly, we consider the influence of the chosen group level model at the study level [fixed effects, ordinary least squares (OLS), or mixed effects models], the type of coordinate-based meta-analysis [Activation Likelihood Estimation (ALE) that only uses peak locations, fixed effects, and random effects meta-analysis that take into account both peak location and height] and the amount of studies included in the analysis (from 10 to 35). To do this, we apply a resampling scheme on a large dataset (N = 1,400) to create a test condition and compare this with an independent evaluation condition. The test condition corresponds to subsampling participants into studies and combine these using meta-analyses. The evaluation condition corresponds to a high-powered group analysis. We observe the best performance when using mixed effects models in individual studies combined with a random effects meta-analysis. Moreover the performance increases with the number of studies included in the meta-analysis. When peak height is not taken into consideration, we show that the popular ALE procedure is a good alternative in terms of the balance between type I and II errors. However, it requires more studies compared to other procedures in terms of activation reliability. Finally, we discuss the differences, interpretations, and limitations of our results. PMID:29403344
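The random-effects pooling used in such meta-analyses is typically the DerSimonian-Laird estimator; a sketch with simulated study-level effects (peak-height effect sizes would play this role in the coordinate-based setting):

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate of study-level effect sizes."""
    y, v = np.asarray(effects), np.asarray(variances)
    w = 1.0 / v
    y_fixed = np.sum(w * y) / np.sum(w)
    Q = np.sum(w * (y - y_fixed) ** 2)            # heterogeneity statistic
    df = len(y) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (Q - df) / c)                 # between-study variance
    w_star = 1.0 / (v + tau2)
    pooled = np.sum(w_star * y) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return pooled, se, tau2

# Hypothetical peak effect sizes and variances from 12 studies at one location
rng = np.random.default_rng(10)
effects = rng.normal(0.4, 0.15, 12)
variances = rng.uniform(0.01, 0.05, 12)
print(dersimonian_laird(effects, variances))
```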
Laryngospasm during emergency department ketamine sedation: a case-control study.
Green, Steven M; Roback, Mark G; Krauss, Baruch
2010-11-01
The objective of this study was to assess predictors of emergency department (ED) ketamine-associated laryngospasm using case-control techniques. We performed a matched case-control analysis of a sample of 8282 ED ketamine sedations (including 22 occurrences of laryngospasm) assembled from 32 prior published series. We sequentially studied the association of each of 7 clinical variables with laryngospasm by assigning 4 controls to each case while matching for the remaining 6 variables. We then used univariate statistics and conditional logistic regression to analyze the matched sets. We found no statistical association of age, dose, oropharyngeal procedure, underlying physical illness, route, or coadministered anticholinergics with laryngospasm. Coadministered benzodiazepines showed a borderline association in the multivariate but not univariate analysis that was considered anomalous. This case-control analysis of the largest available sample of ED ketamine-associated laryngospasm did not demonstrate evidence of association with age, dose, or other clinical factors. Such laryngospasm seems to be idiosyncratic, and accordingly, clinicians administering ketamine must be prepared for its rapid identification and management. Given no evidence that they decrease the risk of laryngospasm, coadministered anticholinergics seem unnecessary.
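For readers who want to reproduce this kind of analysis, the following is a minimal sketch of a matched case-control fit with conditional logistic regression in Python, using statsmodels. The data frame, variable names, and the toy 1:4 matched sets are invented for illustration; this is not the study's dataset or code.

    import pandas as pd
    from statsmodels.discrete.conditional_models import ConditionalLogit

    # Each row is one sedation; 'set_id' groups a case with its matched controls.
    df = pd.DataFrame({
        "set_id":       [1, 1, 1, 1, 1,  2, 2, 2, 2, 2],
        "laryngospasm": [1, 0, 0, 0, 0,  1, 0, 0, 0, 0],  # 1 = case, 0 = control
        "benzo":        [1, 0, 1, 0, 0,  0, 0, 1, 0, 0],  # coadministered benzodiazepine
        "dose_mg_kg":   [1.5, 1.0, 2.0, 1.2, 1.1, 2.5, 1.8, 1.4, 1.6, 2.0],
    })

    # The conditional likelihood eliminates the per-set intercepts, so only
    # within-set contrasts inform the odds-ratio estimates.
    model = ConditionalLogit(df["laryngospasm"], df[["benzo", "dose_mg_kg"]],
                             groups=df["set_id"])
    print(model.fit().summary())

With two matched sets this is purely illustrative; the actual analysis pooled 22 cases drawn from 8282 sedations.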
Rubio-Aparicio, María; Sánchez-Meca, Julio; López-López, José Antonio; Botella, Juan; Marín-Martínez, Fulgencio
2017-11-01
Subgroup analyses allow us to examine the influence of a categorical moderator on the effect size in meta-analysis. We conducted a simulation study using a dichotomous moderator, and compared the impact of pooled versus separate estimates of the residual between-studies variance on the statistical performance of the Q_B(P) and Q_B(S) tests for subgroup analyses assuming a mixed-effects model. Our results suggested that similar performance can be expected as long as there are at least 20 studies and these are approximately balanced across categories. Conversely, when subgroups were unbalanced, the practical consequences of having heterogeneous residual between-studies variances were more evident, with both tests leading to the wrong statistical conclusion more often than in the conditions with balanced subgroups. A pooled estimate should be preferred for most scenarios, unless the residual between-studies variances are clearly different and there are enough studies in each category to obtain precise separate estimates. © 2017 The British Psychological Society.
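The pooled-versus-separate distinction can be made concrete with a short numerical sketch. The code below, using invented effect sizes and variances for two subgroups of three studies each, computes DerSimonian-Laird residual variance estimates both ways and the resulting between-subgroups statistic; it illustrates the general method, not the simulation code used in the study.

    import numpy as np
    from scipy import stats

    def q_and_c(y, v):
        """Cochran's Q and the DerSimonian-Laird scaling constant C."""
        w = 1.0 / v
        mean = np.sum(w * y) / np.sum(w)
        q = np.sum(w * (y - mean) ** 2)
        c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
        return q, c

    def subgroup_mean(y, v, tau2):
        """Random-effects subgroup mean and its variance."""
        w = 1.0 / (v + tau2)
        return np.sum(w * y) / np.sum(w), 1.0 / np.sum(w)

    y1, v1 = np.array([0.30, 0.45, 0.25]), np.array([0.02, 0.03, 0.04])
    y2, v2 = np.array([0.10, 0.05, 0.20]), np.array([0.03, 0.02, 0.05])

    q1, c1 = q_and_c(y1, v1)
    q2, c2 = q_and_c(y2, v2)
    # Separate estimates: one residual tau^2 per subgroup.
    tau2_s1 = max(0.0, (q1 - (len(y1) - 1)) / c1)
    tau2_s2 = max(0.0, (q2 - (len(y2) - 1)) / c2)
    # Pooled estimate: residual heterogeneity aggregated over both subgroups.
    tau2_p = max(0.0, (q1 + q2 - (len(y1) - 1) - (len(y2) - 1)) / (c1 + c2))

    for label, t1, t2 in [("Q_B(P)", tau2_p, tau2_p), ("Q_B(S)", tau2_s1, tau2_s2)]:
        m1, var1 = subgroup_mean(y1, v1, t1)
        m2, var2 = subgroup_mean(y2, v2, t2)
        qb = (m1 - m2) ** 2 / (var1 + var2)  # 1 df with a dichotomous moderator
        print(label, round(qb, 3), "p =", round(stats.chi2.sf(qb, df=1), 3))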
Topographic ERP analyses: a step-by-step tutorial review.
Murray, Micah M; Brunet, Denis; Michel, Christoph M
2008-06-01
In this tutorial review, we detail both the rationale for as well as the implementation of a set of analyses of surface-recorded event-related potentials (ERPs) that uses the reference-free spatial (i.e. topographic) information available from high-density electrode montages to render statistical information concerning modulations in response strength, latency, and topography both between and within experimental conditions. In these and other ways these topographic analysis methods allow the experimenter to glean additional information and neurophysiologic interpretability beyond what is available from canonical waveform analyses. In this tutorial we present the example of somatosensory evoked potentials (SEPs) in response to stimulation of each hand to illustrate these points. For each step of these analyses, we provide the reader with both a conceptual and mathematical description of how the analysis is carried out, what it yields, and how to interpret its statistical outcome. We show that these topographic analysis methods are intuitive and easy-to-use approaches that can remove much of the guesswork often confronting ERP researchers and also assist in identifying the information contained within high-density ERP datasets.
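Two of the reference-free quantities at the heart of such analyses, global field power (GFP) and global map dissimilarity (DISS), are simple to compute. The sketch below follows the definitions commonly used in this literature (after Lehmann and Skrandies); the array shapes and random data are placeholders, not SEP recordings.

    import numpy as np

    def average_reference(erp):
        """Re-reference by subtracting the instantaneous mean across electrodes."""
        return erp - erp.mean(axis=0, keepdims=True)

    def gfp(erp):
        """GFP(t): spatial standard deviation across electrodes at each time point."""
        return average_reference(erp).std(axis=0)

    def dissimilarity(erp_a, erp_b):
        """DISS(t): GFP of the difference between GFP-normalized maps (range 0..2)."""
        a, b = average_reference(erp_a), average_reference(erp_b)
        return gfp(a / gfp(a) - b / gfp(b))

    rng = np.random.default_rng(0)
    left = rng.standard_normal((64, 500))   # 64 electrodes x 500 time points
    right = rng.standard_normal((64, 500))
    print(gfp(left).shape, dissimilarity(left, right).max())

Because DISS compares strength-normalized maps, it isolates topographic change from mere amplitude change, which is the core idea behind these analyses.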
NASA Technical Reports Server (NTRS)
Bremner, P. G.; Blelloch, P. A.; Hutchings, A.; Shah, P.; Streett, C. L.; Larsen, C. E.
2011-01-01
This paper describes the measurement and analysis of surface fluctuating pressure level (FPL) data and vibration data from a plume impingement aero-acoustic and vibration (PIAAV) test to validate NASA's physics-based modeling methods for prediction of panel vibration in the near field of a hot supersonic rocket plume. For this test - reported more fully in a companion paper by Osterholt & Knox at the 26th Aerospace Testing Seminar, 2011 - the flexible panel was located 2.4 nozzle diameters from the plume centerline and 4.3 nozzle diameters downstream from the nozzle exit. The FPL loading is analyzed in terms of its auto spectrum, its cross spectrum, its spatial correlation parameters and its statistical properties. The panel vibration data is used to estimate the in-situ damping under plume FPL loading conditions and to validate both finite element analysis (FEA) and statistical energy analysis (SEA) methods for prediction of panel response. An assessment is also made of the effects of non-linearity in the panel elasticity.
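As a rough illustration of the spectral quantities involved, the sketch below estimates an auto spectrum, a cross spectrum, and the coherence between two synthetic pressure signals with Welch's method in SciPy. The sampling rate, delay, and noise levels are invented and bear no relation to the PIAAV test data.

    import numpy as np
    from scipy import signal

    rng = np.random.default_rng(1)
    fs = 50_000.0                                   # Hz, hypothetical sampling rate
    n = 50_000
    common = rng.standard_normal(n)                 # shared convecting component
    p1 = common + 0.5 * rng.standard_normal(n)      # sensor 1
    p2 = np.roll(common, 25) + 0.5 * rng.standard_normal(n)  # sensor 2, delayed

    f, Pxx = signal.welch(p1, fs=fs, nperseg=4096)    # auto spectrum (FPL)
    f, Pxy = signal.csd(p1, p2, fs=fs, nperseg=4096)  # cross spectrum
    f, Cxy = signal.coherence(p1, p2, fs=fs, nperseg=4096)

    # The cross-spectral phase slope encodes the inter-sensor time delay (hence
    # convection speed), and the coherence reflects the spatial correlation of
    # the loading, the kind of parameters extracted from the FPL data above.
    print(Cxy.mean(), np.angle(Pxy)[:5])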
NASA Astrophysics Data System (ADS)
Hu, Chongqing; Li, Aihua; Zhao, Xingyang
2011-02-01
This paper proposes a multivariate statistical analysis approach to processing the instantaneous engine speed signal for the purpose of locating multiple misfire events in internal combustion engines. The state of each cylinder is described with a characteristic vector extracted from the instantaneous engine speed signal following a three-step procedure. These characteristic vectors are treated as the values of process parameters over an engine cycle. Determination of the occurrence of misfire events and identification of misfiring cylinders can therefore be accomplished by a principal component analysis (PCA) based pattern recognition methodology. The proposed algorithm can be implemented easily in practice because the threshold can be defined adaptively without information about the operating conditions. In addition, the effect of torsional vibration on the engine speed waveform is interpreted as the presence of a "super powerful" cylinder, which the algorithm also isolates. The misfiring cylinder and the super powerful cylinder are often adjacent in the firing sequence, so missed detections and false alarms can be avoided effectively by checking the relationship between the cylinders.
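The pattern-recognition step can be sketched briefly. The code below projects synthetic per-cylinder characteristic vectors onto principal components and flags outlying cylinders with a threshold derived from robust statistics of the scores themselves, so no operating-condition information is required. The feature matrix is invented; the paper's three-step feature extraction is not reproduced here.

    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(2)
    features = 0.1 * rng.standard_normal((6, 8))   # 6 cylinders x 8 features
    features[2] += 2.0                             # cylinder 3: misfire-like deviation

    scores = PCA(n_components=2).fit_transform(features)

    # Adaptive threshold from the scores themselves (median + 3 * MAD), so no
    # prior knowledge of the operating condition is required.
    d = np.linalg.norm(scores - np.median(scores, axis=0), axis=1)
    mad = np.median(np.abs(d - np.median(d)))
    print("suspect cylinders:", np.where(d > np.median(d) + 3.0 * mad)[0] + 1)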
Length bias correction in gene ontology enrichment analysis using logistic regression.
Mi, Gu; Di, Yanming; Emerson, Sarah; Cumbie, Jason S; Chang, Jeff H
2012-01-01
When assessing differential gene expression from RNA sequencing data, commonly used statistical tests tend to have greater power to detect differential expression of genes encoding longer transcripts. This phenomenon, called "length bias", will influence subsequent analyses such as Gene Ontology enrichment analysis. In the presence of length bias, Gene Ontology categories that include longer genes are more likely to be identified as enriched. These categories, however, are not necessarily biologically more relevant. We show that one can effectively adjust for length bias in Gene Ontology analysis by including transcript length as a covariate in a logistic regression model. The logistic regression model makes the statistical issue underlying length bias more transparent: transcript length becomes a confounding factor when it correlates with both the Gene Ontology membership and the significance of the differential expression test. The inclusion of the transcript length as a covariate allows one to investigate the direct correlation between the Gene Ontology membership and the significance of testing differential expression, conditional on the transcript length. We present both real and simulated data examples to show that the logistic regression approach is simple, effective, and flexible.
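The confounding argument is easy to demonstrate numerically. In the simulation below, both the differential-expression (DE) call and membership in a hypothetical GO category depend on transcript length but not on each other; including log length as a covariate in the logistic regression drives the DE coefficient toward zero, while omitting it would manufacture a spurious enrichment. All data are simulated; this is not the authors' code.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(3)
    n = 5000
    log_len = rng.normal(7.5, 1.0, n)                 # log transcript length
    p_len = 1.0 / (1.0 + np.exp(-(log_len - 7.5)))    # increases with length
    de = (rng.random(n) < p_len).astype(float)        # DE call: length-biased
    member = (rng.random(n) < p_len).astype(float)    # GO membership: tracks length only

    X = sm.add_constant(np.column_stack([de, log_len]))
    fit = sm.Logit(member, X).fit(disp=False)
    # The DE coefficient (second entry) should be near zero once length is in
    # the model; dropping log_len would yield a spurious "enrichment" signal.
    print(fit.params)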
Tomesko, Jennifer; Touger-Decker, Riva; Dreker, Margaret; Zelig, Rena; Parrott, James Scott
2017-01-01
To explore knowledge and skill acquisition outcomes related to learning physical examination (PE) through computer-assisted instruction (CAI) compared with a face-to-face (F2F) approach. A systematic review and meta-analysis of studies published between January 2001 and December 2016 was conducted. Databases searched included Medline, Cochrane, CINAHL, ERIC, Ebsco, Scopus, and Web of Science. Studies were synthesized by study design, intervention, and outcomes. Statistical analyses used the DerSimonian-Laird random-effects model. In total, 7 studies were included in the review, and 5 in the meta-analysis. There were no statistically significant differences for knowledge (mean difference [MD] = 5.39, 95% confidence interval [CI]: -2.05 to 12.84) or skill acquisition (MD = 0.35, 95% CI: -5.30 to 6.01). The evidence does not suggest a strong consistent preference for either CAI or F2F instruction to teach students/trainees PE. Further research is needed to identify conditions under which knowledge and skill acquisition outcomes favor one mode of instruction over the other.
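For reference, the DerSimonian-Laird random-effects pooling of mean differences named above works as in the following sketch; the five study-level mean differences and variances are invented, not the review's data.

    import numpy as np

    md = np.array([4.0, 8.5, -1.0, 10.0, 3.5])   # per-study mean differences
    v = np.array([9.0, 16.0, 12.0, 25.0, 10.0])  # per-study sampling variances

    w = 1.0 / v
    grand = np.sum(w * md) / np.sum(w)
    q = np.sum(w * (md - grand) ** 2)            # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(md) - 1)) / c)     # DL between-studies variance

    w_star = 1.0 / (v + tau2)                    # random-effects weights
    pooled = np.sum(w_star * md) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    print(f"MD = {pooled:.2f}, 95% CI {pooled - 1.96*se:.2f} to {pooled + 1.96*se:.2f}")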
Working Conditions, Socioeconomic Factors and Low Birth Weight: Path Analysis
Mahmoodi, Zohreh; Karimlou, Masoud; Sajjadi, Homeira; Dejman, Masoumeh; Vameghi, Meroe; Dolatian, Mahrokh
2013-01-01
Background: In recent years, with socioeconomic changes in society, the presence of women in the workplace is inevitable. Differences in working conditions, especially for pregnant women, have adverse consequences such as low birth weight. Objectives: This study was conducted with the aim of modeling the relationship between working conditions, socioeconomic factors, and birth weight. Patients and Methods: This study used a case-control design. The control group consisted of 500 women with normal-weight babies, and the case group of 250 women with low-weight babies from selected hospitals in Tehran. Data were collected using a researcher-made questionnaire on mothers' lifestyle during pregnancy, developed with a social-determinants-of-health approach. This questionnaire investigated women's occupational lifestyle in terms of working conditions, activities, and job satisfaction. Data were analyzed with SPSS-16 and Lisrel-8.8 software using statistical path analysis. Results: The final path model fitted well (CFI = 1, RMSEA = 0.00) and showed that, among direct paths, working condition (β = -0.032), among indirect paths, household income (β = -0.42), and in the overall effect, an unemployed spouse (β = -0.1828) had the greatest effects on low birth weight. Negative coefficients indicate a decreasing effect on birth weight. Conclusions: Based on the path analysis model, working conditions and socioeconomic status directly and indirectly influence birth weight. Thus, besides attention to treatment and health care (the biological aspect), special attention must also be paid to mothers' socioeconomic circumstances. PMID:24616796
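The decomposition behind such path models is plain coefficient arithmetic: a variable's total effect is its direct path coefficient plus the product of coefficients along each indirect path. The sketch below illustrates this with entirely hypothetical standardized coefficients; they are not the study's estimates.

    # Hypothetical standardized path coefficients, for illustration only.
    direct_work_to_bw = -0.03       # working condition -> birth weight
    work_to_income = 0.40           # working condition -> household income
    income_to_bw = 0.25             # household income -> birth weight

    indirect = work_to_income * income_to_bw   # product along the indirect path
    total = direct_work_to_bw + indirect       # total = direct + indirect
    print(f"direct={direct_work_to_bw}, indirect={indirect:.3f}, total={total:.3f}")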
NASA Technical Reports Server (NTRS)
Scoggins, J. R. (Editor)
1978-01-01
Four diagnostic studies of AVE 3 are presented. AVE 3 represents a high wind speed wintertime situation, while most AVEs analyzed previously represented springtime conditions with rather low wind speeds. The general areas of analysis include the examination of budgets of vorticity, moisture, kinetic energy, and potential energy and a synoptic and statistical study of the horizontal gradients of meteorological parameters. Conclusions are integrated with and compared to those obtained in previously analyzed experiments (mostly springtime weather situations) so as to establish a more definitive understanding of the structure and dynamics of the atmosphere under a wide range of synoptic conditions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ul'yanov, A S; Lyapina, A M; Ulianova, O V
2011-04-30
Specific statistical characteristics of biospeckles, emerging under the diffraction of coherent beams by bacterial colonies, are studied. The dependence of the fractal dimensions of biospeckles on the conditions of both illumination and growth of the colonies is studied theoretically and experimentally. Particular attention is paid to the fractal properties of biospeckles emerging under the scattering of light by colonies of the vaccinal strain of the plague microbe. The possibility, in principle, of classifying the colonies of Yersinia pestis EV NIIEG using fractal dimension analysis is demonstrated. (optical technologies in biophysics and medicine)
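Fractal dimensions of speckle patterns are often estimated by box counting: the dimension is the slope of log N(s) against log(1/s), where N(s) is the number of boxes of side s containing signal. The sketch below applies this estimator to a synthetic binarized pattern; it illustrates the method only and makes no claim about the authors' procedure.

    import numpy as np

    def box_count(img, size):
        """Count boxes of side `size` containing at least one bright pixel."""
        h, w = img.shape
        hh, ww = h - h % size, w - w % size      # trim to a multiple of size
        blocks = img[:hh, :ww].reshape(hh // size, size, ww // size, size)
        return np.count_nonzero(blocks.any(axis=(1, 3)))

    rng = np.random.default_rng(4)
    speckle = rng.random((512, 512)) > 0.7       # binarized intensity pattern

    sizes = np.array([2, 4, 8, 16, 32, 64])
    counts = np.array([box_count(speckle, s) for s in sizes])
    slope, _ = np.polyfit(np.log(1.0 / sizes), np.log(counts), 1)
    print(f"box-counting dimension ~ {slope:.2f}")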
Studies in the use of cloud type statistics in mission simulation
NASA Technical Reports Server (NTRS)
Fowler, M. G.; Willand, J. H.; Chang, D. T.; Cogan, J. L.
1974-01-01
A study to further improve NASA's global cloud statistics for mission simulation is reported. Regional homogeneity in cloud types was examined; most of the original region boundaries defined for cloud cover amount in previous studies were supported by the statistics on cloud types and the number of cloud layers. Conditionality in cloud statistics was also examined with special emphasis on temporal and spatial dependencies, and cloud type interdependence. Temporal conditionality was found up to 12 hours, and spatial conditionality up to 200 miles; the diurnal cycle in convective cloudiness was clearly evident. As expected, the joint occurrence of different cloud types reflected the dynamic processes which form the clouds. Other phases of the study improved the cloud type statistics for several regions and proposed a mission simulation scheme combining the 4-dimensional atmospheric model, sponsored by MSFC, with the global cloud model.
2016-07-01
Reports an error in "Are Cognitive Interventions Effective in Alzheimer's Disease? A Controlled Meta-Analysis of the Effects of Bias" by Javier Oltra-Cucarella, Rubén Pérez-Elvira, Raul Espert and Anita Sohn McCormick (Neuropsychology, Advanced Online Publication, Apr 7, 2016, np). In the article, the first sentence of the third paragraph of the Source of bias subsection in the Statistical Analysis subsection of the Correlational Meta-Analysis section should read "For the control condition bias, three comparison groups were differentiated: (a) a structured cognitive intervention, (b) a placebo control condition, and (c) a pharma control condition without cognitive intervention or no treatment at all." (The following abstract of the original article appeared in record 2016-16656-001.) There is limited evidence about the efficacy of cognitive interventions for Alzheimer's disease (AD). However, aside from the methodological quality of the studies analyzed, the methodology used in previous meta-analyses is itself a risk of bias as different types of effect sizes (ESs) were calculated and combined. This study aimed to examine the results of nonpharmacological interventions for AD with adequate control of statistical methods and to demonstrate a different approach to meta-analysis. ESs were calculated with the independent groups pre/post design. Average ESs for separate outcomes were calculated and moderator analyses were performed so as to offer an overview of the effects of bias. Eighty-seven outcomes from 19 studies (n = 812) were meta-analyzed. ESs were small on average for cognitive and functional outcomes after intervention. Moderator analyses showed no effect of control of bias, although ESs were different from zero only in some circumstances (e.g., memory outcomes in randomized studies). Cognitive interventions showed no more efficacy than placebo interventions, and functional ESs were consistently low across conditions. Cognitive interventions as delivered may not be effective in AD, probably because the assumptions behind them are inadequate. Future directions include a change in the type of intervention as well as the use of outcomes other than standardized tests. Additional studies with larger sample sizes and different designs are needed to increase the power of both primary studies and meta-analyses. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Statistical Analysis of Research Data | Center for Cancer Research
Recent advances in cancer biology have resulted in the need for increased statistical analysis of research data. The Statistical Analysis of Research Data (SARD) course will be held on April 5-6, 2018 from 9 a.m.-5 p.m. at the National Institutes of Health's Natcher Conference Center, Balcony C on the Bethesda Campus. SARD is designed to provide an overview on the general principles of statistical analysis of research data. The first day will feature univariate data analysis, including descriptive statistics, probability distributions, and one- and two-sample inferential statistics.
Entropy Production in Collisionless Systems. II. Arbitrary Phase-space Occupation Numbers
NASA Astrophysics Data System (ADS)
Barnes, Eric I.; Williams, Liliya L. R.
2012-04-01
We present an analysis of two thermodynamic techniques for determining equilibria of self-gravitating systems. One is the Lynden-Bell (LB) entropy maximization analysis that introduced violent relaxation. Since we do not use the Stirling approximation, which is invalid at small occupation numbers, our systems have finite mass, unlike LB's isothermal spheres. (Instead of Stirling, we utilize a very accurate smooth approximation for ln x!.) The second analysis extends entropy production extremization to self-gravitating systems, also without the use of the Stirling approximation. In addition to the LB statistical family characterized by the exclusion principle in phase space, and designed to treat collisionless systems, we also apply the two approaches to the Maxwell-Boltzmann (MB) families, which have no exclusion principle and hence represent collisional systems. We implicitly assume that all of the phase space is equally accessible. We derive entropy production expressions for both families and give the extremum conditions for entropy production. Surprisingly, our analysis indicates that extremizing entropy production rate results in systems that have maximum entropy, in both LB and MB statistics. In other words, both thermodynamic approaches lead to the same equilibrium structures.
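The numerical point about Stirling can be made explicit. Since the abstract does not give the authors' approximation, the following is only one accurate smooth choice, namely the first correction term of the Stirling series applied to the Gamma function:

    \ln x! = \ln\Gamma(x+1)
           \approx \left(x + \tfrac{1}{2}\right)\ln(x+1) - (x+1)
                   + \tfrac{1}{2}\ln(2\pi) + \frac{1}{12(x+1)}

This reproduces ln 0! = 0 to within about 0.002, whereas the leading Stirling form x ln x - x fails badly as x approaches 0, which is precisely the small-occupation-number regime at issue.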
Keenan, Michael R; Smentkowski, Vincent S; Ulfig, Robert M; Oltman, Edward; Larson, David J; Kelly, Thomas F
2011-06-01
We demonstrate for the first time that multivariate statistical analysis techniques can be applied to atom probe tomography data to estimate the chemical composition of a sample at the full spatial resolution of the atom probe in three dimensions. Whereas the raw atom probe data provide the specific identity of an atom at a precise location, the multivariate results can be interpreted in terms of the probabilities that an atom representing a particular chemical phase is situated there. When aggregated to the size scale of a single atom (∼0.2 nm), atom probe spectral-image datasets are huge and extremely sparse. In fact, the average spectrum will have somewhat less than one total count per spectrum due to imperfect detection efficiency. These conditions, under which the variance in the data is completely dominated by counting noise, test the limits of multivariate analysis, and an extensive discussion of how to extract the chemical information is presented. Efficient numerical approaches to performing principal component analysis (PCA) on these datasets, which may number hundreds of millions of individual spectra, are put forward, and it is shown that PCA can be computed in a few seconds on a typical laptop computer.
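The scale of the problem can be conveyed with a small mock-up: a sparse matrix of Poisson counts averaging under one total count per spectrum, factored with a truncated SVD that never densifies the data. Sizes, sparsity, and the two-phase structure below are synthetic; for real counting noise a properly scaled (weighted) PCA, as the authors discuss, would be preferable.

    import numpy as np
    from scipy import sparse
    from sklearn.decomposition import TruncatedSVD

    rng = np.random.default_rng(5)
    n_voxels, n_channels = 20_000, 256
    sig_a = np.ones(n_channels); sig_a[:128] *= 4.0; sig_a /= sig_a.sum()  # phase A spectrum
    sig_b = np.ones(n_channels); sig_b[128:] *= 4.0; sig_b /= sig_b.sum()  # phase B spectrum
    phase = rng.random(n_voxels) < 0.3                  # which phase each voxel holds
    lam = 0.5 * np.where(phase[:, None], sig_a, sig_b)  # < 1 count per spectrum
    counts = sparse.csr_matrix(rng.poisson(lam))        # huge and extremely sparse

    # Truncated SVD operates directly on the sparse matrix without densifying it.
    svd = TruncatedSVD(n_components=2, random_state=0)
    scores = svd.fit_transform(counts)
    print(scores.shape, svd.explained_variance_ratio_)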
NASA Astrophysics Data System (ADS)
Munawar, Iqra
2016-07-01
Crime mapping is a dynamic process. It can be used to assist all stages of the problem-solving process, and mapping crime can help police protect citizens more effectively. The decision to utilize a certain type of map or design element may change based on the purpose of the map, the audience, or the available data. If the purpose of the crime analysis map is to assist in the identification of a particular problem, selected data may be mapped to identify patterns of activity that have previously gone undetected. The main objective of this research was to study the spatial distribution patterns of four common crimes, i.e., narcotics, arms, burglary, and robbery, in Gujranwala City using spatial statistical techniques to identify hotspots. Hotspots, or locations of clusters, were identified using the Getis-Ord Gi* statistic. Crime analysis mapping can be used to conduct a comprehensive spatial analysis of the problem. Graphic presentations of such findings provide a powerful medium to communicate conditions, patterns, and trends, creating an avenue for analysts to bring about significant policy changes. Crime mapping also helps reduce crime rates.
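The Getis-Ord Gi* statistic itself is straightforward to compute. The sketch below evaluates it on a toy grid of crime counts with binary rook-contiguity weights that include the focal cell (the "star" in Gi*); the counts and weights are invented, and a real analysis would typically run a spatial-statistics package on geocoded incident data.

    import numpy as np

    side = 5
    counts = np.ones(side * side)
    counts[[0, 1, 5, 6]] = 10.0        # a cluster of high counts (the hotspot)

    # Binary rook-contiguity weights on the grid, including the focal cell.
    W = np.zeros((side * side, side * side))
    for r in range(side):
        for c in range(side):
            i = r * side + c
            W[i, i] = 1.0
            for dr, dc in [(-1, 0), (1, 0), (0, -1), (0, 1)]:
                rr, cc = r + dr, c + dc
                if 0 <= rr < side and 0 <= cc < side:
                    W[i, rr * side + cc] = 1.0

    def gi_star(x, W):
        """Getis-Ord Gi* z-scores for every location."""
        n = x.size
        xbar, s = x.mean(), x.std()
        wsum, w2sum = W.sum(axis=1), (W ** 2).sum(axis=1)
        u = np.sqrt((n * w2sum - wsum ** 2) / (n - 1))
        return (W @ x - xbar * wsum) / (s * u)

    z = gi_star(counts, W)
    print("hotspot cells (z > 1.96):", np.where(z > 1.96)[0])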
Shaikh, Masood Ali
2017-09-01
Assessment of research articles in terms of study designs used, statistical tests applied, and statistical analysis programmes employed helps determine the research activity profile and trends in the country. In this descriptive study, all original articles published by the Journal of Pakistan Medical Association (JPMA) and the Journal of the College of Physicians and Surgeons Pakistan (JCPSP) in the year 2015 were reviewed in terms of study designs used, application of statistical tests, and use of statistical analysis programmes. JPMA and JCPSP published 192 and 128 original articles, respectively, in the year 2015. Results of this study indicate cross-sectional study design, bivariate inferential statistical analysis entailing comparison between two variables/groups, and the statistical software programme SPSS to be the most common study design, inferential statistical analysis, and statistical analysis software programme, respectively. These results echo previously published assessments of these two journals for the year 2014.
Dasanu, Constantin A; Bockorny, Bruno; Grabska, Joanna; Codreanu, Ion
2015-04-01
Increased risk of B-cell non-Hodgkin lymphoma (NHL) in patients with autoimmune diseases is a known fact. An association may exist between marginal zone lymphoma (MZL) and certain autoimmune conditions, and vice versa. Herein, we present an analysis of a series of consecutive patients (n = 24) diagnosed with MZL at our institution between 2008 and 2014. Our series, analyzed both retrospectively and prospectively, consisted of a blend of nodal, extranodal, and splenic MZL. The median age was 71.8 years; the M/F ratio was 2:1. The presence of autoimmune conditions was compared with their documented prevalence in the general population and tested for statistical significance using both the chi-square (χ²) test and Fisher's exact test for small numbers of observations (95% confidence). A P-value < 0.05 was considered significant. A total of 50% of MZL patients had documented autoimmune conditions. In addition, 3 of 24 patients presented with more than one autoimmune disease. Statistically significant differences in our MZL patients were recorded for immune thrombocytopenia [ITP] (P < 0.01), autoimmune hemolytic anemia [AIHA] (P < 0.01), Hashimoto thyroiditis (P = 0.037), and rheumatoid arthritis [RA] (P = 0.021). The difference did not reach statistical significance for systemic lupus erythematosus (SLE) or psoriasis. ITP and AIHA in our cohort were synchronous with the MZL diagnosis in all patients, while all non-hematologic autoimmune conditions were metachronous and diagnosed prior to MZL. In the course of caring for patients with MZL, a number of associated autoimmune disorders are recognized. Knowing these entities is important not only for making a correct diagnosis, but also for recognizing certain clinical events occurring during the course of the disease. A catalogue of autoimmune disorders associated with this type of NHL is important, as they can pose formidable clinical problems for MZL patients and their physicians.
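Both tests are available in SciPy. The sketch below compares the prevalence of a condition in a small patient series against a reference sample; the 2x2 table is illustrative, not the study's data, and with cells this small Fisher's exact test is the safer choice.

    import numpy as np
    from scipy.stats import chi2_contingency, fisher_exact

    #                  condition  no condition
    table = np.array([[4,         20],       # patient series (n = 24), hypothetical
                      [30,        970]])     # reference sample, hypothetical

    chi2, p_chi2, dof, expected = chi2_contingency(table)
    odds_ratio, p_fisher = fisher_exact(table)

    # With expected cell counts this small, report the exact test.
    print(f"chi-square p = {p_chi2:.4f}, Fisher exact p = {p_fisher:.4f}")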
Monieta, Adela; Anczurowski, Wojciech
2004-01-01
Presentation of Post-Traumatic Stress Disorder based on the approaches of various authors, concentrating upon the concepts of the American classifications DSM-III (1980) and DSM-IV (1994). We acknowledged the necessity of presenting empirical results on the intensity of PTSD among the Siberian-deportee population in the north-east part of Poland. In our analysis, we stressed the importance of the temporally distant psychological consequences of living in extremely difficult conditions that often threatened the lives of those deported to Siberia between 1939 and 1956. Forty "Siberian deportees" (20 men and 20 women) were examined. The PTSD-Interview (PTSD-I) method was used to obtain, for each individual case, the index score required for the statistical analysis. The average PTSD intensity among the women reaches a "very significant" level, and among the men it is even higher. The disparity between the average results of women and men is statistically significant (p < 0.05). This research confirms the assumption that trauma suffered at an early stage of development (within the age range of 8-15) leaves a permanent mark on the human psyche. Statistical analysis revealed a high level of PTSD intensity among the population of "Siberian deportees" from the north-east region of Poland.
Analysis of Runway Incursion Data
NASA Technical Reports Server (NTRS)
Green, Lawrence L.
2013-01-01
A statistical analysis of runway incursion (RI) events was conducted to ascertain relevance to the top ten challenges of the National Aeronautics and Space Administration Aviation Safety Program (AvSP). The information contained in the RI database was found to contain data that may be relevant to several of the AvSP top ten challenges. When combined with other data from the FAA documenting air traffic volume from calendar year 2000 through 2011, the structure of a predictive model emerges that can be used to forecast the frequency of RI events at various airports for various classes of aircraft and under various environmental conditions.
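One plausible shape for such a predictive model (a sketch only; the abstract does not specify the model form) is a Poisson regression of event counts with traffic volume as an exposure offset, so that the expected number of incursions scales with operations and covariates such as aircraft class act multiplicatively on the rate. All data below are simulated.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(6)
    n = 200
    traffic = rng.uniform(1e4, 3e5, n)            # annual operations per airport
    heavy = rng.integers(0, 2, n).astype(float)   # 1 = predominantly large aircraft
    events = rng.poisson(1e-5 * traffic * np.exp(0.3 * heavy))

    # Poisson GLM with log(traffic) as an exposure offset.
    X = sm.add_constant(heavy)
    fit = sm.GLM(events, X, family=sm.families.Poisson(),
                 offset=np.log(traffic)).fit()
    print(fit.params)   # roughly [log(1e-5), 0.3] for this simulated rate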
McKeever, Robert
2014-01-01
This study content-analyzed online direct-to-consumer advertisements (DTCA) for prescription drug treatments to explore whether ads for prescription treatments for psychiatric conditions, which commonly go untreated, differ from other drug advertisements. Coded variables included the presence of interactive technological components, use of promotional incentives, and the social contexts portrayed in images shown on each site. Statistical analysis revealed that ads for psychiatric medications contained fewer interactive website features, financial incentives, and calls to action than other types of prescription drug advertisements. Implications for health communication researchers are discussed.
Discriminant analysis of Raman spectra for body fluid identification for forensic purposes.
Sikirzhytski, Vitali; Virkler, Kelly; Lednev, Igor K
2010-01-01
Detection and identification of blood, semen and saliva stains, the most common body fluids encountered at a crime scene, are very important aspects of forensic science today. This study targets the development of a nondestructive, confirmatory method for body fluid identification based on Raman spectroscopy coupled with advanced statistical analysis. Dry traces of blood, semen and saliva obtained from multiple donors were probed using a confocal Raman microscope with a 785-nm excitation wavelength under controlled laboratory conditions. Results demonstrated the capability of Raman spectroscopy to identify an unknown substance to be semen, blood or saliva with high confidence.
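As a schematic of the classification step, the sketch below trains a linear discriminant on synthetic three-class "spectra" and cross-validates it. Real work would use measured Raman spectra, donor-wise validation, and more careful chemometrics than this toy setup; the class templates and noise model are invented.

    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(7)
    n_per, n_bins = 30, 200
    centers = rng.standard_normal((3, n_bins))   # stand-ins for blood/semen/saliva templates
    X = np.vstack([c + 0.5 * rng.standard_normal((n_per, n_bins)) for c in centers])
    y = np.repeat([0, 1, 2], n_per)

    lda = LinearDiscriminantAnalysis()           # default SVD solver handles p > n
    print(cross_val_score(lda, X, y, cv=5).mean())   # near 1.0 on this easy toy set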
Frontal networks in adults with autism spectrum disorder
Catani, Marco; Dell’Acqua, Flavio; Budisavljevic, Sanja; Howells, Henrietta; Thiebaut de Schotten, Michel; Froudist-Walsh, Seán; D’Anna, Lucio; Thompson, Abigail; Sandrone, Stefano; Bullmore, Edward T.; Suckling, John; Baron-Cohen, Simon; Lombardo, Michael V.; Wheelwright, Sally J.; Chakrabarti, Bhismadev; Lai, Meng-Chuan; Ruigrok, Amber N. V.; Leemans, Alexander; Ecker, Christine; MRC AIMS Consortium; Craig, Michael C.
2016-01-01
It has been postulated that autism spectrum disorder is underpinned by an ‘atypical connectivity’ involving higher-order association brain regions. To test this hypothesis in a large cohort of adults with autism spectrum disorder we compared the white matter networks of 61 adult males with autism spectrum disorder and 61 neurotypical controls, using two complementary approaches to diffusion tensor magnetic resonance imaging. First, we applied tract-based spatial statistics, a ‘whole brain’ non-hypothesis driven method, to identify differences in white matter networks in adults with autism spectrum disorder. Following this we used a tract-specific analysis, based on tractography, to carry out a more detailed analysis of individual tracts identified by tract-based spatial statistics. Finally, within the autism spectrum disorder group, we studied the relationship between diffusion measures and autistic symptom severity. Tract-based spatial statistics revealed that autism spectrum disorder was associated with significantly reduced fractional anisotropy in regions that included frontal lobe pathways. Tractography analysis of these specific pathways showed increased mean and perpendicular diffusivity, and reduced number of streamlines in the anterior and long segments of the arcuate fasciculus, cingulum and uncinate—predominantly in the left hemisphere. Abnormalities were also evident in the anterior portions of the corpus callosum connecting left and right frontal lobes. The degree of microstructural alteration of the arcuate and uncinate fasciculi was associated with severity of symptoms in language and social reciprocity in childhood. Our results indicated that autism spectrum disorder is a developmental condition associated with abnormal connectivity of the frontal lobes. Furthermore our findings showed that male adults with autism spectrum disorder have regional differences in brain anatomy, which correlate with specific aspects of autistic symptoms. Overall these results suggest that autism spectrum disorder is a condition linked to aberrant developmental trajectories of the frontal networks that persist in adult life. PMID:26912520
Levis, Angelo G; Minicuci, Nadia; Ricci, Paolo; Gennaro, Valerio; Garbisa, Spiridione
2011-06-17
Whether or not there is a relationship between use of mobile phones (analogue and digital cellulars, and cordless) and head tumour risk (brain tumours, acoustic neuromas, and salivary gland tumours) is still a matter of debate; progress requires a critical analysis of the methodological elements necessary for an impartial evaluation of contradictory studies. A close examination of the protocols and results from all case-control and cohort studies, pooled- and meta-analyses on head tumour risk for mobile phone users was carried out, and for each study the elements necessary for evaluating its reliability were identified. In addition, new meta-analyses of the literature data were undertaken. These were limited to subjects with mobile phone latency time compatible with the progression of the examined tumours, and with analysis of the laterality of head tumour localisation corresponding to the habitual laterality of mobile phone use. Blind protocols, free from errors, bias, and financial conditioning factors, give positive results that reveal a cause-effect relationship between long-term mobile phone use or latency and statistically significant increase of ipsilateral head tumour risk, with biological plausibility. Non-blind protocols, which instead are affected by errors, bias, and financial conditioning factors, give negative results with a systematic underestimate of such risk. However, also in these studies a statistically significant increase in risk of ipsilateral head tumours is quite common after more than 10 years of mobile phone use or latency. The meta-analyses, ours included, examining only data on ipsilateral tumours in subjects using mobile phones since or for at least 10 years, show large and statistically significant increases in risk of ipsilateral brain gliomas and acoustic neuromas. Our analysis of the literature studies and of the results from meta-analyses of the significant data alone shows an almost doubling of the risk of head tumours induced by long-term mobile phone use or latency.