Statistics Education Research in Malaysia and the Philippines: A Comparative Analysis
ERIC Educational Resources Information Center
Reston, Enriqueta; Krishnan, Saras; Idris, Noraini
2014-01-01
This paper presents a comparative analysis of statistics education research in Malaysia and the Philippines by modes of dissemination, research areas, and trends. An electronic search for published research papers in the area of statistics education from 2000-2012 yielded 20 for Malaysia and 19 for the Philippines. Analysis of these papers showed…
Spatial variability effects on precision and power of forage yield estimation
USDA-ARS?s Scientific Manuscript database
Spatial analyses of yield trials are important, as they adjust cultivar means for spatial variation and improve the statistical precision of yield estimation. While the relative efficiency of spatial analysis has been frequently reported in several yield trials, its application on long-term forage y...
Rojas-Rejón, Oscar A; Sánchez, Arturo
2014-07-01
This work studies the effect of initial solid load (4-32%; w/v, DS) and particle size (0.41-50 mm) on the monosaccharide yield of wheat straw subjected to dilute H2SO4 (0.75%, v/v) pretreatment and enzymatic saccharification. Response surface methodology (RSM) based on a full factorial design (FFD) was used for the statistical analysis of pretreatment and enzymatic hydrolysis. The highest xylose yield during pretreatment (ca. 86% of theoretical) was achieved at 4% (w/v, DS) and 25 mm. The solid fraction obtained from the first set of experiments was subjected to enzymatic hydrolysis at constant enzyme dosage (17 FPU/g); statistical analysis revealed that glucose yield was favored by solids pretreated at low initial solid loads and small particle sizes. Dynamic experiments showed that glucose yield did not increase after 48 h of enzymatic hydrolysis. Once pretreatment conditions were established, experiments were carried out with several initial solid loads (4-24%; w/v, DS) and enzyme dosages (5-50 FPU/g). Two straw sizes (0.41 and 50 mm) were used for verification purposes. The highest glucose yield (ca. 55% of theoretical) was achieved at 4% (w/v, DS), 0.41 mm, and 50 FPU/g. Statistical analysis showed that at low enzyme dosage, particle size had a marked effect on glucose yield, and initial solid load was the main factor determining glucose yield.
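As a concrete illustration of the RSM/FFD analysis this abstract describes, the sketch below fits a full second-order response surface (linear, interaction, and quadratic terms) in solid load and particle size with statsmodels. The design points and xylose values are invented placeholders, not the study's data.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical 3x3 factorial: solid load (% w/v, DS) x particle size (mm)
df = pd.DataFrame({
    "solids": [4, 4, 16, 16, 32, 32, 4, 16, 32],
    "size":   [0.41, 50, 0.41, 50, 0.41, 50, 25, 25, 25],
    "xylose": [82, 86, 70, 74, 55, 60, 85, 72, 58],  # % of theoretical
})

# Full second-order response surface: linear, interaction, quadratic terms
model = smf.ols("xylose ~ solids * size + I(solids**2) + I(size**2)", df).fit()
print(model.summary())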
Comparison of statistical models for analyzing wheat yield time series.
Michel, Lucie; Makowski, David
2013-01-01
The world's population is predicted to exceed nine billion by 2050 and there is increasing concern about the capability of agriculture to feed such a large population. Foresight studies on food security are frequently based on crop yield trends estimated from yield time series provided by national and regional statistical agencies. Various types of statistical models have been proposed for the analysis of yield time series, but the predictive performances of these models have not yet been evaluated in detail. In this study, we present eight statistical models for analyzing yield time series and compare their ability to predict wheat yield at the national and regional scales, using data provided by the Food and Agriculture Organization of the United Nations and by the French Ministry of Agriculture. The Holt-Winters and dynamic linear models performed equally well, giving the most accurate predictions of wheat yield. However, dynamic linear models have two advantages over Holt-Winters models: they can be used to reconstruct past yield trends retrospectively and to analyze uncertainty. The results obtained with dynamic linear models indicated a stagnation of wheat yields in many countries, but the estimated rate of increase of wheat yield remained above 0.06 t ha⁻¹ year⁻¹ in several countries in Europe, Asia, Africa and America, and the estimated values were highly uncertain for several major wheat producing countries. The rate of yield increase differed considerably between French regions, suggesting that efforts to identify the main causes of yield stagnation should focus on a subnational scale.
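A minimal sketch of the two best-performing model families named here, Holt's exponential smoothing and a dynamic linear (state-space) model with a local linear trend, both available in statsmodels; the wheat-yield series is synthetic. The smoothed slope read off the dynamic linear model is the retrospective trend estimate the abstract highlights as its advantage.

import numpy as np
from statsmodels.tsa.holtwinters import ExponentialSmoothing
from statsmodels.tsa.statespace.structural import UnobservedComponents

rng = np.random.default_rng(0)
years = np.arange(1961, 2013)
yield_t = 2.0 + 0.06 * (years - years[0]) + rng.normal(0, 0.3, years.size)

# Holt-Winters family: additive trend, no seasonality for annual yields
hw = ExponentialSmoothing(yield_t, trend="add").fit()

# Dynamic linear model: local linear trend estimated by Kalman filtering;
# its smoothed slope gives the retrospective trend the abstract mentions
dlm = UnobservedComponents(yield_t, level="local linear trend").fit(disp=False)

print("Holt-Winters 1-step forecast:", hw.forecast(1))
print("DLM 1-step forecast:", dlm.forecast(1))
print("DLM final trend slope (t/ha/yr):", dlm.smoothed_state[1, -1])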
NASA Technical Reports Server (NTRS)
Welker, J.
1981-01-01
A histogram analysis of average monthly precipitation over 30- and 84-year periods was made for both Maryland and Kansas and the results compared. A second analysis statistically assessed the effect of average monthly precipitation on Kansas winter wheat yield. The data sets covered the three periods 1941-1970, 1887-1970, and 1887-1921. Analyses of the limited data sets used (only average monthly precipitation and temperature were correlated against yield) indicated that fall precipitation values, especially those of September and October, were more important to winter wheat yield than were spring values, particularly for the period 1941-1970.
Development of LACIE CCEA-1 weather/wheat yield models. [regression analysis]
NASA Technical Reports Server (NTRS)
Strommen, N. D.; Sakamoto, C. M.; Leduc, S. K.; Umberger, D. E. (Principal Investigator)
1979-01-01
The advantages and disadvantages of the causal (phenological, dynamic, physiological), statistical regression, and analog approaches to modeling grain yield are examined. Given LACIE's primary goal of estimating wheat production for the large areas of eight major wheat-growing regions, the statistical regression approach of correlating historical yield and climate data offered the Center for Climatic and Environmental Assessment the greatest potential return within the constraints of time and data sources. The basic equation for the first-generation wheat-yield model is given. Topics discussed include truncation, trend variable, selection of weather variables, episodic events, strata selection, operational data flow, weighting, and model results.
Integration of statistical and physiological analyses of adaptation of near-isogenic barley lines.
Romagosa, I; Fox, P N; García Del Moral, L F; Ramos, J M; García Del Moral, B; Roca de Togores, F; Molina-Cano, J L
1993-08-01
Seven near-isogenic barley lines, differing for three independent mutant genes, were grown in 15 environments in Spain. Genotype x environment interaction (G x E) for grain yield was examined with the Additive Main Effects and Multiplicative Interaction (AMMI) model. The results of this statistical analysis of multilocation yield data were compared with a morpho-physiological characterization of the lines at two sites (Molina-Cano et al. 1990). The first two principal component axes from the AMMI analysis were strongly associated with the morpho-physiological characters. The independent but parallel discrimination among genotypes reflects genetic differences and highlights the power of the AMMI analysis as a tool to investigate G x E. Characters which appear to be positively associated with yield in the germplasm under study could be identified for some environments.
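The AMMI decomposition used in this and the later CIMMYT abstract reduces to an additive two-way fit followed by a singular-value decomposition of the interaction residuals. A bare-bones sketch with hypothetical yields (7 lines x 15 environments, matching the study's dimensions):

import numpy as np

rng = np.random.default_rng(1)
Y = rng.normal(5.0, 1.0, size=(7, 15))  # grain yield, genotypes x environments

grand = Y.mean()
g_eff = Y.mean(axis=1, keepdims=True) - grand    # genotype main effects
e_eff = Y.mean(axis=0, keepdims=True) - grand    # environment main effects
resid = Y - grand - g_eff - e_eff                # G x E interaction table

# Multiplicative part: SVD of the interaction; first two axes are IPCA1/IPCA2
U, s, Vt = np.linalg.svd(resid, full_matrices=False)
geno_scores = U[:, :2] * np.sqrt(s[:2])
env_scores = Vt[:2, :].T * np.sqrt(s[:2])
print("share of G x E in IPCA1, IPCA2:", (s[:2] ** 2 / (s ** 2).sum()).round(2))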
Applied Behavior Analysis and Statistical Process Control?
ERIC Educational Resources Information Center
Hopkins, B. L.
1995-01-01
Incorporating statistical process control (SPC) methods into applied behavior analysis is discussed. It is claimed that SPC methods would likely reduce applied behavior analysts' intimate contacts with problems and would likely yield poor treatment and research decisions. Cases and data presented by Pfadt and Wheeler (1995) are cited as examples.…
Statistical Analysis Experiment for Freshman Chemistry Lab.
ERIC Educational Resources Information Center
Salzsieder, John C.
1995-01-01
Describes a laboratory experiment dissolving zinc from galvanized nails in which data can be gathered very quickly for statistical analysis. The data have sufficient significant figures and the experiment yields a nice distribution of random errors. Freshman students can gain an appreciation of the relationships between random error, number of…
NASA Astrophysics Data System (ADS)
Jiang, H.; Lin, T.
2017-12-01
Rain-fed corn production systems are subject to sub-seasonal variations of precipitation and temperature during the growing season. Because each growth phase has its own inherent physiological processes, plants require different optimal environmental conditions during each phase. However, this temporal heterogeneity in the response to climate variability across the crop lifecycle is often simplified into constant responses in large-scale statistical modeling analyses. To capture the time-varying growing requirements in large-scale statistical analysis, we develop and compare statistical models at various spatial and temporal resolutions to quantify the relationship between corn yield and weather factors for 12 Corn Belt states from 1981 to 2016. The study compares three spatial resolutions (county, agricultural district, and state scale) and three temporal resolutions (crop growth phase, monthly, and growing season) to characterize the effects of spatial and temporal variability. Our results show that the agricultural district model with growth-phase resolution can explain 52% of the variation in corn yield caused by temperature and precipitation variability. It provides a practical model structure that balances the overfitting problem of county-specific models against the weak explanatory power of state-specific models. In the US Corn Belt, precipitation has a positive impact on corn yield throughout the growing season except for the vegetative stage, while sensitivity to extreme heat is highest from the silking to dough phases. The results also show that the northern counties of the Corn Belt are less affected by extreme heat but are more vulnerable to water deficiency.
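One plausible rendering of the district-scale, growth-phase-resolution regression described here: yield regressed on phase-specific temperature and precipitation with district fixed effects. All column names and values below are hypothetical stand-ins.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 300
df = pd.DataFrame({
    "district": rng.choice(["IA-01", "IL-02", "NE-03"], n),
    "temp_veg": rng.normal(22, 2, n),    # vegetative-phase mean temp (C)
    "temp_silk": rng.normal(26, 2, n),   # silking-to-dough mean temp (C)
    "prec_veg": rng.normal(90, 25, n),   # phase precipitation (mm)
    "prec_fill": rng.normal(110, 30, n),
})
df["yield"] = (9.5 - 0.30 * (df.temp_silk - 26) + 0.01 * df.prec_fill
               + rng.normal(0, 0.8, n))

# C(district) absorbs time-invariant district differences (soils, management)
fit = smf.ols("Q('yield') ~ temp_veg + temp_silk + prec_veg + prec_fill + C(district)", df).fit()
print(fit.params.filter(like="temp"))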
75 FR 7412 - Reporting Information Regarding Falsification of Data
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-19
... concomitant medications or treatments; omitting data so that a statistical analysis yields a result that would..., results, statistics, items of information, or statements made by individuals. This proposed rule would..., Bureau of Labor Statistics ( www.bls.gov/oes/current/naics4_325400.htm ); compliance officer wage rate...
AMMI adjustment for statistical analysis of an international wheat yield trial.
Crossa, J; Fox, P N; Pfeiffer, W H; Rajaram, S; Gauch, H G
1991-01-01
Multilocation trials are important for the CIMMYT Bread Wheat Program in producing high-yielding, adapted lines for a wide range of environments. This study investigated procedures for improving predictive success of a yield trial, grouping environments and genotypes into homogeneous subsets, and determining the yield stability of 18 CIMMYT bread wheats evaluated at 25 locations. Additive Main effects and Multiplicative Interaction (AMMI) analysis gave more precise estimates of genotypic yields within locations than means across replicates. This precision facilitated formation by cluster analysis of more cohesive groups of genotypes and locations for biological interpretation of interactions than occurred with unadjusted means. Locations were clustered into two subsets for which genotypes with positive interactions manifested in high, stable yields were identified. The analyses highlighted superior selections with both broad and specific adaptation.
Statistical Analysis of Large Simulated Yield Datasets for Studying Climate Effects
NASA Technical Reports Server (NTRS)
Makowski, David; Asseng, Senthold; Ewert, Frank; Bassu, Simona; Durand, Jean-Louis; Martre, Pierre; Adam, Myriam; Aggarwal, Pramod K.; Angulo, Carlos; Baron, Chritian;
2015-01-01
Many studies have been carried out during the last decade to study the effect of climate change on crop yields and other key crop characteristics. In these studies, one or several crop models were used to simulate crop growth and development for different climate scenarios that correspond to different projections of atmospheric CO2 concentration, temperature, and rainfall changes (Semenov et al., 1996; Tubiello and Ewert, 2002; White et al., 2011). The Agricultural Model Intercomparison and Improvement Project (AgMIP; Rosenzweig et al., 2013) builds on these studies with the goal of using an ensemble of multiple crop models in order to assess effects of climate change scenarios for several crops in contrasting environments. These studies generate large datasets, including thousands of simulated crop yield data. They include series of yield values obtained by combining several crop models with different climate scenarios that are defined by several climatic variables (temperature, CO2, rainfall, etc.). Such datasets potentially provide useful information on the possible effects of different climate change scenarios on crop yields. However, it is sometimes difficult to analyze these datasets and to summarize them in a useful way due to their structural complexity; simulated yield data can differ among contrasting climate scenarios, sites, and crop models. Another issue is that it is not straightforward to extrapolate the results obtained for the scenarios to alternative climate change scenarios not initially included in the simulation protocols. Additional dynamic crop model simulations for new climate change scenarios are an option but this approach is costly, especially when a large number of crop models are used to generate the simulated data, as in AgMIP. Statistical models have been used to analyze responses of measured yield data to climate variables in past studies (Lobell et al., 2011), but the use of a statistical model to analyze yields simulated by complex process-based crop models is a rather new idea. We demonstrate herewith that statistical methods can play an important role in analyzing simulated yield data sets obtained from the ensembles of process-based crop models. Formal statistical analysis is helpful to estimate the effects of different climatic variables on yield, and to describe the between-model variability of these effects.
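One way to carry out the kind of analysis the abstract advocates is a mixed-effects model, with fixed climate effects and model-specific random intercepts and slopes to capture between-model variability. A sketch on synthetic ensemble output (the random-slope structure is an assumption, not AgMIP's published specification):

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
rows = []
for m in [f"crop_model_{i}" for i in range(8)]:
    slope = -0.35 + rng.normal(0, 0.1)       # model-specific T response
    for _ in range(40):
        dT = rng.uniform(0, 5)               # temperature change (C)
        co2 = rng.choice([360, 540, 720])    # CO2 scenario (ppm)
        y = 8.0 + slope * dT + 0.002 * (co2 - 360) + rng.normal(0, 0.4)
        rows.append((m, dT, co2, y))
df = pd.DataFrame(rows, columns=["model", "dT", "co2", "sim_yield"])

# Random intercept and dT slope per crop model = between-model variability
mixed = smf.mixedlm("sim_yield ~ dT + co2", df,
                    groups=df["model"], re_formula="~dT").fit()
print(mixed.summary())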
Chou, C P; Bentler, P M; Satorra, A
1991-11-01
Research studying robustness of maximum likelihood (ML) statistics in covariance structure analysis has concluded that test statistics and standard errors are biased under severe non-normality. An estimation procedure known as asymptotic distribution free (ADF), making no distributional assumption, has been suggested to avoid these biases. Corrections to the normal theory statistics to yield more adequate performance have also been proposed. This study compares the performance of a scaled test statistic and robust standard errors for two models under several non-normal conditions and also compares these with the results from ML and ADF methods. Both ML and ADF test statistics performed rather well in one model and considerably worse in the other. In general, the scaled test statistic seemed to behave better than the ML test statistic and the ADF statistic performed the worst. The robust and ADF standard errors yielded more appropriate estimates of sampling variability than the ML standard errors, which were usually downward biased, in both models under most of the non-normal conditions. ML test statistics and standard errors were found to be quite robust to the violation of the normality assumption when data had either symmetric and platykurtic distributions, or non-symmetric and zero kurtotic distributions.
NASA Astrophysics Data System (ADS)
Hoffman, A.; Forest, C. E.; Kemanian, A.
2016-12-01
A significant number of food-insecure nations exist in regions of the world where dust plays a large role in the climate system. While the impacts of common climate variables (e.g., temperature, precipitation, ozone, and carbon dioxide) on crop yields are relatively well understood, the impact of mineral aerosols on yields has not yet been thoroughly investigated. This research aims to develop the data and tools to improve our understanding of mineral aerosol impacts on crop yields. Suspended dust affects crop yields by altering the amount and type of radiation reaching the plant and by modifying local temperature and precipitation, while dust events (i.e., dust storms) affect crop yields by depleting the soil of nutrients or by defoliation via particle abrasion. The impact of dust on yields is modeled statistically because we are uncertain which impacts will dominate the response on the national and regional scales considered in this study. Multiple linear regression is used in a number of large-scale statistical crop modeling studies to estimate yield responses to various climate variables. In alignment with previous work, we develop linear crop models, but build upon this simple method of regression with machine-learning techniques (e.g., random forests) to identify important statistical predictors and isolate how dust affects yields on the scales of interest. To perform this analysis, we develop a crop-climate dataset for maize, soybean, groundnut, sorghum, rice, and wheat for the regions of West Africa, East Africa, South Africa, and the Sahel. Random forest regression models consistently model historic crop yields better than the linear models. In several instances, the random forest models accurately capture the temperature and precipitation threshold behavior in crops. Additionally, improving agricultural technology has caused a well-documented positive trend that dominates time series of global and regional yields. This trend is often removed before regression with traditional crop models, but likely at the cost of removing climate information. Our random forest models consistently discover the positive trend without removing any additional data. The application of random forests as a statistical crop model provides insight into the impact of dust on yields in marginal food-producing regions.
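A compact version of the model comparison described here: linear regression versus a random forest scored by cross-validation on synthetic predictors (temperature, precipitation, and an assumed aerosol-optical-depth proxy for dust). The threshold term in the data-generating step mimics the heat-threshold behavior the abstract says random forests capture.

import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
n = 400
X = np.column_stack([
    rng.normal(27, 2, n),     # growing-season temperature (C)
    rng.normal(600, 150, n),  # precipitation (mm)
    rng.gamma(2, 1, n),       # aerosol optical depth proxy for dust
])
# Yields collapse above ~30 C: a threshold a forest can learn, a line cannot
y = (3.0 - 1.5 * (X[:, 0] > 30) + 0.002 * X[:, 1] - 0.1 * X[:, 2]
     + rng.normal(0, 0.3, n))

for name, est in [("linear", LinearRegression()),
                  ("random forest",
                   RandomForestRegressor(n_estimators=200, random_state=0))]:
    r2 = cross_val_score(est, X, y, cv=5, scoring="r2").mean()
    print(f"{name}: mean CV R^2 = {r2:.2f}")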
Afshari, Kasra; Samavati, Vahid; Shahidi, Seyed-Ahmad
2015-03-01
The effects of ultrasonic power, extraction time, extraction temperature, and the water-to-raw-material ratio on the extraction yield of crude polysaccharide from the leaf of Hibiscus rosa-sinensis (HRLP) were optimized using response surface methodology (RSM) with a Box-Behnken design (BBD). The experimental data were fitted to a second-order polynomial equation using multiple regression analysis and analyzed by appropriate statistical methods (ANOVA). Analysis of the results showed that the linear and quadratic terms of these four variables had significant effects. The optimal conditions for the highest extraction yield of HRLP were: ultrasonic power, 93.59 W; extraction time, 25.71 min; extraction temperature, 93.18°C; and water-to-raw-material ratio, 24.3 mL/g. Under these conditions, the experimental yield was 9.66±0.18%, in close agreement with the value predicted by the model (9.526%). The results demonstrated that HRLP had strong in vitro scavenging activities on DPPH and hydroxyl radicals.
Method for factor analysis of GC/MS data
Van Benthem, Mark H; Kotula, Paul G; Keenan, Michael R
2012-09-11
The method of the present invention provides a fast, robust, and automated multivariate statistical analysis of gas chromatography/mass spectroscopy (GC/MS) data sets. The method can involve systematic elimination of undesired, saturated peak masses to yield data that follow a linear, additive model. The cleaned data can then be subjected to a combination of PCA and orthogonal factor rotation followed by refinement with MCR-ALS to yield highly interpretable results.
Supaporn, Pansuwan; Yeom, Sung Ho
2018-04-30
This study investigated the biological conversion of crude glycerol, generated as a by-product of a commercial biodiesel production plant, to 1,3-propanediol (1,3-PD). Statistical analysis was employed to derive a model for the individual and interactive effects of glycerol, (NH4)2SO4, trace elements, pH, and cultivation time on four objectives: 1,3-PD concentration, yield, selectivity, and productivity. Optimum conditions for each objective and its maximum value were predicted by statistical optimization, and experiments under the optimum conditions verified the predictions. In addition, by systematic analysis of the values of the four objectives, the optimum conditions for 1,3-PD concentration (49.8 g/L initial glycerol, 4.0 g/L (NH4)2SO4, 2.0 mL/L trace elements, pH 7.5, and 11.2 h cultivation time) were determined to be the global optimum culture conditions for 1,3-PD production. Under these conditions, we achieved high 1,3-PD yield (47.4%), selectivity (88.8%), and productivity (2.1 g/L/h), as well as high 1,3-PD concentration (23.6 g/L).
Dadaser-Celik, Filiz; Azgin, Sukru Taner; Yildiz, Yalcin Sevki
2016-12-01
Biogas production from food waste has been used as an efficient waste treatment option for years. The methane yields from decomposition of waste are, however, highly variable under different operating conditions. In this study, a statistical experimental design method (Taguchi OA9) was implemented to investigate the effects of simultaneous variations of three parameters on methane production. The parameters investigated were solid content (SC), carbon/nitrogen ratio (C/N), and food/inoculum ratio (F/I). Two sets of experiments were conducted with nine anaerobic reactors operating under different conditions. Optimum conditions were determined using statistical analysis, such as analysis of variance (ANOVA). A confirmation experiment was carried out at optimum conditions to investigate the validity of the results. Statistical analysis showed that SC was the most important parameter for methane production with a 45% contribution, followed by the F/I ratio with a 35% contribution. The optimum methane yield of 151 l kg⁻¹ volatile solids (VS) was achieved after 24 days of digestion when SC was 4%, C/N was 28, and F/I was 0.3. The confirmation experiment provided a methane yield of 167 l kg⁻¹ VS after 24 days. The analysis showed that biogas production from food waste may be increased by optimization of operating conditions.
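The percent-contribution figures quoted here come from partitioning ANOVA sums of squares over the orthogonal array. A sketch of that calculation on a hypothetical Taguchi L9 layout (the factor levels and methane yields are made up):

import pandas as pd

# Taguchi L9 design: three factors at three levels each, nine runs
oa9 = pd.DataFrame({
    "SC":  [4, 4, 4, 6, 6, 6, 8, 8, 8],           # solid content (%)
    "CN":  [20, 28, 36, 20, 28, 36, 20, 28, 36],  # C/N ratio
    "FI":  [0.3, 0.5, 1.0, 0.5, 1.0, 0.3, 1.0, 0.3, 0.5],
    "ch4": [151, 140, 120, 135, 118, 128, 95, 110, 102],  # l CH4 / kg VS
})

total_ss = ((oa9.ch4 - oa9.ch4.mean()) ** 2).sum()
for factor in ["SC", "CN", "FI"]:
    level_means = oa9.groupby(factor)["ch4"].mean()
    ss = (3 * (level_means - oa9.ch4.mean()) ** 2).sum()  # 3 runs per level
    print(f"{factor}: {100 * ss / total_ss:.0f}% contribution")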
Mathematical and statistical analysis of the effect of boron on yield parameters of wheat
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rawashdeh, Hamzeh; Sala, Florin; Boldea, Marius
The main objective of this research is to investigate the effect of foliar applications of boron at different growth stages on yield and yield parameters of wheat. The contribution of boron to the yield parameters is described by second-degree polynomial equations with high statistical confidence (p < 0.01; F calculated > F theoretical, according to the ANOVA test, for α = 0.05). Regression analysis, based on the R² values obtained, made it possible to evaluate the particular contribution of boron to the realization of yield parameters. This was lower for spike length (R² = 0.812) and thousand-seed weight (R² = 0.850) and higher for the number of spikelets (R² = 0.936) and the number of seeds per spike (R² = 0.960). These results confirm that boron plays an important part in determining the number of seeds per spike in wheat, as the contribution of this element to the process of flower fertilization is well known. With regard to productivity elements, the contribution of macroelements to yield quantity is clear, the contribution of B alone being R² = 0.868.
Ion induced electron emission statistics under Agm- cluster bombardment of Ag
NASA Astrophysics Data System (ADS)
Breuers, A.; Penning, R.; Wucher, A.
2018-05-01
The electron emission from a polycrystalline silver surface under bombardment with Agm− cluster ions (m = 1, 2, 3) is investigated in terms of ion-induced kinetic excitation. The electron yield γ is determined directly by a current measurement method on the one hand and implicitly by analysis of the electron emission statistics on the other. Successful measurements of the electron emission spectra provide a deeper understanding of the ion-induced kinetic electron emission process, with particular emphasis on the effect of projectile cluster size on the yield as well as on the emission statistics. The results allow a quantitative comparison with computer simulations performed for silver atoms and clusters impinging onto a silver surface.
Johnston, David J; Moreau, Robert A
2017-02-01
The aim of this study was to determine whether the compositional differences between grain sorghum and corn impact ethanol yields and coproduct value when grain sorghum is incorporated into existing corn ethanol facilities. Fermentation properties of corn and grain sorghum were compared using two fermentation systems (conventional thermal starch liquefaction and native starch hydrolysis). Fermentation results indicated that protease addition influenced the fermentation rate and yield for grain sorghum, improving yields by 1-2% over non-protease-treated fermentations. Distillers Dried Grains with Solubles produced from sorghum had statistically significantly higher yields and significantly higher protein content relative to corn. Lipid analysis of the Distillers Dried Grains with Solubles showed statistically significant differences between corn and sorghum in triacylglycerol, diacylglycerol, and free fatty acid levels.
Theoretical analysis of HVAC duct hanger systems
NASA Technical Reports Server (NTRS)
Miller, R. D.
1987-01-01
Several methods are presented which, together, may be used in the analysis of duct hanger systems over a wide range of frequencies. The finite element method (FEM) and component mode synthesis (CMS) method are used for low- to mid-frequency range computations and have been shown to yield reasonably close results. The statistical energy analysis (SEA) method yields predictions which agree with the CMS results for the 800 to 1000 Hz range provided that a sufficient number of modes participate. The CMS approach has been shown to yield valuable insight into the mid-frequency range of the analysis. It has been demonstrated that it is possible to conduct an analysis of a duct/hanger system in a cost-effective way for a wide frequency range, using several methods which overlap for several frequency bands.
Multivariate Statistical Analysis of Cigarette Design Feature Influence on ISO TNCO Yields.
Agnew-Heard, Kimberly A; Lancaster, Vicki A; Bravo, Roberto; Watson, Clifford; Walters, Matthew J; Holman, Matthew R
2016-06-20
The aim of this study is to explore how differences in cigarette physical design parameters influence tar, nicotine, and carbon monoxide (TNCO) yields in mainstream smoke (MSS) using the International Organization for Standardization (ISO) smoking regimen. Standardized smoking methods were used to evaluate 50 U.S. domestic brand cigarettes and a reference cigarette representing a range of TNCO yields in MSS collected from linear smoking machines using a nonintense smoking regimen. Multivariate statistical methods were used to form clusters of cigarettes based on their ISO TNCO yields and then to explore the relationship between the ISO-generated TNCO yields and the nine cigarette physical design parameters between and within each cluster simultaneously. The ISO-generated TNCO yields in MSS are 1.1-17.0 mg tar/cigarette, 0.1-2.2 mg nicotine/cigarette, and 1.6-17.3 mg CO/cigarette. Cluster analysis divided the 51 cigarettes into five discrete clusters based on their ISO TNCO yields. No one physical parameter dominated across all clusters. Predicting ISO machine-generated TNCO yields from these nine physical design parameters is complex due to the correlation among and between the nine physical design parameters and TNCO yields. From these analyses, it is estimated that approximately 20% of the variability in the ISO-generated TNCO yields comes from other parameters (e.g., filter material, filter type, inclusion of expanded or reconstituted tobacco, and tobacco blend composition, along with differences in tobacco leaf origin and stalk positions and added ingredients). A future article will examine the influence of these physical design parameters on TNCO yields under a Canadian Intense (CI) smoking regimen. Together, these papers will provide a more robust picture of the design features that contribute to TNCO exposure across the range of real-world smoking patterns.
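The clustering step can be sketched as below: standardize the three yields, then partition the cigarettes into five groups with k-means. The TNCO values are drawn uniformly from the ranges reported in the abstract, purely for illustration; the study's own multivariate pipeline is not detailed in this record.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)
n = 51
tnco = np.column_stack([
    rng.uniform(1.1, 17.0, n),  # tar (mg/cigarette)
    rng.uniform(0.1, 2.2, n),   # nicotine (mg/cigarette)
    rng.uniform(1.6, 17.3, n),  # CO (mg/cigarette)
])

z = StandardScaler().fit_transform(tnco)  # put the three yields on one scale
km = KMeans(n_clusters=5, n_init=10, random_state=0).fit(z)
for k in range(5):
    members = tnco[km.labels_ == k]
    print(f"cluster {k}: n={len(members)}, mean tar={members[:, 0].mean():.1f}")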
Multi-trait analysis of genome-wide association summary statistics using MTAG.
Turley, Patrick; Walters, Raymond K; Maghzian, Omeed; Okbay, Aysu; Lee, James J; Fontana, Mark Alan; Nguyen-Viet, Tuan Anh; Wedow, Robbee; Zacher, Meghan; Furlotte, Nicholas A; Magnusson, Patrik; Oskarsson, Sven; Johannesson, Magnus; Visscher, Peter M; Laibson, David; Cesarini, David; Neale, Benjamin M; Benjamin, Daniel J
2018-02-01
We introduce multi-trait analysis of GWAS (MTAG), a method for joint analysis of summary statistics from genome-wide association studies (GWAS) of different traits, possibly from overlapping samples. We apply MTAG to summary statistics for depressive symptoms (Neff = 354,862), neuroticism (N = 168,105), and subjective well-being (N = 388,538). As compared to the 32, 9, and 13 genome-wide significant loci identified in the single-trait GWAS (most of which are themselves novel), MTAG increases the number of associated loci to 64, 37, and 49, respectively. Moreover, association statistics from MTAG yield more informative bioinformatics analyses and increase the variance explained by polygenic scores by approximately 25%, matching theoretical expectations.
Statistical and Economic Techniques for Site-specific Nematode Management.
Liu, Zheng; Griffin, Terry; Kirkpatrick, Terrence L
2014-03-01
Recent advances in precision agriculture technologies and spatial statistics allow realistic, site-specific estimation of nematode damage to field crops and provide a platform for the site-specific delivery of nematicides within individual fields. This paper reviews the spatial statistical techniques that model correlations among neighboring observations and develop a spatial economic analysis to determine the potential of site-specific nematicide application. The spatial econometric methodology applied in the context of site-specific crop yield response contributes to closing the gap between data analysis and realistic site-specific nematicide recommendations and helps to provide a practical method of site-specifically controlling nematodes.
A Random Variable Approach to Nuclear Targeting and Survivability
DOE Office of Scientific and Technical Information (OSTI.GOV)
Undem, Halvor A.
We demonstrate a common mathematical formalism for analyzing problems in nuclear survivability and targeting. This formalism, beginning with a random variable approach, can be used to interpret past efforts in nuclear-effects analysis, including targeting analysis. It can also be used to analyze new problems brought about by the post Cold War Era, such as the potential effects of yield degradation in a permanently untested nuclear stockpile. In particular, we illustrate the formalism through four natural case studies or illustrative problems, linking these to actual past data, modeling, and simulation, and suggesting future uses. In the first problem, we illustrate the case of a deterministically modeled weapon used against a deterministically responding target. Classic "Cookie Cutter" damage functions result. In the second problem, we illustrate, with actual target test data, the case of a deterministically modeled weapon used against a statistically responding target. This case matches many of the results of current nuclear targeting modeling and simulation tools, including the result of distance damage functions as complementary cumulative lognormal functions in the range variable. In the third problem, we illustrate the case of a statistically behaving weapon used against a deterministically responding target. In particular, we show the dependence of target damage on weapon yield for an untested nuclear stockpile experiencing yield degradation. Finally, and using actual unclassified weapon test data, we illustrate in the fourth problem the case of a statistically behaving weapon used against a statistically responding target.
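The second problem's distance-damage function, a complementary cumulative lognormal in range, is easy to render with scipy; the median damage radius and dispersion below are illustrative guesses, not values from the report.

import numpy as np
from scipy.stats import lognorm

# Median damage radius (m) and dispersion for a hypothetical target class
r50, sigma = 1500.0, 0.3
ranges = np.array([500, 1000, 1500, 2000, 3000], dtype=float)

# P(damage) falls from ~1 near the burst point toward 0 at long range
p_damage = lognorm.sf(ranges, s=sigma, scale=r50)  # survival function = CCDF
for r, p in zip(ranges, p_damage):
    print(f"range {r:5.0f} m: P(damage) = {p:.3f}")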
Davis, J.C.
2000-01-01
Geologists may feel that geological data are not amenable to statistical analysis, or at best require specialized approaches such as nonparametric statistics and geostatistics. However, there are many circumstances, particularly in systematic studies conducted for environmental or regulatory purposes, where traditional parametric statistical procedures can be beneficial. An example is the application of analysis of variance to data collected in an annual program of measuring groundwater levels in Kansas. Influences such as well conditions, operator effects, and use of the water can be assessed and wells that yield less reliable measurements can be identified. Such statistical studies have resulted in yearly improvements in the quality and reliability of the collected hydrologic data. Similar benefits may be achieved in other geological studies by the appropriate use of classical statistical tools.
Sex differences in discriminative power of volleyball game-related statistics.
João, Paulo Vicente; Leite, Nuno; Mesquita, Isabel; Sampaio, Jaime
2010-12-01
To identify sex differences in volleyball game-related statistics, the game-related statistics of several World Championships in 2007 (N = 132) were analyzed using the software VIS from the International Volleyball Federation. Discriminant analysis was used to identify the game-related statistics which better discriminated performances by sex. The analysis yielded an emphasis on fault serves (SC = -.40), shot spikes (SC = .40), and reception digs (SC = .31). Considerable variability was evident in the game-related statistics profile: men's volleyball games were better associated with terminal actions (errors of service), whereas women's volleyball games were characterized by continuous actions (in defense and attack). These differences may be related to the anthropometric and physiological differences between women and men and their influence on performance profiles.
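A sketch of the discriminant setup, including the structure coefficients (SC) quoted above, computed as correlations between each game statistic and the discriminant scores. The match data are simulated under assumed group means.

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(6)
n = 132
sex = rng.integers(0, 2, n)  # 0 = women, 1 = men
X = np.column_stack([
    rng.normal(12 + 4 * sex, 2),   # fault serves per match
    rng.normal(20 + 5 * sex, 3),   # shot spikes per match
    rng.normal(25 - 3 * sex, 3),   # reception digs per match
])

lda = LinearDiscriminantAnalysis(n_components=1).fit(X, sex)
scores = lda.transform(X).ravel()
# Structure coefficients: variable-by-variable correlation with the function
structure = [np.corrcoef(X[:, j], scores)[0, 1] for j in range(X.shape[1])]
print("structure coefficients:", np.round(structure, 2))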
The role of climatic variables in winter cereal yields: a retrospective analysis.
Luo, Qunying; Wen, Li
2015-02-01
This study examined the effects of observed climate including [CO2] on winter cereal [winter wheat (Triticum aestivum), barley (Hordeum vulgare) and oat (Avena sativa)] yields by adopting robust statistical analysis/modelling approaches (i.e. autoregressive fractionally integrated moving average, generalised addition model) based on long time series of historical climate data and cereal yield data at three locations (Moree, Dubbo and Wagga Wagga) in New South Wales, Australia. Research results show that (1) growing season rainfall was significantly, positively and non-linearly correlated with crop yield at all locations considered; (2) [CO2] was significantly, positively and non-linearly correlated with crop yields in all cases except wheat and barley yields at Wagga Wagga; (3) growing season maximum temperature was significantly, negatively and non-linearly correlated with crop yields at Dubbo and Moree (except for barley); and (4) radiation was only significantly correlated with oat yield at Wagga Wagga. This information will help to identify appropriate management adaptation options in dealing with the risk and in taking the opportunities of climate change.
Lamm, Steven H; Ferdosi, Hamid; Dissen, Elisabeth K; Li, Ji; Ahn, Jaeil
2015-12-07
High levels (> 200 µg/L) of inorganic arsenic in drinking water are known to be a cause of human lung cancer, but the evidence at lower levels is uncertain. We have sought the epidemiological studies that have examined the dose-response relationship between arsenic levels in drinking water and the risk of lung cancer over a range that includes both high and low levels of arsenic. Regression analysis, based on six studies identified from an electronic search, examined the relationship between the log of the relative risk and the log of the arsenic exposure over a range of 1-1000 µg/L. The best-fitting continuous meta-regression model was sought and found to be a no-constant linear-quadratic analysis where both the risk and the exposure had been logarithmically transformed. This yielded both a statistically significant positive coefficient for the quadratic term and a statistically significant negative coefficient for the linear term. Sub-analyses by study design yielded results that were similar for both ecological studies and non-ecological studies. Statistically significant X-intercepts consistently found no increased level of risk at approximately 100-150 µg/L arsenic.
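The no-constant linear-quadratic meta-regression on log-transformed risk and exposure has this shape (the six exposure/relative-risk pairs are invented stand-ins for the pooled studies):

import numpy as np
import statsmodels.api as sm

exposure = np.array([5.0, 20.0, 60.0, 150.0, 400.0, 900.0])   # ug/L arsenic
rel_risk = np.array([0.95, 0.90, 0.92, 1.00, 1.60, 3.10])

x = np.log(exposure)
X = np.column_stack([x, x ** 2])        # linear + quadratic, no constant term
fit = sm.OLS(np.log(rel_risk), X).fit()
print(fit.params)   # expect a negative linear, positive quadratic coefficient

# Nonzero X-intercept: log RR = b1*x + b2*x^2 = 0  =>  x = -b1/b2
x0 = -fit.params[0] / fit.params[1]
print("no-excess-risk exposure ~", np.exp(x0), "ug/L")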
Hosseini Koupaie, E; Barrantes Leiva, M; Eskicioglu, C; Dutil, C
2014-01-01
The feasibility of anaerobic co-digestion of two juice-based beverage industrial wastes, screen cake (SC) and thickened waste activated sludge (TWAS), along with municipal sludge cake (MC) was investigated. Experiments were conducted in twenty mesophilic 160 mL batch serum bottles, with no inhibition observed. Statistical analysis showed that substrate type had a statistically significant effect on both ultimate biogas and methane yields (P = 0.0003 < 0.05). The maximum and minimum ultimate cumulative methane yields were 890.90 and 308.34 mL/g-VS removed, from the digesters containing only TWAS and only SC as substrate, respectively. A first-order reaction model described VS utilization well in all digesters. The first 2-day and 10-day specific biodegradation rate constants were statistically higher in the digesters containing SC (P = 0.004 < 0.05) and MC (P = 0.0005 < 0.05), respectively. A cost-benefit analysis showed that capital, operating, and total costs can be decreased by 21.5%, 29.8%, and 27.6%, respectively, by using a single co-digester rather than two separate digesters.
Effects of Hydrological Parameters on Palm Oil Fresh Fruit Bunch Yield
NASA Astrophysics Data System (ADS)
Nda, M.; Adnan, M. S.; Suhadak, M. A.; Zakaria, M. S.; Lopa, R. T.
2018-04-01
Climate change effects and variability have been studied by many researchers in diverse geophysical fields. Malaysia produces a large volume of palm oil, and the effects of climate change on hydrological parameters (rainfall and temperature) could adversely affect palm oil fresh fruit bunch (FFB) production, with implications for both local and international markets. It is important to understand the effects of climate change on crop yield in order to adopt new cultivation techniques and guarantee food security globally. Against this background, the paper's objective is to investigate the effects of rainfall and temperature patterns on crop yield (FFB) over a five-year period (2013-2017) in the Batu Pahat District. The Mann-Kendall rank technique (trend test) and statistical analyses (correlation and regression) were applied to the dataset used for the study. The results reveal month-to-month variability in rainfall and temperature, and the statistical analysis shows that these hydrological parameters have no significant effect on crop yield.
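A hand-rolled Mann-Kendall test of the kind applied here, shown on a made-up monthly FFB yield series: S counts concordant minus discordant pairs over time and is compared with its variance under the no-trend null (ties ignored for brevity).

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(7)
y = 100 + 0.1 * np.arange(60) + rng.normal(0, 5, 60)  # 5 years, monthly

n = len(y)
s = sum(np.sign(y[j] - y[i]) for i in range(n - 1) for j in range(i + 1, n))
var_s = n * (n - 1) * (2 * n + 5) / 18          # variance under H0, no ties
z = (s - np.sign(s)) / np.sqrt(var_s)            # continuity correction
p = 2 * norm.sf(abs(z))
print(f"S = {s}, z = {z:.2f}, two-sided p = {p:.3f}")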
Qumseya, Bashar J; Wang, Haibo; Badie, Nicole; Uzomba, Rosemary N; Parasa, Sravanthi; White, Donna L; Wolfsen, Herbert; Sharma, Prateek; Wallace, Michael B
2013-12-01
US guidelines recommend surveillance of patients with Barrett's esophagus (BE) to detect dysplasia. BE conventionally is monitored via white-light endoscopy (WLE) and a collection of random biopsy specimens. However, this approach does not definitively or consistently detect areas of dysplasia. Advanced imaging technologies can increase the detection of dysplasia and cancer. We investigated whether these imaging technologies can increase the diagnostic yield for the detection of neoplasia in patients with BE, compared with WLE and analysis of random biopsy specimens. We performed a systematic review, using Medline and Embase, to identify relevant peer-reviewed studies. Fourteen studies were included in the final analysis, with a total of 843 patients. Our metameter (estimate) of interest was the paired risk difference (RD), defined as the difference in yield of the detection of dysplasia or cancer using advanced imaging vs WLE. The estimated paired RD and 95% confidence interval (CI) were obtained using random-effects models. Heterogeneity was assessed by means of the Q statistic and the I² statistic. An exploratory meta-regression was performed to look for associations between the metameter and potential confounders or modifiers. Overall, advanced imaging techniques increased the diagnostic yield for detection of dysplasia or cancer by 34% (95% CI, 20%-56%; P < .0001). A subgroup analysis showed that virtual chromoendoscopy significantly increased the diagnostic yield (RD, 0.34; 95% CI, 0.14-0.56; P < .0001). The RD for chromoendoscopy was 0.35 (95% CI, 0.13-0.56; P = .0001). There was no significant difference between virtual chromoendoscopy and chromoendoscopy, based on Student t test analysis (P = .45). Based on a meta-analysis, advanced imaging techniques such as chromoendoscopy or virtual chromoendoscopy significantly increase the diagnostic yield for identification of dysplasia or cancer in patients with BE.
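The random-effects pooling with Q and I² heterogeneity statistics described here can be sketched with a DerSimonian-Laird estimator; the per-study risk differences and standard errors below are placeholders, not the fourteen studies' data.

import numpy as np

rd = np.array([0.28, 0.41, 0.15, 0.50, 0.33, 0.37])   # per-study paired RDs
se = np.array([0.10, 0.12, 0.09, 0.15, 0.11, 0.13])

w = 1 / se ** 2                                   # fixed-effect weights
rd_fe = (w * rd).sum() / w.sum()
q = (w * (rd - rd_fe) ** 2).sum()                 # Cochran's Q
df = len(rd) - 1
i2 = max(0.0, (q - df) / q) * 100                 # I^2 (%)
tau2 = max(0.0, (q - df) / (w.sum() - (w ** 2).sum() / w.sum()))

w_re = 1 / (se ** 2 + tau2)                       # random-effects weights
rd_re = (w_re * rd).sum() / w_re.sum()
ci = 1.96 / np.sqrt(w_re.sum())
print(f"pooled RD = {rd_re:.2f} (95% CI {rd_re - ci:.2f} to {rd_re + ci:.2f}),"
      f" Q = {q:.1f}, I^2 = {i2:.0f}%")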
Statistical analysis of CSP plants by simulating extensive meteorological series
NASA Astrophysics Data System (ADS)
Pavón, Manuel; Fernández, Carlos M.; Silva, Manuel; Moreno, Sara; Guisado, María V.; Bernardos, Ana
2017-06-01
The feasibility analysis of any power plant project needs an estimate of the amount of energy it will be able to deliver to the grid during its lifetime. To achieve this, its feasibility study requires precise knowledge of the solar resource over a long-term period. In Concentrating Solar Power (CSP) projects, financing institutions typically require several statistical probability-of-exceedance scenarios of the expected electric energy output. Currently, the industry assumes a correlation between probabilities of exceedance of annual Direct Normal Irradiance (DNI) and energy yield. In this work, this assumption is tested by simulating the energy yield of CSP plants using as input a 34-year series of measured meteorological parameters and solar irradiance. The results of this work show that, even if some correspondence between the probabilities of exceedance of annual DNI values and energy yields is found, the intra-annual distribution of DNI may significantly affect this correlation. This result highlights the need for standardized procedures for the elaboration of DNI time series representative of a given probability of exceedance of annual DNI.
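Reading probability-of-exceedance scenarios off a simulated series amounts to taking percentiles: the P90 yield is the value exceeded in 90% of simulated years. A minimal sketch with stand-in annual energy yields:

import numpy as np

rng = np.random.default_rng(8)
annual_gwh = rng.normal(120, 12, 34)     # simulated net electric yield (GWh)

for p in (50, 75, 90):
    # value exceeded with probability p% = (100 - p)th percentile
    print(f"P{p} yield: {np.percentile(annual_gwh, 100 - p):.1f} GWh")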
Yield of bedrock wells in the Nashoba terrane, central and eastern Massachusetts
DeSimone, Leslie A.; Barbaro, Jeffrey R.
2012-01-01
The yield of bedrock wells in the fractured-bedrock aquifers of the Nashoba terrane and surrounding area, central and eastern Massachusetts, was investigated with analyses of existing data. Reported well yield was compiled for 7,287 wells from Massachusetts Department of Environmental Protection and U.S. Geological Survey databases. Yield of these wells ranged from 0.04 to 625 gallons per minute. In a comparison with data from 103 supply wells, yield and specific capacity from aquifer tests were well correlated, indicating that reported well yield was a reasonable measure of aquifer characteristics in the study area. Statistically significant relations were determined between well yield and a number of cultural and hydrogeologic factors. Cultural variables included intended water use, well depth, year of construction, and method of yield measurement. Bedrock geology, topography, surficial geology, and proximity to surface waters were statistically significant hydrogeologic factors. Yield of wells was higher in areas of granites, mafic intrusive rocks, and amphibolites than in areas of schists and gneisses or pelitic rocks; higher in valleys and low-slope areas than on hills, ridges, or high slopes; higher in areas overlain by stratified glacial deposits than in areas overlain by till; and higher in close proximity to streams, ponds, and wetlands than at greater distances from these surface-water features. Proximity to mapped faults and to lineaments from aerial photographs also were related to well yield by some measures in three quadrangles in the study area. Although the statistical significance of these relations was high, their predictive power was low, and these relations explained little of the variability in the well-yield data. Similar results were determined from a multivariate regression analysis. Multivariate regression models for the Nashoba terrane and for a three-quadrangle subarea included, as significant variables, many of the cultural and hydrogeologic factors that were individually related to well yield, in ways that are consistent with conceptual understanding of their effects, but the models explained only 21 percent (regional model for the entire terrane) and 30 percent (quadrangle model) of the overall variance in yield. Moreover, most of the explained variance was due to well characteristics rather than hydrogeologic factors. Hydrogeologic factors such as topography and geology are likely important. However, the overall high variability in the well-yield data, which results from the high variability in aquifer hydraulic properties as well as from limitations of the dataset, would make it difficult to use hydrogeologic factors to predict well yield in the study area. Geostatistical analysis (variograms), on the other hand, indicated that, although highly variable, the well-yield data are spatially correlated. The spatial continuity appears greater in the northeast-southwest direction and less in the southeast-northwest direction, directions that are parallel and perpendicular, respectively, to the regional geologic structural trends. Geostatistical analysis (kriging), used to estimate yield values throughout the study area, identified regional-scale areas of higher and lower yield that may be related to regional structural features—in particular, to a northeast-southwest trending regional fault zone within the Nashoba terrane. 
It would also be difficult to use kriging to predict yield at specific locations, however, because of the spatial variability in yield, particularly at small scales. The regional-scale analyses in this study, both with hydrogeologic variables and geostatistics, provide a context for understanding the variability in well yield rather than a basis for precise predictions, and site-specific information would be needed to understand local conditions.
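An empirical semivariogram, the building block behind the variogram and kriging analysis above, is half the mean squared difference of yields binned by separation distance. A sketch on synthetic well locations and yields:

import numpy as np

rng = np.random.default_rng(9)
n = 200
xy = rng.uniform(0, 50, (n, 2))                     # well locations (km)
yld = 10 + 0.2 * xy[:, 0] + rng.lognormal(0, 1, n)  # skewed well yields (gpm)

i, j = np.triu_indices(n, k=1)
h = np.hypot(*(xy[i] - xy[j]).T)                    # pairwise separations
gamma = 0.5 * (yld[i] - yld[j]) ** 2                # semivariance per pair

bins = np.arange(0, 30, 5)
for lo, hi in zip(bins[:-1], bins[1:]):
    sel = (h >= lo) & (h < hi)
    print(f"lag {lo:2.0f}-{hi:2.0f} km: gamma = {gamma[sel].mean():6.1f}")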
Statistical analysis of flight times for space shuttle ferry flights
NASA Technical Reports Server (NTRS)
Graves, M. E.; Perlmutter, M.
1974-01-01
Markov chain and Monte Carlo analysis techniques are applied to the simulated Space Shuttle Orbiter Ferry flights to obtain statistical distributions of flight time duration between Edwards Air Force Base and Kennedy Space Center. The two methods are compared, and are found to be in excellent agreement. The flights are subjected to certain operational and meteorological requirements, or constraints, which cause eastbound and westbound trips to yield different results. Persistence of events theory is applied to the occurrence of inclement conditions to find their effect upon the statistical flight time distribution. In a sensitivity test, some of the constraints are varied to observe the corresponding changes in the results.
A Measurement of the Ds+ lifetime
DOE Office of Scientific and Technical Information (OSTI.GOV)
Link, J.M.; Yager, P.M.; /UC, Davis
2005-04-01
A high-statistics measurement of the Ds+ lifetime from the Fermilab fixed-target FOCUS photoproduction experiment is presented. They describe the analysis of the two decay modes, Ds+ → φ(1020)π+ and Ds+ → K̄*(892)0 K+, used for the measurement. The measured lifetime is 507.4 ± 5.5 (stat.) ± 5.1 (syst.) fs, using 8961 ± 105 Ds+ → φ(1020)π+ and 4680 ± 90 Ds+ → K̄*(892)0 K+ decays. This is a significant improvement over the present world average.
An Application of M[subscript 2] Statistic to Evaluate the Fit of Cognitive Diagnostic Models
ERIC Educational Resources Information Center
Liu, Yanlou; Tian, Wei; Xin, Tao
2016-01-01
The fit of cognitive diagnostic models (CDMs) to response data needs to be evaluated, since CDMs might yield misleading results when they do not fit the data well. Limited-information statistic M[subscript 2] and the associated root mean square error of approximation (RMSEA[subscript 2]) in item factor analysis were extended to evaluate the fit of…
NASA Technical Reports Server (NTRS)
Morain, S. A. (Principal Investigator); Williams, D. L.
1974-01-01
The author has identified the following significant results. Wheat area, yield, and production statistics as derived from satellite image analysis, combined with a weather model, are presented for a ten county area in southwest Kansas. The data (representing the 1972-73 crop year) are compared for accuracy against both the USDA August estimate and its final (official) tabulation. The area estimates from imagery for both dryland and irrigated winter wheat were within 5% of the official figures for the same area, and predated them by almost one year. Yield on dryland wheat was estimated by the Thompson weather model to within 0.1% of the observed yield. A combined irrigated and dryland wheat production estimate for the ten county area was completed in July, 1973 and was within 1% of the production reported by USDA in February, 1974.
A Comparison of Imputation Methods for Bayesian Factor Analysis Models
ERIC Educational Resources Information Center
Merkle, Edgar C.
2011-01-01
Imputation methods are popular for the handling of missing data in psychology. The methods generally consist of predicting missing data based on observed data, yielding a complete data set that is amiable to standard statistical analyses. In the context of Bayesian factor analysis, this article compares imputation under an unrestricted…
An efficient scan diagnosis methodology according to scan failure mode for yield enhancement
NASA Astrophysics Data System (ADS)
Kim, Jung-Tae; Seo, Nam-Sik; Oh, Ghil-Geun; Kim, Dae-Gue; Lee, Kyu-Taek; Choi, Chi-Young; Kim, InSoo; Min, Hyoung Bok
2008-12-01
Yield has always been a driving consideration in modern semiconductor fabrication. Statistically, the largest portion of wafer yield loss comes from defective scan failures. This paper presents efficient failure analysis methods for initial yield ramp-up and ongoing products using scan diagnosis. Our analysis shows that more than 60% of the scan failure dies fall into the category of shift mode in very deep submicron (VDSM) devices. However, localization of scan shift mode failures is very difficult in comparison to capture mode failures because they are caused by malfunction of the scan chain. Addressing this challenge, we propose the most suitable analysis method according to scan failure mode (capture/shift) for yield enhancement. For capture failure mode, this paper describes a method that integrates the scan diagnosis flow with backside probing technology to obtain more accurate candidates. We also describe several unique techniques, such as a bulk back-grinding solution and efficient backside probing and signal analysis methods. Lastly, we introduce a blocked-chain analysis algorithm for efficient analysis of shift failure mode. The combination of these two methods contributes to yield enhancement. We confirm the failure candidates with physical failure analysis (PFA) methods. The direct feedback from visualizing defects is useful for mass-producing devices in a shorter time. The experimental data on mass products show that our method reduces defective SCAN & SRAM-BIST failure rates by an average of 13.7% and improves wafer yield rates by 18.2%.
Biometric Analysis – A Reliable Indicator for Diagnosing Taurodontism using Panoramic Radiographs
Hegde, Veda; Anegundi, Rajesh Trayambhak; Pravinchandra, K.R.
2013-01-01
Background: Taurodontism is a clinical entity with a morpho-anatomical change in the shape of the tooth, which was thought to be absent in modern man. Taurodontism is mostly observed as an isolated trait or as a component of a syndrome. Various techniques have been devised to diagnose taurodontism. Aim: The aim of this study was to analyze whether a biometric analysis was useful in diagnosing taurodontism in radiographs which appeared to be normal on cursory observation. Setting and Design: This study was carried out in our institution using radiographs which were taken for routine procedures. Material and Methods: In this retrospective study, panoramic radiographs were obtained from the dental records of children aged between 9–14 years who did not have any abnormality on cursory observation. Biometric analyses were carried out on permanent mandibular first molar(s) using a novel biometric method. The values were tabulated and analysed. Statistics: The Fisher exact probability test, Chi-square test, and Chi-square test with Yates correction were used for statistical analysis of the data. Results: Cursory observation did not yield any case of taurodontism. In contrast, the biometric analysis yielded a statistically significant number of cases of taurodontism. However, there was no statistically significant difference in the number of cases with taurodontism between the genders or across the age group considered. Conclusion: Thus, taurodontism was diagnosed on biometric analysis which was otherwise missed on cursory observation. It is therefore necessary, from the clinical point of view, to diagnose even the mildest form of taurodontism by using metric analysis rather than just relying on a visual radiographic assessment, as its occurrence has many clinical implications and diagnostic importance. PMID:24086912
NASA Technical Reports Server (NTRS)
Stefanski, Philip L.
2015-01-01
Commercially available software packages today allow users to quickly perform the routine evaluations of (1) descriptive statistics to numerically and graphically summarize both sample and population data, (2) inferential statistics that draw conclusions about a given population from samples taken of it, (3) probability determinations that can be used to generate estimates of reliability allowables, and finally (4) the setup of designed experiments and analysis of their data to identify significant material and process characteristics for application in both product manufacturing and performance enhancement. This paper presents examples of analysis and experimental design work that has been conducted using Statgraphics® statistical software to obtain useful information with regard to solid rocket motor propellants and internal insulation material. Data were obtained from a number of programs (Shuttle, Constellation, and Space Launch System) and sources that include solid propellant burn rate strands, tensile specimens, sub-scale test motors, full-scale operational motors, rubber insulation specimens, and sub-scale rubber insulation analog samples. Besides facilitating the experimental design process to yield meaningful results, statistical software has demonstrated its ability to quickly perform complex data analyses and yield significant findings that might otherwise have gone unnoticed. One caveat to these successes is that useful results derive not only from the inherent power of the software package, but also from the skill and understanding of the data analyst.
Alkarkhi, Abbas F M; Ramli, Saifullah Bin; Easa, Azhar Mat
2009-01-01
Major elements (sodium, potassium, calcium, magnesium), minor elements (iron, copper, zinc, manganese) and one heavy metal (lead) of Cavendish banana flour and Dream banana flour were determined, and the data were analyzed using the multivariate statistical techniques of factor analysis and discriminant analysis. Factor analysis yielded four factors explaining more than 81% of the total variance: the first factor explained 28.73%, comprising magnesium, sodium, and iron; the second factor explained 21.47%, comprising only manganese and copper; the third factor explained 15.66%, comprising zinc and lead; while the fourth factor explained 15.50%, comprising potassium. Discriminant analysis showed that magnesium and sodium exhibited a strong contribution in discriminating the two types of banana flour, affording 100% correct assignation. This study demonstrates the usefulness of multivariate statistical techniques for the analysis and interpretation of complex mineral-content data from banana flours of different varieties.
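For readers who want to reproduce this kind of workflow, the sketch below chains factor analysis and linear discriminant analysis with scikit-learn; the mineral matrix and labels are synthetic stand-ins, not the study's data.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.preprocessing import StandardScaler

# X: samples x 9 mineral concentrations (Na, K, Ca, Mg, Fe, Cu, Zn, Mn, Pb);
# y: flour variety labels (0 = Cavendish-like, 1 = Dream-like). Synthetic here.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 9))
y = np.repeat([0, 1], 20)
X[y == 1, :2] += 1.5          # shift two variables so the groups separate

Z = StandardScaler().fit_transform(X)
fa = FactorAnalysis(n_components=4).fit(Z)    # four factors, as in the study
print("loadings shape:", fa.components_.shape)

lda = LinearDiscriminantAnalysis().fit(Z, y)
print("resubstitution accuracy:", lda.score(Z, y))  # analogue of correct assignation
```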
Econophysical visualization of Adam Smith’s invisible hand
NASA Astrophysics Data System (ADS)
Cohen, Morrel H.; Eliazar, Iddo I.
2013-02-01
Consider a complex system whose macrostate is statistically observable, yet whose operating mechanism is an unknown black box. In this paper we address the problem of inferring, from the system's macrostate statistics, the intrinsic force yielding the observed statistics. The inference is established via two diametrically opposite approaches which result in the very same intrinsic force: a top-down approach based on the notion of entropy, and a bottom-up approach based on the notion of Langevin dynamics. The general results established are applied to the problem of visualizing the intrinsic socioeconomic force, Adam Smith's invisible hand, shaping the distribution of wealth in human societies. Our analysis yields quantitative econophysical representations of figurative socioeconomic forces, quantitative definitions of "poor" and "rich", and a quantitative characterization of the "poor-get-poorer" and "rich-get-richer" phenomena.
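The bottom-up route can be summarized in one standard relation: for overdamped Langevin dynamics, the stationary density determines the force up to the noise level D. A minimal LaTeX statement of this textbook result (the notation is ours, not necessarily the paper's):

```latex
\begin{aligned}
  dX_t &= -U'(X_t)\,dt + \sqrt{2D}\,dW_t ,\\
  p(x) &\propto e^{-U(x)/D}
  \;\Longrightarrow\;
  F(x) = -U'(x) = D\,\frac{\mathrm{d}}{\mathrm{d}x}\,\ln p(x).
\end{aligned}
```

Reading the observed wealth distribution as p(x) then turns the "invisible hand" into an explicit force profile F(x).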
NASA Astrophysics Data System (ADS)
Donges, J. F.; Schleussner, C.-F.; Siegmund, J. F.; Donner, R. V.
2016-05-01
Studying event time series is a powerful approach for analyzing the dynamics of complex dynamical systems in many fields of science. In this paper, we describe the method of event coincidence analysis, which provides a framework for quantifying the strength, directionality and time lag of statistical interrelationships between event series. Event coincidence analysis allows one to formulate and test null hypotheses on the origin of the observed interrelationships, including tests based on Poisson processes or, more generally, stochastic point processes with a prescribed inter-event time distribution and other higher-order properties. Applying the framework to country-level observational data yields evidence that flood events have acted as triggers of epidemic outbreaks globally since the 1950s. Facing projected future changes in the statistics of climatic extreme events, statistical techniques such as event coincidence analysis will be relevant for investigating the impacts of anthropogenic climate change on human societies and ecosystems worldwide.
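A minimal numerical version of the idea: count how often events in one series are preceded by an event in the other within a tolerance window, then compare against surrogates with the same number of events placed uniformly at random (a Poisson-process analogue). The function names, window parameters and toy event times below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def precursor_coincidence_rate(a, b, delta_t, tau=0.0):
    """Fraction of events in series `a` preceded by >= 1 event of `b`
    within the window [t - tau - delta_t, t - tau]."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    hits = sum(np.any((b >= t - tau - delta_t) & (b <= t - tau)) for t in a)
    return hits / len(a)

def poisson_null_p(a, b, delta_t, t_max, n_surr=2000, seed=1):
    """One-sided p-value from uniform (Poisson-like) surrogates of `b`."""
    rng = np.random.default_rng(seed)
    obs = precursor_coincidence_rate(a, b, delta_t)
    surr = [precursor_coincidence_rate(a, rng.uniform(0, t_max, len(b)), delta_t)
            for _ in range(n_surr)]
    return float(np.mean([s >= obs for s in surr]))

floods    = np.array([2.0, 11.5, 30.1, 47.8])   # toy event times (years)
outbreaks = np.array([2.6, 12.0, 30.5, 55.0])
print(poisson_null_p(outbreaks, floods, delta_t=1.0, t_max=60.0))
```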
Li, Li; Paulo, Maria-João; van Eeuwijk, Fred
2010-01-01
Association mapping using DNA-based markers is a novel tool in plant genetics for the analysis of complex traits. Potato tuber yield, starch content, starch yield and chip color are complex traits of agronomic relevance, for which carbohydrate metabolism plays an important role. At the functional level, the genes and biochemical pathways involved in carbohydrate metabolism are among the best studied in plants. Quantitative traits such as tuber starch and sugar content are therefore models for association genetics in potato based on candidate genes. In an association mapping experiment conducted with a population of 243 tetraploid potato varieties and breeding clones, we previously identified associations between individual candidate gene alleles and tuber starch content, starch yield and chip quality. In the present paper, we tested 190 DNA markers at 36 loci scored in the same association mapping population for pairwise statistical epistatic interactions. Fifty marker pairs were associated mainly with tuber starch content and/or starch yield, at a cut-off value of q ≤ 0.20 for the experiment-wide false discovery rate (FDR). Thirteen marker pairs had an FDR of q ≤ 0.10. Alleles at loci encoding ribulose-bisphosphate carboxylase/oxygenase activase (Rca), sucrose phosphate synthase (Sps) and vacuolar invertase (Pain1) were most frequently involved in statistical epistatic interactions. The largest effect on tuber starch content and starch yield was observed for the paired alleles Pain1-8c and Rca-1a, explaining 9 and 10% of the total variance, respectively. The combination of these two alleles increased the means of tuber starch content and starch yield. Biological models to explain the observed statistical epistatic interactions are discussed. Electronic supplementary material The online version of this article (doi:10.1007/s00122-010-1389-3) contains supplementary material, which is available to authorized users. PMID:20603706
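The scan for pairwise statistical epistasis can be sketched as follows: for each pair of markers, fit a linear model with an interaction term, collect the interaction p-values, and control the experiment-wide FDR. The Benjamini-Hochberg adjustment below is a common stand-in for the q-value procedure the authors used; marker names and effect sizes are hypothetical.

```python
import itertools
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.multitest import multipletests

# Hypothetical inputs: `geno` holds 0/1 allele-presence scores for markers,
# `trait` the tuber starch content of each clone.
rng = np.random.default_rng(2)
geno = pd.DataFrame(rng.integers(0, 2, size=(243, 6)),
                    columns=[f"m{i}" for i in range(6)])
trait = 0.8 * geno["m0"] * geno["m1"] + rng.normal(size=243)  # built-in epistasis

pvals, pairs = [], []
for a, b in itertools.combinations(geno.columns, 2):
    df = geno[[a, b]].assign(y=trait.values)
    fit = smf.ols(f"y ~ {a} * {b}", data=df).fit()
    pvals.append(fit.pvalues[f"{a}:{b}"])     # p-value of the interaction term
    pairs.append((a, b))

reject, adj_p, *_ = multipletests(pvals, alpha=0.20, method="fdr_bh")
print([p for p, r in zip(pairs, reject) if r])  # pairs passing the 0.20 FDR cut
```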
Hijri, Mohamed
2016-04-01
An increasing human population requires more food production in nutrient-efficient systems in order to meet global food needs while reducing the environmental footprint of agriculture. Arbuscular mycorrhizal fungi (AMF) have the potential to enhance crop yield, but their efficiency had yet to be demonstrated in large-scale crop production systems. This study reports an analysis of a dataset consisting of 231 field trials in which the same AMF inoculant (Rhizophagus irregularis DAOM 197198) was applied to potato over a 4-year period in North America and Europe under authentic field conditions. The inoculation was performed using a liquid suspension of AMF spores sprayed onto potato seed pieces, yielding a calculated 71 spores per seed piece. Statistical analysis showed a highly significant increase in marketable potato yield (ANOVA, P < 0.0001) for inoculated fields (42.2 tons/ha) compared with non-inoculated controls (38.3 tons/ha), irrespective of trial year. The average yield increase was 3.9 tons/ha, representing 9.5% of total crop yield. Inoculation became profitable at a yield increase of 0.67 tons/ha, a threshold reached in almost 79% of all trials. This finding clearly demonstrates the benefits of mycorrhizal inoculation on crop yield, using potato as a case study. Further improvements of these beneficial inoculants will help compensate for crop production deficits, both now and in the future.
An Analysis of Construction Contractor Performance Evaluation System
2009-03-01
Summary of Determinant and KMO Values for Finalized… principal component analysis output is the KMO and Bartlett's Test. KMO, or the Kaiser-Meyer-Olkin measure of sampling adequacy, is used to identify whether a… set of variables, when factored together, yields distinct and reliable factors (Field, 2005). KMO statistics vary between values of 0 and 1.
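As an aside, the KMO statistic mentioned in this snippet can be computed directly with the third-party factor_analyzer package; the variable names below are hypothetical stand-ins for the contractor-evaluation data.

```python
import numpy as np
import pandas as pd
from factor_analyzer.factor_analyzer import calculate_kmo

# Synthetic stand-in for the contractor-evaluation variables: one shared
# latent dimension plus noise, so the data should be factorable.
rng = np.random.default_rng(3)
latent = rng.normal(size=(120, 1))
X = pd.DataFrame(latent + 0.5 * rng.normal(size=(120, 4)),
                 columns=["quality", "schedule", "cost", "safety"])

kmo_per_item, kmo_total = calculate_kmo(X)
print(round(kmo_total, 2))   # values near 1 indicate factorable data;
                             # Kaiser's rule of thumb treats < 0.5 as unacceptable
```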
Negative impacts of climate change on cereal yields: statistical evidence from France
NASA Astrophysics Data System (ADS)
Gammans, Matthew; Mérel, Pierre; Ortiz-Bobea, Ariel
2017-05-01
In several world regions, climate change is predicted to negatively affect crop productivity. The recent statistical yield literature emphasizes the importance of flexibly accounting for the distribution of growing-season temperatures to better represent the effects of warming on crop yields. We estimate a flexible statistical yield model using a long panel from France to investigate the impacts of temperature and precipitation changes on wheat and barley yields. Winter varieties appear sensitive to extreme cold after planting. All yields respond negatively to an increase in spring-summer temperatures and are a decreasing function of precipitation in the neighborhood of historical precipitation levels. Crop yields are predicted to be negatively affected by climate change under a wide range of climate models and emissions scenarios. Under warming scenario RCP8.5, and holding growing areas and technology constant, our model ensemble predicts a 21.0% decline in winter wheat yield, a 17.3% decline in winter barley yield, and a 33.6% decline in spring barley yield by the end of the century. Uncertainty from climate projections dominates uncertainty from the statistical model. Finally, our model predicts that continuing technology trends would counterbalance most of the effects of climate change.
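One common way to "flexibly account" for the temperature distribution is to regress yields on the number of days the growing season spends in each temperature interval. The sketch below shows the shape of such a design matrix with synthetic data; the bin edges, quadratic precipitation terms, and all coefficients are illustrative assumptions, not the paper's specification.

```python
import numpy as np
import statsmodels.api as sm

def exposure_bins(daily_temps: np.ndarray, edges: np.ndarray) -> np.ndarray:
    """Days of the growing season spent in each temperature interval."""
    counts, _ = np.histogram(daily_temps, bins=edges)
    return counts

edges = np.arange(-10, 41, 5)          # 5 degree C bins from -10 to 40
rng = np.random.default_rng(4)

# Hypothetical panel: one row of bin exposures plus precipitation terms
# per department-year; `yield_t` is the detrended yield.
X = np.vstack([exposure_bins(rng.normal(12, 8, 270), edges) for _ in range(300)])
precip = rng.gamma(4.0, 50.0, 300)
design = sm.add_constant(np.column_stack([X, precip, precip**2]))
yield_t = design @ rng.normal(0, 0.02, design.shape[1]) + rng.normal(0, 0.3, 300)

fit = sm.OLS(yield_t, design).fit()
print(fit.params[:5])   # per-bin marginal effect of one extra day of exposure
```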
Statistical parsimony networks and species assemblages in cephalotrichid nemerteans (Nemertea).
Chen, Haixia; Strand, Malin; Norenburg, Jon L; Sun, Shichun; Kajihara, Hiroshi; Chernyshev, Alexey V; Maslakova, Svetlana A; Sundberg, Per
2010-09-21
It has been suggested that statistical parsimony network analysis can be used to get an indication of the species represented in a set of nucleotide data, and the approach has been used to discuss species boundaries in some taxa. Based on 635 base pairs of the mitochondrial protein-coding gene cytochrome c oxidase I (COI), we analyzed 152 nemertean specimens using statistical parsimony network analysis with the connection probability set to 95%. The analysis revealed 15 distinct networks together with seven singletons. Statistical parsimony yielded three networks supporting the species status of Cephalothrix rufifrons, C. major and C. spiralis as they currently are delineated by morphological characters and geographical location. Many other networks contained haplotypes from nearby geographical locations. Cladistic structure inferred by maximum likelihood analysis overall supported the network analysis, but indicated a false positive result where subnetworks should have been connected into one network/species; this is probably caused by undersampling of the intraspecific haplotype diversity. Statistical parsimony network analysis provides a rapid and useful tool for detecting possible undescribed/cryptic species among cephalotrichid nemerteans based on the COI gene. It should be combined with phylogenetic analysis to flag false positive results, i.e., subnetworks that would have been connected with more extensive haplotype sampling.
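Conceptually, a parsimony network groups haplotypes that can be connected by fewer substitutions than a parsimony limit. The sketch below does this naively with pairwise Hamming distances and connected components; the fixed `limit` stands in for the distance implied by the 95% connection probability, and the toy sequences are invented.

```python
import networkx as nx

def parsimony_networks(haplotypes, limit):
    """Group aligned haplotypes into networks whose members differ by at
    most `limit` substitutions (a crude stand-in for TCS's 95% criterion)."""
    g = nx.Graph()
    g.add_nodes_from(range(len(haplotypes)))
    for i in range(len(haplotypes)):
        for j in range(i + 1, len(haplotypes)):
            dist = sum(a != b for a, b in zip(haplotypes[i], haplotypes[j]))
            if dist <= limit:
                g.add_edge(i, j)
    return list(nx.connected_components(g))

seqs = ["ACGTACGT", "ACGTACGA", "TTGTCCGA", "TTGACCGA", "GGCCGGCC"]
print(parsimony_networks(seqs, limit=2))
# -> e.g. [{0, 1}, {2, 3}, {4}]; unconnected haplotypes appear as singletons
```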
Fadil, Mouhcine; Farah, Abdellah; Ihssane, Bouchaib; Haloui, Taoufik; Lebrazi, Sara; Zghari, Badreddine; Rachiq, Saâd
2016-01-01
To investigate the effect of environmental factors such as light and shade on the essential oil yield and morphological traits of Moroccan Myrtus communis, a chemometric study was conducted on 20 individuals growing under two contrasting light environments. Principal component analysis of the individuals' parameters showed that essential oil yield, altitude, and leaf thickness were positively correlated with one another and negatively correlated with plant height, leaf length and leaf width. Principal component analysis and hierarchical cluster analysis also showed that the individuals of each sampling site grouped separately. A one-way ANOVA test confirmed the effect of light and shade on essential oil yield and the morphological parameters, showing a statistically significant difference between the shaded side and the sunny one. Finally, a multiple linear model containing main, interaction and quadratic terms was chosen for modeling essential oil yield in terms of the morphological parameters. Sun plants are shorter, with smaller leaf length and width, but their leaves are thicker and richer in essential oil than those of shade plants, which showed almost the opposite. The fitted multiple linear model can be used to predict essential oil yield in the studied area.
Persistence of space radiation induced cytogenetic damage in the blood lymphocytes of astronauts.
George, K; Chappell, L J; Cucinotta, F A
2010-08-14
Cytogenetic damage was assessed in blood lymphocytes from 16 astronauts before and after they participated in long-duration space missions of 3 months or more. The frequency of chromosome damage was measured by fluorescence in situ hybridization (FISH) chromosome painting before flight and at various intervals from a few days to many months after return from the mission. For all individuals, the frequency of chromosome exchanges measured within a month of return from space was higher than their preflight yield. However, some individuals showed a temporal decline in chromosome damage with time after flight. Statistical analysis using combined data for all astronauts indicated a significant overall decreasing trend in total chromosome exchanges with time after flight, although this trend was not seen for all astronauts and the yield of chromosome damage in some individuals actually increased with time after flight. The decreasing trend in total exchanges was slightly more significant when statistical analysis was restricted to data collected more than 220 days after return from flight. When analysis was restricted to data collected within 220 days of return from the mission there was no relationship between total exchanges and time. Translocation yields varied more between astronauts and there was only a slight non-significant decrease with time after flight that was similar for both later and earlier sampling times. Copyright (c) 2010. Published by Elsevier B.V.
Separate-channel analysis of two-channel microarrays: recovering inter-spot information.
Smyth, Gordon K; Altman, Naomi S
2013-05-26
Two-channel (or two-color) microarrays are cost-effective platforms for comparative analysis of gene expression. They are traditionally analysed in terms of the log-ratios (M-values) of the two channel intensities at each spot, but this analysis does not use all the information available in the separate channel observations. Mixed models have been proposed to analyse intensities from the two channels as separate observations, but such models can be complex to use and the gain in efficiency over the log-ratio analysis is difficult to quantify. Mixed models yield test statistics for which the null distributions can be specified only approximately, and some approaches do not borrow strength between genes. This article reformulates the mixed model to clarify the relationship with the traditional log-ratio analysis, to facilitate information borrowing between genes, and to obtain an exact distributional theory for the resulting test statistics. The mixed model is transformed to operate on the M-values and A-values (average log-expression for each spot) instead of on the log-expression values. The log-ratio analysis is shown to ignore information contained in the A-values. The relative efficiency of the log-ratio analysis is shown to depend on the size of the intra-spot correlation. A new separate-channel analysis method is proposed that assumes a constant intra-spot correlation coefficient across all genes. This approach permits the mixed model to be transformed into an ordinary linear model, allowing the data analysis to use a well-understood empirical Bayes analysis pipeline for linear modeling of microarray data. This yields statistically powerful test statistics that have an exact distributional theory. The log-ratio, mixed model and common-correlation methods are compared using three case studies. The results show that separate-channel analyses that borrow strength between genes are more powerful than log-ratio analyses, and the common-correlation analysis is the most powerful of all. The common-correlation method proposed in this article for separate-channel analysis of two-channel microarray data is no more difficult to apply in practice than the traditional log-ratio analysis, and it provides an intuitive and powerful means to conduct analyses and make comparisons that might otherwise not be possible.
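The key variance identities behind the argument can be stated compactly. Assuming channel log-intensities y_R, y_G with a common variance σ² and intra-spot correlation ρ (our notation, consistent with but not copied from the article):

```latex
\begin{aligned}
  M &= y_R - y_G, & A &= \tfrac{1}{2}\,(y_R + y_G),\\
  \operatorname{Var}(M) &= 2\sigma^2(1-\rho), &
  \operatorname{Var}(A) &= \tfrac{1}{2}\,\sigma^2(1+\rho), &
  \operatorname{Cov}(M,A) &= 0 .
\end{aligned}
```

Since Cov(M, A) = 0, the A-values carry information independent of the log-ratios, and their precision improves as ρ falls, which is why the relative efficiency of the log-ratio analysis depends on the intra-spot correlation.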
Optimization of the omega-3 extraction as a functional food from flaxseed.
Hassan-Zadeh, A; Sahari, M A; Barzegar, M
2008-09-01
The fatty acid content, total lipid, refractive index, and peroxide, iodine, acid and saponification values of Iranian linseed oil (Linum usitatissimum) were studied. For optimization of the extraction conditions, the oil was extracted by solvents (petroleum benzene and methanol-water-petroleum benzene) in 1:2, 1:3 and 1:4 ratios for 2, 5 and 8 h, and its fatty acid content, omega-3 content and extraction yield were determined. According to the statistical analysis, petroleum benzene in a ratio of 1:3 for 5 h was chosen for its higher fatty acid content, extraction yield, and economic feasibility. To preserve the omega-3 ingredients, oil with the specified characteristics, containing 46.8% omega-3, was kept under a nitrogen atmosphere at -30 degrees C for 0, 7, 30, 60 and 90 days, and its peroxide value was determined. Statistical analysis showed a significant difference in the average peroxide value only over the first 7 days of storage, and its increase (8.30%) conformed to the international standard.
Statistical methodology: II. Reliability and validity assessment in study design, Part B.
Karras, D J
1997-02-01
Validity measures the correspondence between a test and other purported measures of the same or similar qualities. When a reference standard exists, a criterion-based validity coefficient can be calculated. If no such standard is available, the concepts of content and construct validity may be used, but quantitative analysis may not be possible. The Pearson and Spearman tests of correlation are often used to assess the correspondence between tests, but they do not account for measurement biases and may yield misleading results. Techniques that measure intertest differences may be more meaningful in validity assessment, and the kappa statistic is useful for analyzing categorical variables. Questionnaires often can be designed to allow quantitative assessment of reliability and validity, although this may be difficult. Inclusion of homogeneous questions is necessary to assess reliability. Analysis is enhanced by using Likert scales or similar techniques that yield ordinal data. Validity assessment of questionnaires requires careful definition of the scope of the test and comparison with previously validated tools.
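Since the abstract singles out the kappa statistic for categorical variables, a one-line example may help; the ratings below are invented.

```python
from sklearn.metrics import cohen_kappa_score

# Two raters classifying the same 12 cases; kappa corrects the raw
# agreement rate (here 10/12) for agreement expected by chance.
rater_a = ["yes", "yes", "no", "no", "yes", "no",
           "yes", "no", "no", "yes", "no", "no"]
rater_b = ["yes", "no",  "no", "no", "yes", "no",
           "yes", "no", "yes", "yes", "no", "no"]
print(round(cohen_kappa_score(rater_a, rater_b), 2))   # -> about 0.66
```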
Nelms, David L.; Harlow, George E.; Hayes, Donald C.
1997-01-01
Growth within the Valley and Ridge, Blue Ridge, and Piedmont physiographic provinces of Virginia has focused concern on the allocation of surface-water flow and increased demands on ground-water resources. Potential surface-water yield was determined from statistical analysis of the base-flow characteristics of streams. Base-flow characteristics also may provide a relative indication of the potential ground-water yield for areas that lack sufficient specific-capacity or well-yield data; however, other factors need to be considered, such as geologic structure, lithology, precipitation, relief, and the degree of hydraulic interconnection between the regolith and bedrock.
Cryobiopsy: should this be used in place of endobronchial forceps biopsies?
Rubio, Edmundo R; le, Susanti R; Whatley, Ralph E; Boyd, Michael B
2013-01-01
Forceps biopsies of airway lesions have variable yields. The yield increases when techniques are combined in order to collect more material. With the use of cryotherapy probes (cryobiopsy), larger specimens can be obtained, resulting in an increase in diagnostic yield. However, the utility and safety of cryobiopsy for all types of lesions, including flat mucosal lesions, is not established. We demonstrate the utility and safety of cryobiopsy versus forceps biopsy for sampling exophytic and flat airway lesions in a teaching hospital-based retrospective analysis of patients undergoing cryobiopsies (singly or combined with forceps biopsies) from August 2008 through August 2010. Statistical analysis used the Wilcoxon signed-rank test. The comparative analysis of 22 patients with cryobiopsy and forceps biopsy of the same lesion showed that the mean volume of material obtained with cryobiopsy was significantly larger (0.696 cm³ versus 0.0373 cm³, P = 0.0014). Of 31 cryobiopsies performed, one had minor bleeding. Cryobiopsy allowed sampling of exophytic and flat lesions located centrally or distally. Cryobiopsies were shown to be safe, free of artifact, and provided a diagnostic yield of 96.77%. Cryobiopsy allows safe sampling of exophytic and flat airway lesions, with larger specimens, excellent tissue preservation and high diagnostic accuracy.
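The paired design here maps directly onto scipy's Wilcoxon signed-rank test; the sketch uses simulated volumes centered on the reported means rather than the study's raw data.

```python
import numpy as np
from scipy.stats import wilcoxon

# Paired specimen volumes (cm^3) for the same lesion sampled both ways;
# the numbers are illustrative draws, not the study's measurements.
rng = np.random.default_rng(5)
cryo    = rng.normal(0.70, 0.15, 22).clip(min=0.05)
forceps = rng.normal(0.04, 0.01, 22).clip(min=0.005)

stat, p = wilcoxon(cryo, forceps)    # paired, non-parametric comparison
print(f"W = {stat:.1f}, p = {p:.4f}")
```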
2008-07-07
analyzing multivariate data sets. The system was developed using the Java Development Kit (JDK) version 1.5, and it yields interactive performance on a… script and captures output from MATLAB's "regress" and "stepwisefit" utilities, which perform simple and stepwise regression, respectively.
Lonni, Audrey Alesandra Stinghen Garcia; Longhini, Renata; Lopes, Gisely Cristiny; de Mello, João Carlos Palazzo; Scarminio, Ieda Spacino
2012-03-16
Statistically designed mixtures of water, methanol, acetone and ethanol were used to extract material from Trichilia catigua (Meliaceae) barks to study the effects of the different solvents and their mixtures on yield, total polyphenol content and antioxidant activity. The experimental results and their response surface models showed that quaternary mixtures with approximately equal proportions of all four solvents provided the highest yields, total polyphenol contents and antioxidant activities of the crude extracts, followed by ternary design mixtures. Principal component and hierarchical clustering analysis of the HPLC-DAD spectra of the chromatographic peaks of 1:1:1:1 water-methanol-acetone-ethanol mixture extracts indicate the presence of cinchonains, gallic acid derivatives, natural polyphenols, flavonoids, catechins, and epicatechins. Copyright © 2011 Elsevier B.V. All rights reserved.
Bowling, Mark R; Kohan, Matthew W; Walker, Paul; Efird, Jimmy; Ben Or, Sharon
2015-01-01
Navigational bronchoscopy is utilized to guide biopsies of peripheral lung nodules and to place fiducial markers for the treatment of limited-stage lung cancer with stereotactic body radiotherapy. The type of sedation used for this procedure remains controversial. We performed a retrospective chart review to evaluate differences in diagnostic yield and overall procedural success by anesthesia type. Electromagnetic navigational bronchoscopy was performed using the superDimension software system. Once the targeted lesion was within reach, multiple tissue samples were obtained. Statistical analysis was used to correlate the yield with the type of sedation, among other factors. A procedure was defined as successful if a diagnosis was made or a fiducial marker was adequately placed. Navigational bronchoscopy was performed on a total of 120 targeted lesions. The overall complication rate was 4.1%. The diagnostic yield and success rate of the procedure were 74% and 87%, respectively. Duration of the procedure was the only significant difference between the general anesthesia and IV sedation groups (mean, 58 vs. 43 min, P=0.0005). A larger tumor size was associated with a higher diagnostic yield (P=0.032). No other variable had a statistically significant effect on diagnostic yield or procedural failure. Navigational bronchoscopy is a safe and effective pulmonary diagnostic tool with a relatively low complication rate, and neither its diagnostic yield nor its overall success appears to be affected by the type of sedation used.
Factors related to well yield in the fractured-bedrock aquifer of New Hampshire
Moore, Richard Bridge; Schwartz, Gregory E.; Clark, Stewart F.; Walsh, Gregory J.; Degnan, James R.
2002-01-01
The New Hampshire Bedrock Aquifer Assessment was designed to provide information that can be used by communities, industry, professional consultants, and other interests to evaluate the ground-water development potential of the fractured-bedrock aquifer in the State. The assessment was done at statewide, regional, and well-field scales to identify relations that potentially could increase the success in locating high-yield water supplies in the fractured-bedrock aquifer. Statewide, data were collected for well construction and yield information, bedrock lithology, surficial geology, lineaments, topography, and various derivatives of these basic data sets. Regionally, geologic, fracture, and lineament data were collected for the Pinardville and Windham quadrangles in New Hampshire. The regional scale of the study examined the degree to which predictive well-yield relations, developed as part of the statewide reconnaissance investigation, could be improved by use of quadrangle-scale geologic mapping. Beginning in 1984, water-well contractors in the State were required to report detailed information on newly constructed wells to the New Hampshire Department of Environmental Services (NHDES). The reports contain basic data on well construction, including six characteristics used in this study: well yield, well depth, well use, method of construction, date drilled, and depth to bedrock (or length of casing). The NHDES has determined accurate georeferenced locations for more than 20,000 wells reported since 1984. The availability of this large data set provided an opportunity for a statistical analysis of bedrock-well yields. Well yields in the database ranged from zero to greater than 500 gallons per minute (gal/min). Multivariate regression was used as the primary statistical method of analysis because it is the most efficient tool for predicting a single variable with many potentially independent variables. The dependent variable explored in this study was the natural logarithm (ln) of the reported well yield. One complication with using well yield as a dependent variable is that yield also is a function of demand. An innovative statistical technique involving the use of instrumental variables was implemented to compensate for the effect of demand on well yield. Results of the multivariate-regression model show that a variety of factors are either positively or negatively related to well yields. Using instrumental variables, well depth is positively related to total well yield. Other factors found to be positively related to well yield include (1) distance to the nearest waterbody; (2) size of the drainage area upgradient of a well; (3) well location in swales or valley bottoms in the Massabesic Gneiss Complex and Breakfast Hill Granite; (4) well proximity to lineaments, identified using high-altitude (1:80,000-scale) aerial photography, which are correlated with the primary fracture direction (regional analysis); (5) use of a cable tool rig for well drilling; and (6) wells drilled for commercial or public supply. Factors negatively related to well yields include sites underlain by foliated plutons, sites on steep slopes, sites at high elevations, and sites on hilltops. Additionally, seven detailed geologic map units, identified during the detailed geologic mapping of the Pinardville and Windham quadrangles, were found to be positively or negatively related to well yields.
Twenty-four geologic map units, depicted on the Bedrock Geologic Map of New Hampshire, also were found to be positively or negatively related to well yields. Maps or geographic information system (GIS) data sets identifying areas of various yield probabilities clearly display the model results. Probability criteria developed in this investigation can be used to select areas where other techniques, such as geophysical methods, can be applied to more closely identify potential drilling sites for high-yielding wells.
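The instrumental-variables step can be sketched with the linearmodels package: instrument well depth with a variable that shifts drilling depth but plausibly affects yield only through it. Everything below (the instrument choice, coefficients, and data) is a hypothetical analogue of the USGS setup, not their actual specification.

```python
import numpy as np
import pandas as pd
from linearmodels.iv import IV2SLS

# Hypothetical analogue: ln(yield) regressed on well depth, with depth
# instrumented because reported yield and drilled depth both respond to
# unobserved household demand.
rng = np.random.default_rng(6)
n = 5000
use_public = rng.integers(0, 2, n)             # instrument: intended well use
demand = rng.normal(size=n)                    # unobserved demand
depth = 100 + 40 * use_public + 10 * demand + rng.normal(0, 15, n)
ln_yield = 1.0 + 0.004 * depth - 0.05 * demand + rng.normal(0, 0.5, n)

df = pd.DataFrame({"ln_yield": ln_yield, "depth": depth, "use": use_public})
fit = IV2SLS.from_formula("ln_yield ~ 1 + [depth ~ use]", data=df).fit()
print(fit.params["depth"])   # IV estimate; naive OLS would be biased downward
```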
NASA Technical Reports Server (NTRS)
Jansen, Mark J.; Jones, William R., Jr.; Wheeler, Donald R.; Keller, Dennis J.
2000-01-01
Because CFC 113, an ozone-depleting chemical (ODC), can no longer be produced, alternative bearing cleaning methods must be studied. The objective of this work was to study the effect of the new cleaning methods on lubricant lifetime using a vacuum bearing simulator (spiral orbit rolling contact tribometer). Four alternative cleaning methods were studied: ultraviolet (UV) ozone, aqueous levigated alumina slurry (ALAS), supercritical fluid (SCF) CO2 and aqueous Brulin 815GD. Baseline tests were done using CFC 113. Test conditions were the following: a vacuum of at least 1.3 × 10^-6 Pa, 440C steel components, a rotational speed of 10 RPM, a lubricant charge of 60-75 micrograms, a perfluoropolyalkylether lubricant (Z-25), and a load of 200 N (44.6 lbs, a mean Hertzian stress of 1.5 GPa). Normalized lubricant lifetime was determined by dividing the total number of ball orbits by the amount of lubricant. The failure condition was a friction coefficient of 0.38. Post-test XPS analysis was also performed, showing slight variations in post-cleaning surface chemistry. Statistical analysis of the resulting data determined that the data sets were most directly comparable when subjected to a natural log transformation. The natural-log life (NL-Life) data for each cleaning method were reasonably normally distributed and yielded standard deviations that were not significantly different among the five cleaning methods investigated. This made comparison of their NL-Life means straightforward using a Bonferroni multiple comparison of means procedure. This procedure showed that the ALAS, UV-ozone and CFC 113 methods were not statistically significantly different from one another with respect to mean NL-Life. It also found that the SCF CO2 method yielded a significantly higher mean NL-Life than the ALAS, UV-ozone and CFC 113 methods, and that the aqueous Brulin 815GD method yielded a mean NL-Life statistically significantly higher than each of the other four methods. Baseline tests using CFC 113-cleaned parts yielded a mean NL-Life of 3.62 orbits/microgram. ALAS and UV-ozone yielded similar mean NL-Lives (3.31 and 3.33 orbits/microgram, respectively). SCF CO2 gave a mean NL-Life of 4.08 orbits/microgram, and the aqueous Brulin 815GD data yielded the longest mean NL-Life (4.66 orbits/microgram).
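A Bonferroni multiple comparison of log-transformed means can be sketched with pairwise t-tests at an adjusted significance level; the group means below echo the reported values, but the draws themselves are simulated, not flight data.

```python
from itertools import combinations

import numpy as np
from scipy.stats import ttest_ind

# Simulated NL-Life samples per cleaning method (orbits per microgram).
rng = np.random.default_rng(7)
groups = {
    "CFC113":  rng.normal(3.62, 0.4, 8),
    "ALAS":    rng.normal(3.31, 0.4, 8),
    "UVozone": rng.normal(3.33, 0.4, 8),
    "SCFCO2":  rng.normal(4.08, 0.4, 8),
    "Brulin":  rng.normal(4.66, 0.4, 8),
}
pairs = list(combinations(groups, 2))
alpha = 0.05 / len(pairs)          # Bonferroni-adjusted per-comparison level
for a, b in pairs:
    p = ttest_ind(groups[a], groups[b]).pvalue
    if p < alpha:
        print(f"{a} vs {b}: p = {p:.4f} (significant)")
```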
Statistical analysis of large simulated yield datasets for studying climate effects
USDA-ARS?s Scientific Manuscript database
Ensembles of process-based crop models are now commonly used to simulate crop growth and development for climate scenarios of temperature and/or precipitation changes corresponding to different projections of atmospheric CO2 concentrations. This approach generates large datasets with thousands of de...
Language Learning Strategy Use and Reading Achievement
ERIC Educational Resources Information Center
Ghafournia, Narjes
2014-01-01
The current study investigated the differences across the varying levels of EFL learners in the frequency and choice of learning strategies. Using a reading test, a questionnaire, and parametric statistical analysis, the findings yielded discrepancies among the participants in the implementation of language-learning strategies concerning their…
Bem, Daryl; Tressoldi, Patrizio; Rabeyron, Thomas; Duggan, Michael
2015-01-01
In 2011, one of the authors (DJB) published a report of nine experiments in the Journal of Personality and Social Psychology purporting to demonstrate that an individual's cognitive and affective responses can be influenced by randomly selected stimulus events that do not occur until after his or her responses have already been made and recorded, a generalized variant of the phenomenon traditionally denoted by the term precognition. To encourage replications, all materials needed to conduct them were made available on request. We here report a meta-analysis of 90 experiments from 33 laboratories in 14 countries which yielded an overall effect greater than 6 sigma, z = 6.40, p = 1.2 × 10^-10, with an effect size (Hedges' g) of 0.09. A Bayesian analysis yielded a Bayes Factor of 5.1 × 10^9, greatly exceeding the criterion value of 100 for "decisive evidence" in support of the experimental hypothesis. When DJB's original experiments are excluded, the combined effect size for replications by independent investigators is 0.06, z = 4.16, p = 1.1 × 10^-5, and the BF value is 3,853, again exceeding the criterion for "decisive evidence." The number of potentially unretrieved experiments required to reduce the overall effect size of the complete database to a trivial value of 0.01 is 544, and seven of eight additional statistical tests support the conclusion that the database is not significantly compromised by either selection bias or by intense "p-hacking", the selective suppression of findings or analyses that failed to yield statistical significance. P-curve analysis, a recently introduced statistical technique, estimates the true effect size of the experiments to be 0.20 for the complete database and 0.24 for the independent replications, virtually identical to the effect size of DJB's original experiments (0.22) and the closely related "presentiment" experiments (0.21). We discuss the controversial status of precognition and other anomalous effects collectively known as psi.
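The pooling step of such a meta-analysis can be sketched with a standard DerSimonian-Laird random-effects estimator; this is a generic implementation, not the authors' exact pipeline, and the per-study effects below are simulated.

```python
import numpy as np

def random_effects(g, v):
    """DerSimonian-Laird random-effects pooling of effect sizes `g`
    with within-study variances `v`."""
    g, v = np.asarray(g), np.asarray(v)
    w = 1 / v
    g_fixed = np.sum(w * g) / np.sum(w)
    q = np.sum(w * (g - g_fixed) ** 2)              # heterogeneity statistic
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (len(g) - 1)) / c)         # between-study variance
    w_star = 1 / (v + tau2)
    g_re = np.sum(w_star * g) / np.sum(w_star)
    se = np.sqrt(1 / np.sum(w_star))
    return g_re, g_re / se                          # pooled g and its z-score

g = np.random.default_rng(8).normal(0.09, 0.05, 90)   # toy per-study effects
v = np.full(90, 0.05**2)
print(random_effects(g, v))
```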
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ma, Xuedan; Diroll, Benjamin T.; Cho, Wooje
2017-08-08
Quasi-two-dimensional nanoplatelets (NPLs) possess fundamentally different excitonic properties from zero-dimensional quantum dots. We study lateral size-dependent photon emission statistics and carrier dynamics of individual NPLs using second-order photon correlation (g(2)(τ)) spectroscopy and photoluminescence (PL) intensity-dependent lifetime analysis. Room-temperature radiative lifetimes of NPLs can be derived from maximum PL intensity periods in PL time traces; the lifetime first decreases with NPL lateral size and then stays constant, deviating from the electric dipole approximation. Analysis of the PL time traces further reveals that the single exciton quantum yield in NPLs decreases with NPL lateral size and increases with protecting shell thickness, indicating the importance of surface passivation on NPL emission quality. Second-order photon correlation (g(2)(τ)) studies of single NPLs show that the biexciton quantum yield is strongly dependent on the lateral size and single exciton quantum yield of the NPLs. In large NPLs with unity single exciton quantum yield, the corresponding biexciton quantum yield can reach unity. In conclusion, these findings reveal that by careful growth control and core-shell material engineering, NPLs can be of great potential for light amplification and integrated quantum photonic applications.
Simulating and Predicting Cereal Crop Yields in Ethiopia: Model Calibration and Verification
NASA Astrophysics Data System (ADS)
Yang, M.; Wang, G.; Ahmed, K. F.; Eggen, M.; Adugna, B.; Anagnostou, E. N.
2017-12-01
Agriculture in developing countries is extremely vulnerable to climate variability and change. In East Africa, most people live in rural areas with outdated agricultural techniques and infrastructure. Smallholder agriculture continues to play a key role in this area, and the rate of irrigation is among the lowest in the world. As a result, seasonal and inter-annual weather patterns play an important role in the spatiotemporal variability of crop yields. This study investigates how various climate variables (e.g., temperature, precipitation, sunshine) and agricultural practices (e.g., fertilization, irrigation, planting date) influence cereal crop yields using a process-based model (DSSAT) and statistical analysis, focusing on the Blue Nile Basin of Ethiopia. The DSSAT model is driven with meteorological forcing from the ECMWF's latest reanalysis product covering the past 35 years; the statistical model will be developed by linking the same meteorological reanalysis data with harvest data at the woreda level from the Ethiopian national dataset. Results from this study will set the stage for the development of a seasonal prediction system for weather and crop yields in Ethiopia, which will serve multiple sectors in coping with the agricultural impact of climate variability.
Salutogenic factors for mental health promotion in work settings and organizations.
Graeser, Silke
2011-12-01
Accompanied by an increasing awareness among companies and organizations of mental health conditions in work settings, the salutogenic perspective provides a promising approach to identifying supportive factors and resources of organizations to promote mental health. Based on the sense of coherence (SOC), usually treated as an individual personality-trait concept, an organization-based SOC scale was developed to identify potential salutogenic factors of a university as an organization and work place. Based on the results of two samples of employees (n = 362, n = 204), factors associated with the organization-based SOC were evaluated. Statistical analysis yielded significant correlations between mental health and the setting-based SOC, as well as between mental health and the three factors of the SOC yielded by factor analysis: comprehensibility, manageability and meaningfulness. Significant results of bivariate and multivariate analyses emphasize the importance, for an organization-based SOC, of aspects such as participation and comprehensibility at the organizational level, social cohesion and social climate at the social level, and recognition at the individual level. Potential approaches for the further development of interventions for work-place health promotion based on salutogenic factors and resources on the individual, social and organizational levels are elaborated, and the transcultural dimensions of these factors are discussed.
Waites, Anthony B; Mannfolk, Peter; Shaw, Marnie E; Olsrud, Johan; Jackson, Graeme D
2007-02-01
Clinical functional magnetic resonance imaging (fMRI) occasionally fails to detect significant activation, often due to variability in task performance. The present study seeks to test whether a more flexible statistical analysis can better detect activation, by accounting for variance associated with variable compliance to the task over time. Experimental results and simulated data both confirm that even at 80% compliance to the task, such a flexible model outperforms standard statistical analysis when assessed using the extent of activation (experimental data), goodness of fit (experimental data), and area under the operator characteristic curve (simulated data). Furthermore, retrospective examination of 14 clinical fMRI examinations reveals that in patients where the standard statistical approach yields activation, there is a measurable gain in model performance in adopting the flexible statistical model, with little or no penalty in lost sensitivity. This indicates that a flexible model should be considered, particularly for clinical patients who may have difficulty complying fully with the study task.
Circulation Clusters--An Empirical Approach to Decentralization of Academic Libraries.
ERIC Educational Resources Information Center
McGrath, William E.
1986-01-01
Discusses the issue of centralization or decentralization of academic library collections, and describes a statistical analysis of book circulation at the University of Southwestern Louisiana that yielded subject area clusters as a compromise solution to the problem. Applications of the cluster model for all types of library catalogs are…
The open-source movement: an introduction for forestry professionals
Patrick Proctor; Paul C. Van Deusen; Linda S. Heath; Jeffrey H. Gove
2005-01-01
In recent years, the open-source movement has yielded a generous and powerful suite of software and utilities that rivals those developed by many commercial software companies. Open-source programs are available for many scientific needs: operating systems, databases, statistical analysis, Geographic Information System applications, and object-oriented programming....
Pulmonary Infiltrates in Immunosuppressed Patients: Analysis of a Diagnostic Protocol
Danés, Cristina; González-Martín, Julián; Pumarola, Tomàs; Rañó, Ana; Benito, Natividad; Torres, Antoni; Moreno, Asunción; Rovira, Montserrat; Puig de la Bellacasa, Jorge
2002-01-01
A diagnostic protocol was started to study the etiology of pulmonary infiltrates in immunosuppressed patients. The diagnostic yields of the different techniques were analyzed, with special emphasis on the importance of the sample quality and the role of rapid techniques in the diagnostic strategy. In total, 241 patients with newly developed pulmonary infiltrates within a period of 19 months were included. Noninvasive or invasive evaluation was performed according to the characteristics of the infiltrates. Diagnosis was achieved in 202 patients (84%); 173 patients (72%) had pneumonia, and specific etiologic agents were found in 114 (66%). Bronchoaspirate and bronchoalveolar lavage showed the highest yields, either on global analysis (23 of 35 specimens [66%] and 70 of 134 specimens [52%], respectively) or on analysis of each type of pneumonia. A tendency toward better results with optimal-quality samples was observed, and a statistically significant difference was found in sputum bacterial culture. Rapid diagnostic tests yielded results in 71 of 114 (62.2%) diagnoses of etiological pneumonia. PMID:12037077
Statistical emulators of maize, rice, soybean and wheat yields from global gridded crop models
Blanc, Élodie
2017-01-26
This study provides statistical emulators of crop yields based on global gridded crop model simulations from the Inter-Sectoral Impact Model Intercomparison Project Fast Track project. The ensemble of simulations is used to build a panel of annual crop yields from five crop models and corresponding monthly summer weather variables for over a century at the grid-cell level globally. This dataset is then used to estimate, for each crop and gridded crop model, the statistical relationship between yields, temperature, precipitation and carbon dioxide. The study considers a new functional form to better capture the non-linear response of yields to weather, especially for extreme temperature and precipitation events, and accounts for the effect of soil type. In- and out-of-sample validations show that the statistical emulators are able to replicate the spatial patterns of crop yield levels and their changes over time projected by crop models reasonably well, although the accuracy of the emulators varies by model and by region. This study therefore provides a reliable and accessible alternative to global gridded crop yield models. By emulating crop yields for several models using parsimonious equations, these tools provide a computationally efficient method to account for uncertainty in climate change impact assessments.
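A statistical emulator of this kind is, at heart, a regression of simulated yields on weather and CO2 with a flexible functional form. The sketch below fits a quadratic response surface with a temperature-precipitation interaction to synthetic "crop model" output; the functional form and coefficients are illustrative, not the paper's.

```python
import numpy as np
import statsmodels.api as sm

# Synthetic grid-cell panel: summer temperature T (deg C), precipitation P (mm),
# and CO2 concentration (ppm), with invented response coefficients.
rng = np.random.default_rng(9)
n = 2000
T, P, co2 = rng.normal(22, 3, n), rng.gamma(5, 40, n), rng.uniform(350, 700, n)
y = (8 - 0.05 * (T - 22)**2 + 0.004 * P - 5e-6 * P**2 + 0.002 * co2
     - 0.0001 * T * P + rng.normal(0, 0.4, n))

X = sm.add_constant(np.column_stack([T, T**2, P, P**2, co2, T * P]))
emu = sm.OLS(y, X).fit()
print(emu.rsquared)      # out-of-sample validation would follow in practice
```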
NASA Astrophysics Data System (ADS)
Liang, Jing; Yu, Jian-xing; Yu, Yang; Lam, W.; Zhao, Yi-yu; Duan, Jing-hui
2016-06-01
The energy transfer ratio is the basic factor affecting the level of pipe damage during an impact between a dropped object and a submarine pipe. To study the energy transfer and damage mechanism of submarine pipes impacted by dropped objects, a series of experiments was designed and carried out. The effective yield strength is deduced to make the quasi-static analysis more reliable, and the normal distribution of the energy transfer ratio caused by lateral impacts on pipes is established by statistical analysis of the experimental results based on the effective yield strength, providing an experimental and theoretical basis for the risk analysis of submarine pipe systems impacted by dropped objects. Failure strains of the pipe material are confirmed by comparing experimental results with finite element simulation. In addition, impact contact area and impact time are shown by a sensitivity analysis of the finite element simulation to be the major factors influencing energy transfer.
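Given a fitted normal distribution for the energy transfer ratio, the quantity a risk analysis typically needs is a tail probability; a minimal sketch with invented measurements:

```python
import numpy as np
from scipy import stats

# Energy transfer ratios measured across drop tests (illustrative values).
ratios = np.array([0.42, 0.51, 0.47, 0.55, 0.44, 0.49, 0.53, 0.46, 0.50, 0.48])
mu, sigma = stats.norm.fit(ratios)

# Probability that a dropped object transfers more than 60% of its energy,
# the kind of exceedance probability a pipeline risk analysis would use.
print(1 - stats.norm.cdf(0.60, mu, sigma))
```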
Piepho, H P
1994-11-01
Multilocation trials are often used to analyse the adaptability of genotypes in different environments and to find, for each environment, the genotype that is best adapted, i.e., highest yielding in that environment. For this purpose, it is of interest to obtain a reliable estimate of the mean yield of a cultivar in a given environment. This article compares two different statistical estimation procedures for this task: Additive Main Effects and Multiplicative Interaction (AMMI) analysis and Best Linear Unbiased Prediction (BLUP). A modification of a cross-validation procedure commonly used with AMMI is suggested for trials that are laid out as a randomized complete block design. The use of these procedures is exemplified with five faba bean datasets from German registration trials. BLUP was found to outperform AMMI in four of the five datasets.
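The AMMI decomposition itself is compact enough to sketch: remove the additive genotype and environment main effects, then approximate the interaction table with the leading terms of its singular value decomposition. The yield matrix below is simulated, and the choice k = 2 is illustrative.

```python
import numpy as np

def ammi(y: np.ndarray, k: int = 2) -> np.ndarray:
    """AMMI-k cell estimates for a genotype x environment yield table."""
    grand = y.mean()
    g = y.mean(axis=1, keepdims=True) - grand      # genotype main effects
    e = y.mean(axis=0, keepdims=True) - grand      # environment main effects
    resid = y - grand - g - e                      # interaction table
    u, s, vt = np.linalg.svd(resid, full_matrices=False)
    interaction = (u[:, :k] * s[:k]) @ vt[:k, :]   # leading multiplicative terms
    return grand + g + e + interaction

rng = np.random.default_rng(10)
yields = rng.normal(4.0, 0.5, size=(12, 6))        # 12 genotypes x 6 locations
print(ammi(yields, k=2).shape)
```

In practice k is chosen by the kind of cross-validation the article discusses, leaving out observations and comparing the prediction error of AMMI-k against BLUP.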
Robust LOD scores for variance component-based linkage analysis.
Blangero, J; Williams, J T; Almasy, L
2000-01-01
The variance component method is now widely used for linkage analysis of quantitative traits. Although this approach offers many advantages, the importance of the underlying assumption of multivariate normality of the trait distribution within pedigrees has not been studied extensively. Simulation studies have shown that traits with leptokurtic distributions yield linkage test statistics that exhibit excessive Type I error when analyzed naively. We derive analytical formulae relating the deviation from the expected asymptotic distribution of the lod score to the kurtosis and total heritability of the quantitative trait. A simple correction constant yields a robust lod score for any deviation from normality and for any pedigree structure, and effectively eliminates the problem of inflated Type I error due to misspecification of the underlying probability model in variance component-based linkage analysis.
Schäffer, Beat; Pieren, Reto; Mendolia, Franco; Basner, Mathias; Brink, Mark
2017-05-01
Noise exposure-response relationships are used to estimate the effects of noise on individuals or a population. Such relationships may be derived from independent or repeated binary observations, and modeled by different statistical methods. Depending on the method by which they were established, their application in population risk assessment or in the estimation of individual responses may yield different results, i.e., predict "weaker" or "stronger" effects. In the present body of literature on noise effect studies, however, the underlying statistical methodology used to establish exposure-response relationships has not always received sufficient attention. This paper gives an overview of two statistical approaches (subject-specific and population-averaged logistic regression analysis) for establishing noise exposure-response relationships from repeated binary observations, and of their appropriate applications. The considerations are illustrated with data from three noise effect studies, which also allows the magnitude of the differences in results to be estimated when exposure-response relationships derived from the two statistical approaches are applied. Depending on the underlying data set and the probability range of the binary variable it covers, the two approaches yield similar to very different results. The adequate choice of a specific statistical approach and its application in subsequent studies, both depending on the research question, are therefore crucial.
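The population-averaged approach can be illustrated with a GEE logistic fit in statsmodels; a subject-specific analogue would replace it with a random-intercept logit. The exposure levels, subject heterogeneity and coefficients below are simulated to show the typical attenuation of the population-averaged slope.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Repeated binary annoyance ratings per subject across noise levels.
rng = np.random.default_rng(11)
n_subj, n_rep = 80, 10
subj = np.repeat(np.arange(n_subj), n_rep)
level = rng.uniform(40, 80, n_subj * n_rep)            # noise exposure, dB
u = np.repeat(rng.normal(0, 1.5, n_subj), n_rep)       # subject heterogeneity
p = 1 / (1 + np.exp(-(-12 + 0.18 * level + u)))        # subject-specific model
df = pd.DataFrame({"annoyed": rng.binomial(1, p), "level": level, "subj": subj})

gee = smf.gee("annoyed ~ level", groups="subj", data=df,
              family=sm.families.Binomial(),
              cov_struct=sm.cov_struct.Exchangeable()).fit()
print(gee.params["level"])   # attenuated relative to the true 0.18 slope
```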
Effect of warming temperatures on US wheat yields.
Tack, Jesse; Barkley, Andrew; Nalley, Lawton Lanier
2015-06-02
Climate change is expected to increase future temperatures, potentially resulting in reduced crop production in many key production regions. Research quantifying the complex relationship between weather variables and wheat yields is rapidly growing, and recent advances have used a variety of model specifications that differ in how temperature data are included in the statistical yield equation. A unique data set that combines Kansas wheat variety field trial outcomes for 1985-2013 with location-specific weather data is used to analyze the effect of weather on wheat yield using regression analysis. Our results indicate that the effect of temperature exposure varies across the September-May growing season. The largest drivers of yield loss are freezing temperatures in the Fall and extreme heat events in the Spring. We also find that the overall effect of warming on yields is negative, even after accounting for the benefits of reduced exposure to freezing temperatures. Our analysis indicates that there exists a tradeoff between average (mean) yield and ability to resist extreme heat across varieties. More-recently released varieties are less able to resist heat than older lines. Our results also indicate that warming effects would be partially offset by increased rainfall in the Spring. Finally, we find that the method used to construct measures of temperature exposure matters for both the predictive performance of the regression model and the forecasted warming impacts on yields.
Differential Impacts of Climate Change on Crops and Agricultural Regions in India
NASA Astrophysics Data System (ADS)
Sharma, A. N.
2015-12-01
As India's farmers and policymakers consider potential adaptation strategies to climate change, some questions loom large: Which climate variables best explain the variability of crop yields? How does the vulnerability of crop yields to climate vary regionally? How are these risks likely to change in the future? While process-based crop modelling has started to answer many of these questions, we believe statistical approaches can complement it in improving our understanding of climate vulnerabilities and appropriate responses. We use yield data collected over three decades for more than ten food crops grown in India, along with a variety of statistical approaches, to answer the above questions. The ability of climate variables to explain yield variation varies greatly by crop and season, which is expected. Equally important, the ability of models to predict crop yields, as well as their coefficients, varies greatly by district, even for districts which are relatively close to each other and similar in their agricultural practices. We believe these results encourage caution and nuance when making projections about climate impacts on crop yields in the future. Most studies of climate impacts on crop yields focus on a handful of major food crops. By extending our analysis to all the crops with long-term district-level data in India, as well as two growing seasons, we gain a more comprehensive picture. Our results indicate that there is a great deal of variability even at relatively small scales, and that this must be taken into account if projections are to be made useful to policymakers.
Statistical crystallography of surface micelle spacing
NASA Technical Reports Server (NTRS)
Noever, David A.
1992-01-01
The aggregation of the recently reported surface micelles of block polyelectrolytes is analyzed using techniques of statistical crystallography. A polygonal lattice (Voronoi mosaic) connects center-to-center points, yielding statistical agreement with crystallographic predictions; Aboav-Weaire's law and Lewis's law are verified. This protocol supplements the standard analysis of surface micelles leading to aggregation number determination and, when compared to numerical simulations, allows further insight into the random partitioning of surface films. In particular, agreement with Lewis's law has been linked to the geometric packing requirements of filling two-dimensional space which compete with (or balance) physical forces such as interfacial tension, electrostatic repulsion, and van der Waals attraction.
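A minimal reproduction of the protocol's core steps, assuming random centers in place of real micelle coordinates: build the Voronoi mosaic with SciPy, then regress cell area on neighbor number to test the linear form of Lewis's law.

```python
# Sketch: statistical crystallography on (random, illustrative) center points.
# Build a Voronoi mosaic and test Lewis's law: cell area grows roughly
# linearly with the number of sides n.
import numpy as np
from scipy.spatial import Voronoi

def polygon_area(pts):
    """Shoelace area of a convex cell, after ordering vertices by angle."""
    c = pts.mean(axis=0)
    order = np.argsort(np.arctan2(pts[:, 1] - c[1], pts[:, 0] - c[0]))
    x, y = pts[order, 0], pts[order, 1]
    return 0.5 * abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1)))

rng = np.random.default_rng(1)
points = rng.uniform(0, 1, (800, 2))
vor = Voronoi(points)

n_sides, areas = [], []
for region_idx in vor.point_region:
    region = vor.regions[region_idx]
    if -1 in region or len(region) == 0:        # skip unbounded border cells
        continue
    verts = vor.vertices[region]
    if (verts < 0).any() or (verts > 1).any():  # stay away from the hull
        continue
    n_sides.append(len(region))
    areas.append(polygon_area(verts))

slope, intercept = np.polyfit(n_sides, areas, 1)
print(f"Lewis's law fit: A(n) = {intercept:.2e} + {slope:.2e} * n")
```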
NASA Astrophysics Data System (ADS)
Fragkaki, A. G.; Angelis, Y. S.; Tsantili-Kakoulidou, A.; Koupparis, M.; Georgakopoulos, C.
2009-08-01
Anabolic androgenic steroids (AAS) are included in the List of prohibited substances of the World Anti-Doping Agency (WADA) as substances abused to enhance athletic performance. Gas chromatography coupled to mass spectrometry (GC-MS) plays an important role in doping control analyses, identifying AAS as their enolized-trimethylsilyl (TMS) derivatives in the electron ionization (EI) mode. This paper explores the suitability of complementary GC-MS mass spectra and statistical analysis (principal component analysis, PCA, and partial least squares-discriminant analysis, PLS-DA) to differentiate AAS as a function of their structural and conformational features expressed by their fragment ions. The results obtained showed that the application of PCA yielded a classification among the AAS molecules which became more apparent after applying PLS-DA to the dataset. The application of PLS-DA yielded a clear separation among the AAS molecules, which were thus classified as: 1-ene-3-keto, 3-hydroxyl with saturated A-ring, 1-ene-3-hydroxyl, 4-ene-3-keto, 1,4-diene-3-keto, and 3-keto with saturated A-ring anabolic steroids. This paper also presents structurally diagnostic fragment ions and dissociation routes providing evidence for the presence of unknown AAS or chemically modified molecules known as designer steroids.
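The chemometric pipeline is standard enough to sketch: PCA for an unsupervised overview, then PLS-DA implemented as a PLS regression onto one-hot class labels, as is conventional with scikit-learn. The spectra and class names below are synthetic placeholders, not WADA data.

```python
# Sketch: PCA followed by PLS-DA on (synthetic) fragment-ion intensities.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
n_per_class, n_ions = 20, 50
classes = ["4-ene-3-keto", "1,4-diene-3-keto", "3-keto-saturated-A"]
X = np.vstack([rng.normal(loc=i, scale=1.0, size=(n_per_class, n_ions))
               for i in range(len(classes))])
y = np.repeat(np.arange(len(classes)), n_per_class)

Xs = StandardScaler().fit_transform(X)
scores = PCA(n_components=2).fit_transform(Xs)      # unsupervised overview

# PLS-DA: regress one-hot class membership on the spectra
Y = np.eye(len(classes))[y]
pls = PLSRegression(n_components=2).fit(Xs, Y)
pred = pls.predict(Xs).argmax(axis=1)
print("PLS-DA training accuracy:", (pred == y).mean())
```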
NASA Astrophysics Data System (ADS)
Moore, Frances C.; Baldos, Uris Lantz C.; Hertel, Thomas
2017-06-01
A large number of studies have been published examining the implications of climate change for agricultural productivity that, broadly speaking, can be divided into process-based modeling and statistical approaches. Despite a general perception that results from these methods differ substantially, there have been few direct comparisons. Here we use a database of yield impact studies compiled for the IPCC Fifth Assessment Report (Porter et al. 2014) to systematically compare results from process-based and empirical studies. Controlling for differences in representation of CO2 fertilization between the two methods, we find little evidence for differences in the yield response to warming. The magnitude of CO2 fertilization is instead a much larger source of uncertainty. Based on this set of impact results, we find a very limited potential for on-farm adaptation to reduce yield impacts. We use the Global Trade Analysis Project (GTAP) global economic model to estimate welfare consequences of yield changes and find negligible welfare changes for warming of 1 °C-2 °C if CO2 fertilization is included and large negative effects on welfare without CO2. Uncertainty bounds on welfare changes are highly asymmetric, showing substantial probability of large declines in welfare for warming of 2 °C-3 °C even including the CO2 fertilization effect.
Cryobiopsy: Should This Be Used in Place of Endobronchial Forceps Biopsies?
Rubio, Edmundo R.; le, Susanti R.; Whatley, Ralph E.; Boyd, Michael B.
2013-01-01
Forceps biopsies of airway lesions have variable yields. The yield increases when combining techniques in order to collect more material. With the use of cryotherapy probes (cryobiopsy), larger specimens can be obtained, resulting in an increase in the diagnostic yield. However, the utility and safety of cryobiopsy with all types of lesions, including flat mucosal lesions, is not established. Aims. Demonstrate the utility/safety of cryobiopsy versus forceps biopsy to sample exophytic and flat airway lesions. Settings and Design. Teaching hospital-based retrospective analysis. Methods. Retrospective analysis of patients undergoing cryobiopsies (singly or combined with forceps biopsies) from August 2008 through August 2010. Statistical Analysis. Wilcoxon signed-rank test. Results. The comparative analysis of 22 patients with cryobiopsy and forceps biopsy of the same lesion showed the mean volumes of material obtained with cryobiopsy were significantly larger (0.696 cm³ versus 0.0373 cm³, P = 0.0014). Of 31 cryobiopsies performed, one had minor bleeding. Cryobiopsy allowed sampling of exophytic and flat lesions that were located centrally or distally. Cryobiopsies were shown to be safe, free of artifact, and provided a diagnostic yield of 96.77%. Conclusions. Cryobiopsy allows safe sampling of exophytic and flat airway lesions, with larger specimens, excellent tissue preservation and high diagnostic accuracy. PMID:24066296
Jaime-Pérez, José Carlos; Jiménez-Castillo, Raúl Alberto; Vázquez-Hernández, Karina Elizabeth; Salazar-Riojas, Rosario; Méndez-Ramírez, Nereida; Gómez-Almaguer, David
2017-10-01
Advances in automated cell separators have improved the efficiency of plateletpheresis and the possibility of obtaining double products (DP). We assessed cell processor accuracy of predicted platelet (PLT) yields with the goal of a better prediction of DP collections. This retrospective proof-of-concept study included 302 plateletpheresis procedures performed on a Trima Accel v6.0 at the apheresis unit of a hematology department. Donor variables, software-predicted yield and actual PLT yield were statistically evaluated. Software prediction was optimized by linear regression analysis and its optimal cut-off to obtain a DP assessed by receiver operating characteristic (ROC) curve modeling. Three hundred and two plateletpheresis procedures were performed; on 271 (89.7%) occasions donors were men and on 31 (10.3%) women. Pre-donation PLT count had the best direct correlation with actual PLT yield (r = 0.486, P < .001). Means of software machine-derived values differed significantly from actual PLT yield, 4.72 × 10¹¹ vs. 6.12 × 10¹¹, respectively (P < .001). The following equation was developed to adjust these values: actual PLT yield = 0.221 + (1.254 × theoretical platelet yield). The ROC curve model showed an optimal apheresis device software prediction cut-off of 4.65 × 10¹¹ to obtain a DP, with a sensitivity of 82.2%, specificity of 93.3%, and an area under the curve (AUC) of 0.909. Trima Accel v6.0 software consistently underestimated PLT yields. A simple correction derived from linear regression analysis accurately corrected this underestimation, and ROC analysis identified a precise cut-off to reliably predict a DP. © 2016 Wiley Periodicals, Inc.
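A small sketch of the two statistical steps, using simulated yields rather than the donor data: refit the linear correction, then pick the double-product cutoff from the ROC curve via the Youden index (the published equation is used here as the data-generating rule, purely for illustration).

```python
# Sketch: recalibrating a device-predicted platelet yield and choosing a
# double-product cutoff by ROC analysis. Numbers are simulated.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import roc_curve, auc

rng = np.random.default_rng(3)
predicted = rng.normal(4.7, 0.8, 300)                    # x10^11 PLT, device output
actual = 0.221 + 1.254 * predicted + rng.normal(0, 0.4, 300)

lr = LinearRegression().fit(predicted.reshape(-1, 1), actual)
print(f"correction: actual = {lr.intercept_:.3f} + {lr.coef_[0]:.3f} * predicted")

is_dp = (actual >= 6.0).astype(int)                      # double product if >= 6x10^11
fpr, tpr, thresh = roc_curve(is_dp, predicted)
youden = tpr - fpr                                       # Youden J statistic
best = thresh[youden.argmax()]
print(f"optimal predicted-yield cutoff: {best:.2f} x10^11, AUC = {auc(fpr, tpr):.3f}")
```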
Statistical analysis and interpolation of compositional data in materials science.
Pesenson, Misha Z; Suram, Santosh K; Gregoire, John M
2015-02-09
Compositional data are ubiquitous in chemistry and materials science: analysis of elements in multicomponent systems, combinatorial problems, etc., lead to data that are non-negative and sum to a constant (for example, atomic concentrations). The constant sum constraint restricts the sampling space to a simplex instead of the usual Euclidean space. Since statistical measures such as mean and standard deviation are defined for the Euclidean space, traditional correlation studies, multivariate analysis, and hypothesis testing may lead to erroneous dependencies and incorrect inferences when applied to compositional data. Furthermore, composition measurements that are used for data analytics may not include all of the elements contained in the material; that is, the measurements may be subcompositions of a higher-dimensional parent composition. Physically meaningful statistical analysis must yield results that are invariant under the number of composition elements, requiring the application of specialized statistical tools. We present specifics and subtleties of compositional data processing through discussion of illustrative examples. We introduce basic concepts, terminology, and methods required for the analysis of compositional data and utilize them for the spatial interpolation of composition in a sputtered thin film. The results demonstrate the importance of this mathematical framework for compositional data analysis (CDA) in the fields of materials science and chemistry.
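The standard entry point to CDA is a log-ratio transform that moves compositions off the simplex so that Euclidean statistics apply. A minimal sketch of the centered log-ratio (CLR) version, with invented compositions:

```python
# Sketch: centered log-ratio (CLR) transform for compositional data.
import numpy as np

def clr(compositions):
    """Centered log-ratio transform; rows must be positive and sum to a constant."""
    logx = np.log(compositions)
    return logx - logx.mean(axis=1, keepdims=True)

# three measured atomic compositions (fractions summing to 1), illustrative
comp = np.array([[0.60, 0.30, 0.10],
                 [0.55, 0.35, 0.10],
                 [0.50, 0.30, 0.20]])
z = clr(comp)
print("CLR coordinates:\n", z.round(3))
# correlations, means, and interpolation are now computed on z,
# not on the raw constrained fractions
```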
NASA Astrophysics Data System (ADS)
Yan, Maoling; Liu, Pingzeng; Zhang, Chao; Zheng, Yong; Wang, Xizhi; Zhang, Yan; Chen, Weijie; Zhao, Rui
2018-01-01
Agroclimatological resources provide material and energy for agricultural production. This study quantitatively analyzes the impact of changes in selected climate factors on wheat yield over different growth periods by comparing two time divisions of the wheat growth cycle: monthly empirical-statistical multiple regression models (from October to June of the following year) and growth-stage empirical-statistical multiple regression models (covering the sowing, seedling, tillering, overwintering, regreening, jointing, heading, and maturity stages). The models relate agrometeorological data and growth-stage records to winter wheat production in Yanzhou, Shandong Province, China. Correlation analysis (CA) was carried out over 35 years (1981 to 2015) between crop yield and corresponding weather parameters, including daily mean temperature, sunshine duration, and average daily precipitation, selected from 18 meteorological factors. The results show that the greatest impact on winter wheat yield in this area is precipitation during the overwintering period: each 1 mm increase in daily mean rainfall was associated with a 201.64 kg/hm² reduction in output. Moreover, temperature and sunshine duration in the heading and maturity stages also exert significant influence on output: every 1 °C increase in daily mean temperature was associated with a 199.85 kg/hm² increase in output, and every 1 h increase in mean sunshine duration was associated with a 130.68 kg/hm² reduction in output. Compared with the experiment using months as step sizes, the experiment using growth stages as step sizes was in better agreement with the fluctuation in meteorological yield and offered a better explanation of the growth mechanism of wheat. Overall, the results indicate that the three factors affect yield to different extents during different growth periods of wheat and provide more specific guidance for agricultural production management in this area.
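A toy version of the growth-stage regression, fitting yield on stage-specific weather covariates with statsmodels; the eight observations are constructed from the abstract's reported coefficients (not the Yanzhou records), so the fitted signs are known in advance.

```python
# Sketch: stage-specific empirical-statistical regression of yield on weather.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "rain_overwinter":   [1.2, 0.8, 2.0, 0.6, 1.1, 0.5, 2.3, 0.7],         # mm/day
    "temp_heading":      [18.5, 19.2, 17.8, 19.6, 18.9, 20.1, 17.5, 19.4],  # degC
    "sunshine_maturity": [7.2, 6.8, 7.9, 6.5, 7.1, 6.3, 8.1, 6.6],          # h/day
})
# yield built with the reported effect directions: rain negative,
# temperature positive, sunshine negative
df["yield_kg_hm2"] = (4000 - 201.64 * df.rain_overwinter
                      + 199.85 * df.temp_heading
                      - 130.68 * df.sunshine_maturity)

fit = smf.ols("yield_kg_hm2 ~ rain_overwinter + temp_heading + sunshine_maturity",
              data=df).fit()
print(fit.params.round(2))
```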
NASA Astrophysics Data System (ADS)
Jeffries, G. R.; Cohn, A.
2016-12-01
Soy-corn double cropping (DC) has been widely adopted in Central Brazil alongside single cropped (SC) soybean production. DC involves different cropping calendars and soy varieties, and may be associated with different crop yield patterns and volatility than SC. Study of the performance of the region's agriculture in a changing climate depends on tracking differences in the productivity of SC vs. DC, but has been limited by crop yield data that conflate the two systems. We predicted SC and DC yields across Central Brazil, drawing on field observations and remotely sensed data. We first modeled field yield estimates as a function of remotely sensed DC status, vegetation index (VI) metrics, and other management and biophysical factors. We then used the estimated statistical model to predict SC and DC soybean yields at each 500 m grid cell of Central Brazil for harvest years 2001-2015. The yield estimation model was constructed using 1) a repeated cross-sectional survey of soybean yields and management factors for years 2007-2015, 2) a custom agricultural land cover classification dataset which assimilates earlier datasets for the region, and 3) 500 m 8-day MODIS image composites used to calculate the wide dynamic range vegetation index (WDRVI) and derivative metrics such as area under the curve for WDRVI values in critical crop development periods. A statistical yield estimation model which primarily entails WDRVI metrics, DC status, and spatial fixed effects was developed on a subset of the yield dataset. Model validation was conducted by predicting previously withheld yield records, and then assessing error and goodness-of-fit for predicted values with metrics including root mean squared error (RMSE), mean squared error (MSE), and R². We found a statistical yield estimation model which incorporates WDRVI and DC status to be an effective way to estimate crop yields over the region. Statistical properties of the resulting gridded yield dataset may be valuable for understanding linkages between crop yields, farm management factors, and climate.
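The vegetation index at the core of the model is easy to state: WDRVI = (a·NIR − red)/(a·NIR + red), with a weighting coefficient a of roughly 0.1-0.2 (Gitelson's formulation). The sketch below computes it from an invented reflectance time series and derives an area-under-the-curve metric of the kind the authors describe.

```python
# Sketch: WDRVI and a season-summary (area-under-curve) metric.
# The reflectance series is invented for illustration.
import numpy as np

def wdrvi(nir, red, a=0.1):
    return (a * nir - red) / (a * nir + red)

doy = np.arange(1, 365, 8)                               # 8-day MODIS composites
nir = 0.25 + 0.30 * np.exp(-((doy - 60) / 40.0) ** 2)    # toy soy green-up
red = 0.12 - 0.06 * np.exp(-((doy - 60) / 40.0) ** 2)

index = wdrvi(nir, red)
window = (doy >= 30) & (doy <= 110)                      # critical development period
auc_metric = np.trapz(index[window], doy[window])
print("WDRVI area-under-curve over the window:", round(auc_metric, 2))
```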
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aubert, B.; Barate, R.; Boutigny, D.
2005-09-16
We report on a measurement of the Cabibbo-Kobayashi-Maskawa CP-violating phase γ through a Dalitz analysis of neutral D decays to K_S⁰π⁻π⁺ in the processes B± → D⁽*⁾K±, D* → Dπ⁰, Dγ. Using a sample of 227×10⁶ BB pairs collected by the BABAR detector, we measure the amplitude ratios r_B = 0.12 ± 0.08 ± 0.03 ± 0.04 and r_B* = 0.17 ± 0.10 ± 0.03 ± 0.03, the relative strong phases δ_B = (104 ± 45 +17/−21 +16/−24)° and δ_B* = (−64 ± 41 +14/−12 ± 15)° between the amplitudes A(B⁻ → D̄⁽*⁾⁰K⁻) and A(B⁻ → D⁽*⁾⁰K⁻), and γ = (70 ± 31 +12/−10 +14/−11)°. The first error is statistical, the second is the experimental systematic uncertainty, and the third reflects the Dalitz model uncertainty. The results for the strong and weak phases have a twofold ambiguity.
Evaluating the consistency of gene sets used in the analysis of bacterial gene expression data.
Tintle, Nathan L; Sitarik, Alexandra; Boerema, Benjamin; Young, Kylie; Best, Aaron A; Dejongh, Matthew
2012-08-08
Statistical analyses of whole genome expression data require functional information about genes in order to yield meaningful biological conclusions. The Gene Ontology (GO) and Kyoto Encyclopedia of Genes and Genomes (KEGG) are common sources of functionally grouped gene sets. For bacteria, the SEED and MicrobesOnline provide alternative, complementary sources of gene sets. To date, no comprehensive evaluation of the data obtained from these resources has been performed. We define a series of gene set consistency metrics directly related to the most common classes of statistical analyses for gene expression data, and then perform a comprehensive analysis of 3581 Affymetrix® gene expression arrays across 17 diverse bacteria. We find that gene sets obtained from GO and KEGG demonstrate lower consistency than those obtained from the SEED and MicrobesOnline, regardless of gene set size. Despite the widespread use of GO and KEGG gene sets in bacterial gene expression data analysis, the SEED and MicrobesOnline provide more consistent sets for a wide variety of statistical analyses. Increased use of the SEED and MicrobesOnline gene sets in the analysis of bacterial gene expression data may improve statistical power and utility of expression data.
Stanzel, Sven; Weimer, Marc; Kopp-Schneider, Annette
2013-06-01
High-throughput screening approaches are carried out for the toxicity assessment of a large number of chemical compounds. In such large-scale in vitro toxicity studies several hundred or thousand concentration-response experiments are conducted. The automated evaluation of concentration-response data using statistical analysis scripts saves time and yields more consistent results in comparison to data analysis performed by the use of menu-driven statistical software. Automated statistical analysis requires that concentration-response data are available in a standardised data format across all compounds. To obtain consistent data formats, a standardised data management workflow must be established, including guidelines for data storage, data handling and data extraction. In this paper two procedures for data management within large-scale toxicological projects are proposed. Both procedures are based on Microsoft Excel files as the researcher's primary data format and use a computer programme to automate the handling of data files. The first procedure assumes that data collection has not yet started whereas the second procedure can be used when data files already exist. Successful implementation of the two approaches into the European project ACuteTox is illustrated. Copyright © 2012 Elsevier Ltd. All rights reserved.
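In the spirit of the second procedure, a pandas sketch that sweeps pre-existing researcher Excel files into one standardized long-format table; the folder layout, sheet structure, and column names are assumptions for illustration, not the ACuteTox conventions.

```python
# Sketch: consolidating heterogeneous Excel files into a standardized
# concentration-response table. All file/column names are hypothetical.
from pathlib import Path
import pandas as pd

frames = []
for path in Path("raw_data").glob("*.xlsx"):
    raw = pd.read_excel(path, sheet_name=0)
    raw.columns = [c.strip().lower() for c in raw.columns]   # normalize headers
    raw = raw.rename(columns={"conc": "concentration_um", "resp": "response"})
    raw["compound"] = path.stem                              # file name = compound id
    frames.append(raw[["compound", "concentration_um", "response"]])

tidy = pd.concat(frames, ignore_index=True)
tidy.to_csv("standardized_concentration_response.csv", index=False)
```

Once every compound lives in the same long format, a single analysis script can fit all concentration-response curves without per-file adjustments, which is the consistency gain the paper argues for.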
Estimating total maximum daily loads with the Stochastic Empirical Loading and Dilution Model
Granato, Gregory; Jones, Susan Cheung
2017-01-01
The Massachusetts Department of Transportation (DOT) and the Rhode Island DOT are assessing and addressing roadway contributions to total maximum daily loads (TMDLs). Example analyses for total nitrogen, total phosphorus, suspended sediment, and total zinc in highway runoff were done by the U.S. Geological Survey in cooperation with FHWA to simulate long-term annual loads for TMDL analyses with the stochastic empirical loading and dilution model known as SELDM. Concentration statistics from 19 highway runoff monitoring sites in Massachusetts were used with precipitation statistics from 11 long-term monitoring sites to simulate long-term pavement yields (loads per unit area). Highway sites were stratified by traffic volume or surrounding land use to calculate concentration statistics for rural roads, low-volume highways, high-volume highways, and ultraurban highways. The median of the event mean concentration statistics in each traffic volume category was used to simulate annual yields from pavement for a 29- or 30-year period. Long-term average yields for total nitrogen, phosphorus, and zinc from rural roads are lower than yields from the other categories, but yields of sediment are higher than for the low-volume highways. The average yields of the selected water quality constituents from high-volume highways are 1.35 to 2.52 times the associated yields from low-volume highways. The average yields of the selected constituents from ultraurban highways are 1.52 to 3.46 times the associated yields from high-volume highways. Example simulations indicate that both concentration reduction and flow reduction by structural best management practices are crucial for reducing runoff yields.
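The stochastic idea can be caricatured in a few lines: draw event concentrations and runoff volumes from lognormal distributions and accumulate annual yields. This is a toy sketch of the general approach, not SELDM itself, and every parameter value is invented.

```python
# Toy sketch of stochastic loading (not SELDM): lognormal event concentrations
# and runoff volumes, accumulated into annual pavement yields.
import numpy as np

rng = np.random.default_rng(4)
n_years, events_per_year = 30, 40
conc_g_m3 = rng.lognormal(mean=0.0, sigma=0.9, size=(n_years, events_per_year))
runoff_m3_ha = rng.lognormal(mean=3.0, sigma=0.7, size=(n_years, events_per_year))

annual_yield = (conc_g_m3 * runoff_m3_ha).sum(axis=1)   # g per ha per year
print("long-term average yield (g/ha/yr):", annual_yield.mean().round(1))
print("90th-percentile year:", np.percentile(annual_yield, 90).round(1))
```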
NASA Technical Reports Server (NTRS)
George, Kerry; Rhone, Jordan; Chappell, L. J.; Cucinotta, F. A.
2010-01-01
Cytogenetic damage was assessed in blood lymphocytes from astronauts before and after they participated in long-duration space missions of three months or more. The frequency of chromosome damage was measured by fluorescence in situ hybridization (FISH) chromosome painting before flight and at various intervals from a few days to many months after return from the mission. For all individuals, the frequency of chromosome exchanges measured within a month of return from space was higher than their preflight yield. However, some individuals showed a temporal decline in chromosome damage with time after flight. Statistical analysis using combined data for all astronauts indicated a significant overall decreasing trend in total chromosome exchanges with time after flight, although this trend was not seen for all astronauts, and the yield of chromosome damage in some individuals actually increased with time after flight. The decreasing trend in total exchanges was slightly more significant when statistical analysis was restricted to data collected more than 220 days after return from flight. In addition, limited data on multiple flights show a lack of correlation between time in space and translocation yields. Data from three crewmembers who had participated in two separate long-duration space missions provide limited information on the effect of repeat flights and show a possible adaptive response to space radiation exposure.
Predictive data modeling of human type II diabetes related statistics
NASA Astrophysics Data System (ADS)
Jaenisch, Kristina L.; Jaenisch, Holger M.; Handley, James W.; Albritton, Nathaniel G.
2009-04-01
During the course of routine Type II diabetes treatment of one of the authors, it was decided to derive predictive analytical Data Models of the daily sampled vital statistics, namely weight, blood pressure, and blood sugar, to determine whether the covariance among the observed variables could yield a descriptive equation-based model or, better still, a predictive analytical model that could forecast the expected future trend of the variables and possibly reduce the number of finger sticks required to monitor blood sugar levels. The personal history and analysis with resulting models are presented.
AutoBayes Program Synthesis System Users Manual
NASA Technical Reports Server (NTRS)
Schumann, Johann; Jafari, Hamed; Pressburger, Tom; Denney, Ewen; Buntine, Wray; Fischer, Bernd
2008-01-01
Program synthesis is the systematic, automatic construction of efficient executable code from high-level declarative specifications. AutoBayes is a fully automatic program synthesis system for the statistical data analysis domain; in particular, it solves parameter estimation problems. It has seen many successful applications at NASA and is currently being used, for example, to analyze simulation results for Orion. The input to AutoBayes is a concise description of a data analysis problem composed of a parameterized statistical model and a goal that is a probability term involving parameters and input data. The output is optimized and fully documented C/C++ code computing the values for those parameters that maximize the probability term. AutoBayes can solve many subproblems symbolically rather than having to rely on numeric approximation algorithms, thus yielding effective, efficient, and compact code. Statistical analysis is faster and more reliable, because effort can be focused on model development and validation rather than manual development of solution algorithms and code.
Yue Xu, Selene; Nelson, Sandahl; Kerr, Jacqueline; Godbole, Suneeta; Patterson, Ruth; Merchant, Gina; Abramson, Ian; Staudenmayer, John; Natarajan, Loki
2018-04-01
Physical inactivity is a recognized risk factor for many chronic diseases. Accelerometers are increasingly used as an objective means to measure daily physical activity. One challenge in using these devices is missing data due to device nonwear. We used a well-characterized cohort of 333 overweight postmenopausal breast cancer survivors to examine missing data patterns of accelerometer outputs over the day. Based on these observed missingness patterns, we created pseudo-simulated datasets with realistic missing data patterns. We developed statistical methods to design imputation and variance weighting algorithms to account for missing data effects when fitting regression models. Bias and precision of each method were evaluated and compared. Our results indicated that not accounting for missing data in the analysis yielded unstable estimates in the regression analysis. Incorporating variance weights and/or subject-level imputation improved precision by >50%, compared to ignoring missing data. We recommend that these simple, easy-to-implement statistical tools be used to improve analysis of accelerometer data.
NASA Astrophysics Data System (ADS)
Campbell, B. D.; Higgins, S. R.
2008-12-01
Developing a method for bridging the gap between macroscopic and microscopic measurements of reaction kinetics at the mineral-water interface has important implications in geological and chemical fields. Investigating these reactions on the nanometer scale with SPM is often limited by image analysis and data extraction due to the large quantity of data usually obtained in SPM experiments. Here we present a computer algorithm for automated analysis of mineral-water interface reactions. This algorithm automates the analysis of sequential SPM images by identifying the kinetically active surface sites (i.e., step edges) and by tracking the displacement of these sites from image to image. The step edge positions in each image are readily identified and tracked through time by a standard edge detection algorithm followed by statistical analysis of the Hough transform of the edge-mapped image. By quantifying this displacement as a function of time, the rate of step edge displacement is determined. Furthermore, the total edge length, also determined from analysis of the Hough transform, combined with the computed step speed, yields the surface-area-normalized rate of the reaction. The algorithm was applied to a study of the spiral growth of the calcite(104) surface from supersaturated solutions, yielding results almost 20 times faster than performing the analysis by hand, with statistically similar results for the two methods. This advance in the analysis of kinetic data from SPM images will facilitate the building of experimental databases on the microscopic kinetics of mineral-water interface reactions.
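The two image-processing stages the algorithm chains together, edge detection followed by a Hough transform, can be sketched with scikit-image on a synthetic step-edge image; tracking the fitted line's distance parameter across frames would then give the step speed.

```python
# Sketch: edge detection + Hough transform on a synthetic SPM-like frame.
import numpy as np
from skimage.feature import canny
from skimage.transform import hough_line, hough_line_peaks

# synthetic "height image" with a step edge at x = 60
img = np.zeros((128, 128))
img[:, 60:] = 1.0
img += np.random.default_rng(5).normal(0, 0.05, img.shape)

edges = canny(img, sigma=2.0)                      # binary edge map
h, angles, dists = hough_line(edges)               # accumulate straight lines
for _, angle, dist in zip(*hough_line_peaks(h, angles, dists, num_peaks=1)):
    print(f"step edge at distance {dist:.1f} px, "
          f"orientation {np.degrees(angle):.1f} deg")
# repeating this per frame and differencing the distances gives step speed
```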
Local yield stress statistics in model amorphous solids
NASA Astrophysics Data System (ADS)
Barbot, Armand; Lerbinger, Matthias; Hernandez-Garcia, Anier; García-García, Reinaldo; Falk, Michael L.; Vandembroucq, Damien; Patinet, Sylvain
2018-03-01
We develop and extend a method presented by Patinet, Vandembroucq, and Falk [Phys. Rev. Lett. 117, 045501 (2016), 10.1103/PhysRevLett.117.045501] to compute the local yield stresses at the atomic scale in model two-dimensional Lennard-Jones glasses produced via differing quench protocols. This technique allows us to sample the plastic rearrangements in a nonperturbative manner for different loading directions on a well-controlled length scale. Plastic activity upon shearing correlates strongly with the locations of low yield stresses in the quenched states. This correlation is higher in more structurally relaxed systems. The distribution of local yield stresses is also shown to strongly depend on the quench protocol: the more relaxed the glass, the higher the local plastic thresholds. Analysis of the magnitude of local plastic relaxations reveals that stress drops follow exponential distributions, justifying the hypothesis of an average characteristic amplitude often conjectured in mesoscopic or continuum models. The amplitude of the local plastic rearrangements increases on average with the yield stress, regardless of the system preparation. The local yield stress varies with the shear orientation tested and strongly correlates with the plastic rearrangement locations when the system is sheared correspondingly. It is thus argued that plastic rearrangements are the consequence of shear transformation zones encoded in the glass structure that possess weak slip planes along different orientations. Finally, we justify the length scale employed in this work and extract the yield threshold statistics as a function of the size of the probing zones. This method makes it possible to derive physically grounded models of plasticity for amorphous materials by directly revealing the relevant details of the shear transformation zones that mediate this process.
Sunspot activity and influenza pandemics: a statistical assessment of the purported association.
Towers, S
2017-10-01
Since 1978, a series of papers in the literature have claimed to find a significant association between sunspot activity and the timing of influenza pandemics. This paper examines these analyses and attempts to recreate the three most recent statistical analyses, by Ertel (1994), Tapping et al. (2001), and Yeung (2006), which all purported to find a significant relationship between sunspot numbers and pandemic influenza. As will be discussed, each analysis had errors in the data. In addition, in each analysis arbitrary selections or assumptions were made, and the authors did not assess the robustness of their analyses to changes in those arbitrary assumptions. Varying the arbitrary assumptions to other, equally valid, assumptions negates the claims of significance. Indeed, an arbitrary selection made in one of the analyses appears to have resulted in almost maximal apparent significance; changing it only slightly yields a null result. This analysis applies statistically rigorous methodology to examine the purported sunspot/pandemic link, using more statistically powerful un-binned analysis methods rather than relying on arbitrarily binned data. The analyses are repeated using both the Wolf and Group sunspot numbers. In all cases, no statistically significant evidence of any association was found. While the focus of this particular analysis was the purported relationship of influenza pandemics to sunspot activity, the faults found in the past analyses are common pitfalls: inattention to analysis reproducibility and robustness assessment are problems in the sciences that are unfortunately not noted often enough in review.
Orientation-controlled parallel assembly at the air-water interface
NASA Astrophysics Data System (ADS)
Park, Kwang Soon; Hao Hoo, Ji; Baskaran, Rajashree; Böhringer, Karl F.
2012-10-01
This paper presents an experimental and theoretical study, with statistical analysis, of a high-yield, orientation-specific fluidic self-assembly process on a preprogrammed template. We demonstrate self-assembly of thin (less than a few hundred microns thick) parts, which is vital for many applications in miniaturized platforms but problematic for today's pick-and-place robots. The assembly proceeds row-by-row as the substrate is pulled up through an air-water interface. Experiments and analysis are presented with an emphasis on the combined effect of controlled surface waves and magnetic force. For various gap values between a magnet and Ni-patterned parts, magnetic force distributions are generated using Monte Carlo simulation and employed to predict assembly yield. An analysis of these distributions shows that a gradual decline in yield following the probability density function can be expected with degrading conditions. The experimentally determined critical magnetic force is in good agreement with a value derived from a model of competing forces acting on a part. A general set of design guidelines is also presented based on the developed model and experimental data.
Bem, Daryl; Tressoldi, Patrizio; Rabeyron, Thomas; Duggan, Michael
2016-01-01
In 2011, one of the authors (DJB) published a report of nine experiments in the Journal of Personality and Social Psychology purporting to demonstrate that an individual’s cognitive and affective responses can be influenced by randomly selected stimulus events that do not occur until after his or her responses have already been made and recorded, a generalized variant of the phenomenon traditionally denoted by the term precognition. To encourage replications, all materials needed to conduct them were made available on request. We here report a meta-analysis of 90 experiments from 33 laboratories in 14 countries which yielded an overall effect greater than 6 sigma, z = 6.40, p = 1.2 × 10⁻¹⁰, with an effect size (Hedges’ g) of 0.09. A Bayesian analysis yielded a Bayes factor of 5.1 × 10⁹, greatly exceeding the criterion value of 100 for “decisive evidence” in support of the experimental hypothesis. When DJB’s original experiments are excluded, the combined effect size for replications by independent investigators is 0.06, z = 4.16, p = 1.1 × 10⁻⁵, and the Bayes factor is 3,853, again exceeding the criterion for “decisive evidence.” The number of potentially unretrieved experiments required to reduce the overall effect size of the complete database to a trivial value of 0.01 is 544, and seven of eight additional statistical tests support the conclusion that the database is not significantly compromised by either selection bias or by intense “p-hacking”—the selective suppression of findings or analyses that failed to yield statistical significance. P-curve analysis, a recently introduced statistical technique, estimates the true effect size of the experiments to be 0.20 for the complete database and 0.24 for the independent replications, virtually identical to the effect size of DJB’s original experiments (0.22) and the closely related “presentiment” experiments (0.21). We discuss the controversial status of precognition and other anomalous effects collectively known as psi. PMID:26834996
The Assessment of Climatological Impacts on Agricultural Production and Residential Energy Demand
NASA Astrophysics Data System (ADS)
Cooter, Ellen Jean
The assessment of climatological impacts on selected economic activities is presented as a multi-step, interdisciplinary problem. The assessment process addressed explicitly in this report focuses on (1) user identification, (2) direct impact model selection, (3) methodological development, (4) product development and (5) product communication. Two user groups of major economic importance were selected for study: agriculture and gas utilities. The broad agricultural sector is further defined as U.S.A. corn production. The general category of utilities is narrowed to Oklahoma residential gas heating demand. The CERES physiological growth model was selected as the process model for corn production. The statistical analysis for corn production suggests that (1) although this is a statistically complex model, it can yield useful impact information, (2) as a result of output distributional biases, traditional statistical techniques are not adequate analytical tools, (3) the model yield distribution as a whole is probably non-Gaussian, particularly in the tails, and (4) there appear to be identifiable weekly patterns of forecasted yields throughout the growing season. Agricultural quantities developed include point yield impact estimates and distributional characteristics, geographic corn-weather distributions, return period estimates, decision-making criteria (confidence limits) and time series of indices. These products were communicated in economic terms through the use of a Bayesian decision example and an econometric model. The NBSLD energy load model was selected to represent residential gas heating consumption. A cursory statistical analysis suggests relationships among weather variables across the Oklahoma study sites. No linear trend was detected in "technology-free" modeled energy demand or input weather variables corresponding to the trend contained in observed state-level residential energy use. It is suggested that this trend is largely the result of non-weather factors such as population and home usage patterns rather than regional climate change. Year-to-year changes in modeled residential heating demand on the order of 10⁶ Btu per household were determined and later related to state-level components of the Oklahoma economy. Products developed include the definition of regional forecast areas, likelihood estimates of extreme seasonal conditions and an energy/climate index. This information is communicated in economic terms through an input/output model which is used to estimate changes in Gross State Product and household income attributable to weather variability.
NASA Astrophysics Data System (ADS)
Lee, K.; Leng, G.; Huang, M.; Sheffield, J.; Zhao, G.; Gao, H.
2017-12-01
Texas has the largest farm area in the US, and its revenue from crop production ranks third overall. With the changing climate, hydrological extremes such as droughts are becoming more frequent and intense, causing significant yield reduction in rainfed agricultural systems. The objective of this study is to investigate the potential impacts of agricultural drought on crop yields (corn, sorghum, and wheat) under a changing climate in Texas. The Variable Infiltration Capacity (VIC) model, calibrated and validated over 10 major Texas river basins during the historical period, is employed in this study. The model is forced by a set of statistically downscaled climate projections from Coupled Model Intercomparison Project Phase 5 (CMIP5) model ensembles at a spatial resolution of 1/8°. The CMIP5 projections contain four Representative Concentration Pathways (RCPs) that represent different greenhouse gas concentration trajectories (radiative forcing levels of 4.5 and 8.5 W/m² are selected in this study). To carry out the analysis, VIC simulations from 1950 to 2099 are first analyzed to investigate how the frequency and severity of agricultural droughts will be altered in Texas under a changing climate. Second, future crop yields are projected using a statistical crop model. Third, the effects of agricultural drought on crop yields are quantitatively analyzed. The results are expected to contribute to future water resources planning, with a goal of mitigating the negative impacts of future droughts on agricultural production in Texas.
Bae, Sangok; Shoda, Makoto
2005-04-05
Culture conditions in a jar fermentor for bacterial cellulose (BC) production from A. xylinum BPR2001 were optimized by statistical analysis using a Box-Behnken design. Response surface methodology was used to predict the levels of the factors fructose (X1), corn steep liquor (CSL) (X2), dissolved oxygen (DO) (X3), and agar concentration (X4). A total of 27 experimental runs combining the factor levels were carried out in a 10-L jar fermentor, and a three-dimensional response surface was generated to determine the effect of the factors and to find the optimum concentration of each factor for maximum BC production and BC yield. The fructose and agar concentrations highly influenced BC production and BC yield. However, the optimum conditions with respect to changes in CSL and DO concentrations were predicted at almost the central values of the tested ranges. The predicted results showed that BC production was 14.3 g/L under the condition of 4.99% fructose, 2.85% CSL, 28.33% DO, and 0.38% agar concentration. On the other hand, BC yield was predicted to be 0.34 g/g under the condition of 3.63% fructose, 2.90% CSL, 31.14% DO, and 0.42% agar concentration. Under optimized culture conditions, improvements in BC production and BC yield of 76% and 57%, respectively, were experimentally confirmed relative to the values before optimization of the culture conditions. Copyright (c) 2005 Wiley Periodicals, Inc.
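A compact illustration of the response-surface step: fit a second-order polynomial to (invented) run data and solve for the stationary point, which is how optimum factor levels fall out of a Box-Behnken-style design. Only two of the four factors are included to keep the sketch short.

```python
# Sketch: second-order response-surface fit and analytic optimum.
# Run data are invented; not the paper's fermentor measurements.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(6)
n = 27
df = pd.DataFrame({"fructose": rng.uniform(2, 7, n),
                   "agar": rng.uniform(0.1, 0.7, n)})
df["bc"] = (14 - (df.fructose - 5.0) ** 2 - 40 * (df.agar - 0.4) ** 2
            + rng.normal(0, 0.3, n))                  # toy BC production, g/L

fit = smf.ols("bc ~ fructose + I(fructose ** 2) + agar + I(agar ** 2)",
              data=df).fit()
b = fit.params
# stationary point of a quadratic: x* = -b1 / (2 * b2), per factor
opt_fructose = -b["fructose"] / (2 * b["I(fructose ** 2)"])
opt_agar = -b["agar"] / (2 * b["I(agar ** 2)"])
print(f"predicted optimum: fructose = {opt_fructose:.2f}%, agar = {opt_agar:.2f}%")
```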
NASA Astrophysics Data System (ADS)
Lee, J.; Kang, S.; Jang, K.; Ko, J.; Hong, S.
2012-12-01
Crop productivity is closely tied to food security, and several models have been developed to estimate crop yield by combining remote sensing data with carbon cycle processes. In the present study, we estimated crop GPP and NPP using an algorithm based on the LUE model and a simplified respiration model. The states of Iowa and Illinois were chosen as the study site for estimating crop yield over 5 years (2006-2010), as they form the main Corn Belt area in the US. The study focuses on developing crop-specific parameters for corn and soybean to estimate crop productivity and on yield mapping using satellite remote sensing data. We utilized 10 km spatial resolution daily meteorological data from WRF to provide meteorological variables on cloudy days; on clear-sky days, MODIS-based meteorological data were utilized to estimate daily GPP, NPP, and biomass. County-level statistics on yield, area harvested, and production were used to test model-predicted crop yield. The input meteorological variables estimated from MODIS and WRF showed good agreement with ground observations from six AmeriFlux tower sites in 2006: correlation coefficients ranged from 0.93 to 0.98 for Tmin and Tavg, from 0.68 to 0.85 for daytime mean VPD, and from 0.85 to 0.96 for daily shortwave radiation. We developed a county-specific crop conversion coefficient, i.e., the ratio of yield to biomass on DOY 260, and validated the estimated county-level crop yield against the statistical yield data. The estimated corn and soybean yields at the county level ranged from 671 to 1393 g m⁻² y⁻¹ and from 213 to 421 g m⁻² y⁻¹, respectively. The county-level yield estimates mostly showed errors of less than 10%. Furthermore, crop yields estimated at the state level and validated against the statistics data showed errors of less than 1%. Further analysis of the crop conversion coefficient was conducted for DOY 200 and DOY 280. For DOY 280, crop yield estimation showed better accuracy for soybean at the county level. Although DOY 200 resulted in lower accuracy (about 20% mean bias), it provides a useful tool for early forecasting of crop yield. The spatial accuracy of the estimated crop yield at the county level was improved by developing county-specific crop conversion coefficients. Our results indicate that aboveground crop biomass can be estimated successfully with the simple LUE and respiration models combined with MODIS data, and that the county-specific conversion coefficient differs across counties; applying region-specific conversion coefficients is therefore necessary to estimate crop yield with better accuracy.
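The LUE backbone of such pipelines is compact enough to write down: GPP is a maximal light-use efficiency scaled by temperature and VPD stress, fPAR, and incident PAR, with NPP and yield following from respiration and a conversion coefficient. The sketch below is a schematic with invented constants, not the authors' calibrated parameters.

```python
# Sketch of the light-use-efficiency (LUE) logic:
# GPP = eps_max * f(Tmin) * f(VPD) * fPAR * PAR; NPP = GPP - respiration;
# yield = conversion coefficient x accumulated biomass. All values are toy.
import numpy as np

def ramp(x, lo, hi):
    """Linear 0-1 scalar between lo and hi (clamped)."""
    return np.clip((x - lo) / (hi - lo), 0.0, 1.0)

eps_max = 2.5                 # g C / MJ, assumed maximal LUE for corn
tmin, vpd = 12.0, 1.2         # degC, kPa, one day's meteorology
par, fpar = 9.0, 0.8          # MJ/m2/day, MODIS-style fraction absorbed

gpp = eps_max * ramp(tmin, -8, 10) * (1 - ramp(vpd, 0.65, 4.6)) * fpar * par
npp = gpp - 0.45 * gpp        # crude maintenance + growth respiration share
biomass = npp * 120           # accumulate over ~120 days (constant, for brevity)
print(f"daily GPP {gpp:.2f} g C/m2, season biomass ~{biomass:.0f} g C/m2")
print("yield estimate = county-specific coefficient x biomass at DOY 260")
```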
A bayesian approach to classification criteria for spectacled eiders
Taylor, B.L.; Wade, P.R.; Stehn, R.A.; Cochrane, J.F.
1996-01-01
To facilitate decisions to classify species according to risk of extinction, we used Bayesian methods to analyze trend data for the Spectacled Eider, an arctic sea duck. Trend data from three independent surveys of the Yukon-Kuskokwim Delta were analyzed individually and in combination to yield posterior distributions for population growth rates. We used classification criteria developed by the recovery team for Spectacled Eiders that seek to equalize errors of under- or overprotecting the species. We conducted both a Bayesian decision analysis and a frequentist (classical statistical inference) decision analysis. Bayesian decision analyses are computationally easier, yield basically the same results, and yield results that are easier to explain to nonscientists. With the exception of the aerial survey analysis of the 10 most recent years, both Bayesian and frequentist methods indicated that an endangered classification is warranted. The discrepancy between surveys warrants further research. Although the trend data are abundance indices, we used a preliminary estimate of absolute abundance to demonstrate how to calculate extinction distributions using the joint probability distributions for population growth rate and variance in growth rate generated by the Bayesian analysis. Recent apparent increases in abundance highlight the need for models that apply to declining and then recovering species.
NASA Technical Reports Server (NTRS)
Parada, N. D. J. (Principal Investigator); Cappelletti, C. A.
1982-01-01
A stratification oriented to crop area and yield estimation problems was performed using a clustering algorithm. The variables used were a set of agroclimatological characteristics measured in each of the 232 municipalities of the State of Rio Grande do Sul, Brazil. A nonhierarchical cluster analysis was used, and the pseudo-F statistic criterion was implemented for determining the "cut point" in the number of strata.
Heterogeneous variances in multi-environment yield trials for corn hybrids
USDA-ARS?s Scientific Manuscript database
Recent developments in statistics and computing have enabled much greater levels of complexity in statistical models of multi-environment yield trial data. One particular feature of interest to breeders is simultaneously modeling heterogeneity of variances among environments and cultivars. Our obj...
Guo, Jing; Yuan, Yahong; Dou, Pei; Yue, Tianli
2017-10-01
Fifty-one kiwifruit juice samples of seven kiwifruit varieties from five regions in China were analyzed to determine their polyphenols contents and to trace fruit varieties and geographical origins by multivariate statistical analysis. Twenty-one polyphenols belonging to four compound classes were determined by ultra-high-performance liquid chromatography coupled with ultra-high-resolution TOF mass spectrometry. (-)-Epicatechin, (+)-catechin, procyanidin B1 and caffeic acid derivatives were the predominant phenolic compounds in the juices. Principal component analysis (PCA) allowed a clear separation of the juices according to kiwifruit varieties. Stepwise linear discriminant analysis (SLDA) yielded satisfactory categorization of samples, provided 100% success rate according to kiwifruit varieties and 92.2% success rate according to geographical origins. The result showed that polyphenolic profiles of kiwifruit juices contain enough information to trace fruit varieties and geographical origins. Copyright © 2017 Elsevier Ltd. All rights reserved.
Random Forests for Global and Regional Crop Yield Predictions.
Jeong, Jig Han; Resop, Jonathan P; Mueller, Nathaniel D; Fleisher, David H; Yun, Kyungdahm; Butler, Ethan E; Timlin, Dennis J; Shim, Kyo-Moon; Gerber, James S; Reddy, Vangimalla R; Kim, Soo-Hyung
2016-01-01
Accurate predictions of crop yield are critical for developing effective agricultural and food policies at the regional and global scales. We evaluated a machine-learning method, Random Forests (RF), for its ability to predict crop yield responses to climate and biophysical variables at global and regional scales in wheat, maize, and potato in comparison with multiple linear regressions (MLR) serving as a benchmark. We used crop yield data from various sources and regions for model training and testing: 1) gridded global wheat grain yield, 2) maize grain yield from US counties over thirty years, and 3) potato tuber and maize silage yield from the northeastern seaboard region. RF was found highly capable of predicting crop yields and outperformed MLR benchmarks in all performance statistics that were compared. For example, the root mean square errors (RMSE) ranged between 6 and 14% of the average observed yield with RF models in all test cases whereas these values ranged from 14% to 49% for MLR models. Our results show that RF is an effective and versatile machine-learning method for crop yield predictions at regional and global scales for its high accuracy and precision, ease of use, and utility in data analysis. RF may result in a loss of accuracy when predicting the extreme ends or responses beyond the boundaries of the training data.
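A miniature of the comparison, assuming synthetic features in place of the climate and biophysical inputs: fit Random Forest and multiple linear regression on a nonlinear yield response and report RMSE as a percentage of mean yield, the paper's headline statistic.

```python
# Sketch: Random Forest vs. multiple linear regression for yield prediction.
# Features and data are synthetic stand-ins for climate/biophysical inputs.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(7)
n = 1000
X = rng.uniform(size=(n, 4))                       # e.g., temp, precip, N, soil
y = 5 + 3 * X[:, 0] - 2 * X[:, 1] ** 2 + 4 * X[:, 2] * X[:, 3] \
    + rng.normal(0, 0.3, n)                        # nonlinear "yield" response

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)
for name, model in [("RF", RandomForestRegressor(n_estimators=300, random_state=0)),
                    ("MLR", LinearRegression())]:
    pred = model.fit(Xtr, ytr).predict(Xte)
    rmse = mean_squared_error(yte, pred) ** 0.5
    print(f"{name}: RMSE = {100 * rmse / yte.mean():.1f}% of mean yield")
```

On a response with interactions and curvature like this one, the forest's RMSE lands well below the linear benchmark, mirroring the gap the paper reports.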
Tuttolomondo, Teresa; Dugo, Giacomo; Ruberto, Giuseppe; Leto, Claudio; Napoli, Edoardo M; Cicero, Nicola; Gervasi, Teresa; Virga, Giuseppe; Leone, Raffaele; Licata, Mario; La Bella, Salvatore
2015-01-01
In this study the chemical characterisation of the essential oils of 10 Sicilian Rosmarinus officinalis L. biotypes is reported. The main goal of this work was to analyse the relationship between essential oil yield and the geographical distribution of the species' plants. The essential oils were analysed by GC-FID and GC-MS. Hierarchical cluster analysis and principal component analysis were used to cluster biotypes according to the chemical composition of the essential oils. The essential oil yield ranged from 0.8 to 2.3% (v/w). In total, 82 compounds were identified, representing 96.7-99.9% of the essential oil. The most represented compounds in the essential oils were 1,8-cineole, linalool, α-terpineol, verbenone, α-pinene, limonene, bornyl acetate and terpinolene. The results show that the essential oil yield of the 10 biotypes is affected by the environmental characteristics of the sampling sites, while the chemical composition is linked to the genetic characteristics of the different biotypes.
Escorza-Treviño, S; Dizon, A E
2000-08-01
Mitochondrial DNA (mtDNA) control-region sequences and microsatellite loci length polymorphisms were used to estimate phylogeographical patterns (historical patterns underlying contemporary distribution), intraspecific population structure and gender-biased dispersal of Phocoenoides dalli dalli across its entire range. One hundred and thirteen animals from several geographical strata were sequenced over 379 bp of mtDNA, resulting in 58 mtDNA haplotypes. Analysis using F_ST values (based on haplotype frequencies) and Φ_ST values (based on frequencies and genetic distances between haplotypes) yielded statistically significant separation (bootstrap values P < 0.05) among most of the stocks currently used for management purposes. A minimum spanning network of haplotypes showed two very distinctive clusters, differentially occupied by western and eastern populations, with some common widespread haplotypes. This suggests some degree of phyletic radiation from west to east, superimposed on gene flow. Highly male-biased migration was detected for several population comparisons. Nuclear microsatellite DNA markers (119 individuals and six loci) provided additional support for the population subdivision and gender-biased dispersal detected in the mtDNA sequences. Analysis using F_ST values (based on allelic frequencies) yielded statistically significant separation between some, but not all, populations distinguished by mtDNA analysis. R_ST values (based on frequencies of and genetic distances between alleles) showed no statistically significant subdivision. Again, highly male-biased dispersal was detected for all population comparisons, suggesting, together with morphological and reproductive data, the existence of sexual selection. Our molecular results argue for nine distinct dalli-type populations that should be treated as separate units for management purposes.
Ashengroph, Morahem; Ababaf, Sajad
2014-12-01
Microbial caffeine removal is a green solution for the treatment of caffeinated products and agro-industrial effluents. We directed this investigation toward optimizing a bio-decaffeination process with growing cultures of Pseudomonas pseudoalcaligenes through Taguchi methodology, a structured statistical approach that can lower variation in a process through design of experiments (DOE). Five parameters, i.e., initial fructose, tryptone, Zn²⁺ ion and caffeine concentrations, as well as incubation time, were selected, and an L16 orthogonal array was applied to design experiments with four 4-level factors and one 3-level factor (4⁴ × 3¹). Data analysis was performed using the statistical analysis of variance (ANOVA) method. Furthermore, the optimal conditions were determined by combining the optimal levels of the significant factors and verified by a confirming experiment. Measurement of the residual caffeine concentration in the reaction mixture was performed using high-performance liquid chromatography (HPLC). Use of the Taguchi methodology for optimization of design parameters resulted in about 86.14% reduction of caffeine in 48 h of incubation when 5 g/l fructose, 3 mM Zn²⁺ ion and 4.5 g/l of caffeine are present in the designed media. Under the optimized conditions, the yield of degradation of caffeine (4.5 g/l) by the native strain of Pseudomonas pseudoalcaligenes TPS8 increased from 15.8% to 86.14%, which is 5.4-fold higher than the normal yield. According to the experimental results, Taguchi methodology provides a powerful tool for identifying the favorable parameters for caffeine removal using strain TPS8, which suggests that the approach also has potential application with similar strains to improve the yield of caffeine removal from caffeine-containing solutions.
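The Taguchi analysis step reduces to computing a signal-to-noise ratio per run and averaging it by factor level to rank main effects; the sketch below does this for a larger-is-better response on a small invented design (not the paper's L16 array).

```python
# Sketch: Taguchi-style main-effects analysis with larger-is-better S/N ratios.
# The mini design and responses are invented for illustration.
import numpy as np
import pandas as pd

runs = pd.DataFrame({
    "fructose_g_l": [1, 1, 5, 5, 1, 1, 5, 5],
    "zn_mM":        [0, 3, 0, 3, 0, 3, 0, 3],
    "time_h":       [24, 24, 24, 24, 48, 48, 48, 48],
    "degradation":  [22, 35, 30, 51, 40, 60, 55, 86],   # % caffeine removed
})
# larger-is-better S/N: -10 * log10(mean(1/y^2)); one observation per run here
runs["sn"] = -10 * np.log10(1.0 / runs["degradation"] ** 2)

for factor in ["fructose_g_l", "zn_mM", "time_h"]:
    effects = runs.groupby(factor)["sn"].mean()
    print(factor, "level means:", effects.round(2).to_dict(),
          "| range:", round(effects.max() - effects.min(), 2))
```

The factor with the largest range of level-mean S/N is the dominant one, and the best level per factor combines into the predicted optimum, which the confirming experiment then checks.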
Microscopic analysis of currency and stock exchange markets.
Kador, L
1999-08-01
Recently it was shown that distributions of short-term price fluctuations in foreign-currency exchange exhibit striking similarities to those of velocity differences in turbulent flows. Similar profiles represent the spectral-diffusion behavior of impurity molecules in disordered solids at low temperatures. It is demonstrated that a microscopic statistical theory of the spectroscopic line shapes can be applied to the other two phenomena. The theory interprets the financial data in terms of information which becomes available to the traders and their reactions as a function of time. The analysis shows that there is no characteristic time scale in financial markets, but that instead stretched-exponential or algebraic memory functions yield good agreement with the price data. For an algebraic function, the theory yields truncated Lévy distributions which are often observed in stock exchange markets.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Amo Sanchez, P. del; Lees, J. P.; Poireau, V.
2010-09-17
We report the measurement of the Cabibbo-Kobayashi-Maskawa CP-violating angle γ through a Dalitz plot analysis of neutral D-meson decays to K_S⁰π⁺π⁻ and K_S⁰K⁺K⁻ produced in the processes B± → DK±, B± → D*K± with D* → Dπ⁰, Dγ, and B± → DK*± with K*± → K_S⁰π±, using 468 million BB pairs collected by the BABAR detector at the PEP-II asymmetric-energy e⁺e⁻ collider at SLAC. We measure γ = (68 ± 14 ± 4 ± 3)° (modulo 180°), where the first error is statistical, the second is the experimental systematic uncertainty, and the third reflects the uncertainty in the description of the neutral D decay amplitudes. This result is inconsistent with γ = 0 (no direct CP violation) with a significance of 3.5 standard deviations.
Cardona, Samir Julián Calvo; Cadavid, Henry Cardona; Corrales, Juan David; Munilla, Sebastián; Cantet, Rodolfo J C; Rogberg-Muñoz, Andrés
2016-09-01
The κ-casein (CSN-3) and β-lactoglobulin (BLG) genes are extensively polymorphic in ruminants. Several association studies have estimated the effects of polymorphisms in these genes on milk yield, milk composition, and cheese-manufacturing properties. Usually, these results are based on production integrated over the lactation curve or on cross-sectional studies at specific days in milk (DIM). However, as differential expression of milk protein genes occurs over lactation, the effect of the polymorphisms may change over time. In this study, we fitted a mixed-effects regression model to test-day records of milk yield and milk quality traits (fat, protein, and total solids yields) from Colombian tropical dairy goats. We used the well-characterized A/B polymorphisms in the CSN-3 and BLG genes. We argued that this approach provided more efficient estimators than cross-sectional designs, given the same number and pattern of observations, and allowed exclusion of between-subject variation from model error. The BLG genotype AA showed a greater performance than the BB genotype for all traits along the whole lactation curve, whereas the heterozygote showed an intermediate performance. We observed no such constant pattern for the CSN-3 gene between the AA homozygote and the heterozygote (the BB genotype was absent from the sample). The differences among the genotypic effects of the BLG and the CSN-3 polymorphisms were statistically significant during peak and mid lactation (around 40-160 DIM) for the BLG gene and only for mid lactation (80-145 DIM) for the CSN-3 gene. We also estimated the additive and dominant effects of the BLG locus. The locus showed a statistically significant additive behavior along the whole lactation trajectory for all quality traits, whereas for milk yield the effect was not significant at later stages. In turn, we detected a statistically significant dominance effect only for fat yield in the early and peak stages of lactation (at about 1-45 DIM). The longitudinal analysis of test-day records allowed us to estimate the differential effects of polymorphisms along the lactation curve, pointing toward stages that could be affected by the gene. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Jaffke, Patrick; Möller, Peter; Stetcu, Ionel; Talou, Patrick; Schmitt, Christelle
2018-03-01
We implement fission fragment yields, calculated using Brownian shape motion on a macroscopic-microscopic potential energy surface in six dimensions, into the Hauser-Feshbach statistical decay code CGMF. This combination allows us to test the impact of utilizing theoretically calculated fission fragment yields on the subsequent prompt neutron and γ-ray emission. We draw connections between the fragment yields and the total kinetic energy (TKE) of the fission fragments and demonstrate that the use of calculated yields can introduce a difference in the 〈TKE〉 and, thus, the prompt neutron multiplicity.
Extraction of the proton radius from electron-proton scattering data
Lee, Gabriel; Arrington, John R.; Hill, Richard J.
2015-07-27
We perform a new analysis of electron-proton scattering data to determine the proton electric and magnetic radii, enforcing model-independent constraints from form factor analyticity. A wide-ranging study of possible systematic effects is performed. An improved analysis is developed that rebins data taken at identical kinematic settings and avoids a scaling assumption of systematic errors with statistical errors. Employing standard models for radiative corrections, our improved analysis of the 2010 Mainz A1 Collaboration data yields a proton electric radius r E = 0.895(20) fm and magnetic radius r M = 0.776(38) fm. A similar analysis applied to world data (excluding Mainz data) implies r E = 0.916(24) fm and r M = 0.914(35) fm. The Mainz and world values of the charge radius are consistent, and a simple combination yields a value r E = 0.904(15) fm that is 4σ larger than the CREMA Collaboration muonic hydrogen determination. The Mainz and world values of the magnetic radius differ by 2.7σ, and a simple average yields r M = 0.851(26) fm. Finally, the circumstances under which published muonic hydrogen and electron scattering data could be reconciled are discussed, including a possible deficiency in the standard radiative correction model, which requires further analysis.
On the analysis of very small samples of Gaussian repeated measurements: an alternative approach.
Westgate, Philip M; Burchett, Woodrow W
2017-03-15
The analysis of very small samples of Gaussian repeated measurements can be challenging. First, due to a very small number of independent subjects contributing outcomes over time, statistical power can be quite small. Second, nuisance covariance parameters must be appropriately accounted for in the analysis in order to maintain the nominal test size. However, available statistical strategies that ensure valid statistical inference may lack power, whereas more powerful methods may have the potential for inflated test sizes. Therefore, we explore an alternative approach to the analysis of very small samples of Gaussian repeated measurements, with the goal of maintaining valid inference while also improving statistical power relative to other valid methods. This approach uses generalized estimating equations with a bias-corrected empirical covariance matrix that accounts for all small-sample aspects of nuisance correlation parameter estimation in order to maintain valid inference. Furthermore, the approach utilizes correlation selection strategies with the goal of choosing the working structure that will result in the greatest power. In our study, we show that when accurate modeling of the nuisance correlation structure impacts the efficiency of regression parameter estimation, this method can improve power relative to existing methods that yield valid inference. Copyright © 2017 John Wiley & Sons, Ltd.
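The approach described maps naturally onto generalized estimating equations with a small-sample-corrected empirical covariance. As a rough sketch (not the authors' implementation), statsmodels' GEE can be fitted with its bias-reduced covariance option; the simulated data set, group sizes, and exchangeable working structure below are assumptions for illustration.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Simulate a very small sample: 8 subjects, 4 repeated Gaussian outcomes each.
rng = np.random.default_rng(1)
n_sub, n_rep = 8, 4
subj = np.repeat(np.arange(n_sub), n_rep)
time = np.tile(np.arange(n_rep), n_sub)
u = rng.normal(scale=1.0, size=n_sub)          # subject-level random effect
y = 1.0 + 0.5 * time + u[subj] + rng.normal(scale=0.5, size=n_sub * n_rep)

df = pd.DataFrame({"y": y, "time": time, "subject": subj})

# GEE with an exchangeable working correlation; cov_type="bias_reduced"
# requests the small-sample-corrected empirical covariance matrix.
model = sm.GEE.from_formula("y ~ time", groups="subject", data=df,
                            cov_struct=sm.cov_struct.Exchangeable(),
                            family=sm.families.Gaussian())
result = model.fit(cov_type="bias_reduced")
print(result.summary())
```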
Search for correlation between geomagnetic disturbances and mortality
NASA Technical Reports Server (NTRS)
Lipa, B. J.; Sturrock, P. A.; Rogot, F.
1976-01-01
A search is conducted for a possible correlation between solar activity and myocardial infarction and stroke in the United States. A statistical analysis is performed using data on geomagnetic activity and the daily U.S. mortality due to coronary heart disease and stroke for the years 1962 through 1966. None of the results yields any evidence of a correlation. It is concluded that the correlations claimed by Soviet workers between geomagnetic activity and the incidence of various human diseases are probably either not statistically significant or not due to a causal relation between geomagnetic activity and disease.
Impacts of climate change and inter-annual variability on cereal crops in China from 1980 to 2008.
Zhang, Tianyi; Huang, Yao
2012-06-01
Negative climate impacts on crop yields increase pressures on food security in China. In this study, climatic impacts on cereal yields (rice, wheat and maize) were investigated by analyzing climate-yield relationships from 1980 to 2008. Results indicated that warming was significant, but trends in precipitation and solar radiation were not statistically significant in most of China. In general, maize is particularly sensitive to warming. However, increases in temperature were correlated with both lower and higher yields of rice and wheat, which is inconsistent with the current view that warming results in a decline in yields. Further analysis of the three cereal crops suggested that the reduction in yields at higher temperatures was accompanied by lower precipitation, mainly in northern parts of China, suggesting that droughts reduced yields through a lack of water. Similarly, a positive correlation between temperature and yield can alternatively be explained by the effect of solar radiation, mainly in the southern part of China where water resources are abundant. Overall, our study suggests that it is inter-annual variations in precipitation and solar radiation that have driven changes in cereal yields in China over the last three decades. Copyright © 2011 Society of Chemical Industry.
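One standard way to isolate the inter-annual signal emphasized in this abstract is to correlate first differences of the yield and climate series, which removes smooth trends. The sketch below illustrates the idea on fabricated annual series; none of the numbers are from the paper.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(2)
years = np.arange(1980, 2009)

# Made-up annual series: a warming trend plus noise, and a yield series
# whose year-to-year wiggles partly track precipitation.
temp = 0.03 * (years - 1980) + rng.normal(scale=0.3, size=years.size)
precip = 600 + rng.normal(scale=60, size=years.size)
yield_t = (4.0 + 0.04 * (years - 1980) + 0.004 * (precip - 600)
           + rng.normal(scale=0.15, size=years.size))

# First-differencing removes smooth trends, so the correlation reflects
# inter-annual (year-to-year) co-variation only.
d_yield, d_temp, d_precip = np.diff(yield_t), np.diff(temp), np.diff(precip)
for name, series in [("temperature", d_temp), ("precipitation", d_precip)]:
    r, p = pearsonr(series, d_yield)
    print(f"d(yield) vs d({name}): r = {r:+.2f}, p = {p:.3f}")
```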
NASA Astrophysics Data System (ADS)
Papadavid, G.; Hadjimitsis, D.
2014-08-01
The development of remote sensing techniques has provided the opportunity to optimize yields in agricultural practice and, moreover, to predict the forthcoming yield. Yield prediction plays a vital role in agricultural policy and provides useful data to policy makers. In this context, crop and soil parameters, along with the NDVI index, which are valuable sources of information, were analyzed statistically to test (a) whether Durum wheat yield can be predicted and (b) the actual time window for predicting the yield in the district of Paphos, where Durum wheat is the main cultivation and supports the rural economy of the area. Fifteen plots cultivated with Durum wheat by the Agricultural Research Institute of Cyprus for research purposes in the area of interest were observed for three years to derive the necessary data. Statistical and remote sensing techniques were then applied to derive and map a model that can predict the yield of Durum wheat in this area. Indeed, the semi-empirical model developed for this purpose, with a very high coefficient of determination (R² = 0.886), has shown in practice that it can predict yields well. A Student's t-test revealed no statistically significant difference between predicted and observed yield values. The model can and will be further elaborated with more parameters and applied to other crops in the near future.
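As an illustration of the kind of yield-prediction workflow described (not the authors' actual model), the sketch below regresses plot yield on a single NDVI predictor and then applies the paired t-test used to compare predicted and observed yields; all plot values are hypothetical.

```python
import numpy as np
from scipy import stats

# Hypothetical plot-level data: peak-season NDVI and observed Durum
# wheat yield (t/ha) for 15 plots; values are illustrative only.
ndvi = np.array([0.42, 0.55, 0.61, 0.48, 0.58, 0.66, 0.52, 0.45,
                 0.63, 0.57, 0.50, 0.68, 0.54, 0.47, 0.60])
yield_obs = np.array([1.9, 2.8, 3.2, 2.3, 3.0, 3.6, 2.6, 2.1,
                      3.3, 2.9, 2.5, 3.7, 2.7, 2.2, 3.1])

# Ordinary least-squares fit of yield on NDVI (a one-predictor stand-in
# for the paper's semi-empirical model).
slope, intercept, r, p, se = stats.linregress(ndvi, yield_obs)
yield_pred = intercept + slope * ndvi
print(f"R^2 = {r**2:.3f}")

# Paired t-test: no significant difference between predicted and
# observed yields supports using the model for prediction.
t_stat, p_val = stats.ttest_rel(yield_pred, yield_obs)
print(f"t = {t_stat:.3f}, p = {p_val:.3f}")
```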
ERIC Educational Resources Information Center
Hartsoe, Joseph K.; Barclay, Susan R.
2017-01-01
The purpose of this study was to investigate faculty belief, knowledge, and confidence in the principles of Universal Design for Instruction (UDI). Results yielded statistically significant correlations between participants' belief and knowledge of the principles of UDI. Furthermore, findings yielded statistically significant differences between…
Texture analysis with statistical methods for wheat ear extraction
NASA Astrophysics Data System (ADS)
Bakhouche, M.; Cointault, F.; Gouton, P.
2007-01-01
In the agronomic domain, simplifying crop counting, which is necessary for yield prediction and agronomic studies, is an important project for technical institutes such as Arvalis. Although the main objective of our overall project is to design a mobile robot for natural image acquisition directly in the field, Arvalis first asked us to detect the number of wheat ears in images by image processing before counting them, which yields the first component of the yield. In this paper we compare different texture-based image segmentation techniques relying on feature extraction by first- and higher-order statistical methods, applied to our images. The extracted features are used for unsupervised pixel classification to obtain the different classes in the image. The K-means algorithm is applied before choosing a threshold to highlight the ears. Three methods have been tested in this feasibility study, with an average error of 6%. Although the quality of the detection is currently evaluated visually, automatic evaluation algorithms are being implemented. Moreover, other higher-order statistical methods will be implemented in the future, jointly with methods based on spatio-frequential transforms and specific filtering.
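A minimal version of the described pipeline, first-order texture statistics followed by K-means pixel classification and thresholding, might look like the following sketch; the synthetic image, window size, and cluster count are assumptions, and the paper's higher-order features are omitted.

```python
import numpy as np
from scipy.ndimage import uniform_filter
from sklearn.cluster import KMeans

def first_order_features(img, win=9):
    """Local mean and variance over a sliding window (first-order statistics)."""
    mean = uniform_filter(img, size=win)
    mean_sq = uniform_filter(img ** 2, size=win)
    var = np.clip(mean_sq - mean ** 2, 0, None)
    return mean, var

# Synthetic grayscale field image: a bright, high-variance patch stands in
# for wheat ears on a smoother canopy background.
rng = np.random.default_rng(3)
img = rng.normal(0.4, 0.05, size=(128, 128))
img[40:60, 30:50] = rng.normal(0.8, 0.15, size=(20, 20))   # "ear" region

mean, var = first_order_features(img)
features = np.stack([mean.ravel(), var.ravel()], axis=1)

# Unsupervised pixel classification with K-means, then keep the cluster
# with the brightest mean as the candidate "ear" class.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)
ear_cluster = np.argmax([mean.ravel()[labels == k].mean() for k in range(3)])
ear_mask = (labels == ear_cluster).reshape(img.shape)
print(f"pixels flagged as ears: {ear_mask.sum()}")
```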
NASA Astrophysics Data System (ADS)
Santos, João A.; Malheiro, Aureliano C.; Karremann, Melanie K.; Pinto, Joaquim G.
2011-03-01
The impact of projected climate change on wine production was analysed for the Demarcated Region of Douro, Portugal. A statistical grapevine yield model (GYM) was developed using climate parameters as predictors. Statistically significant correlations were identified between annual yield and monthly mean temperatures and monthly precipitation totals during the growing cycle. These atmospheric factors control grapevine yield in the region, with the GYM explaining 50.4% of the total variance in the yield time series in recent decades. Anomalously high March rainfall (during budburst, shoot and inflorescence development) favours yield, as well as anomalously high temperatures and low precipitation amounts in May and June (May: flowering and June: berry development). The GYM was applied to a regional climate model output, which was shown to realistically reproduce the GYM predictors. Finally, using ensemble simulations under the A1B emission scenario, projections for GYM-derived yield in the Douro Region, and for the whole of the twenty-first century, were analysed. A slight upward trend in yield is projected to occur until about 2050, followed by a steep and continuous increase until the end of the twenty-first century, when yield is projected to be about 800 kg/ha above current values. While this estimate is based on meteorological parameters alone, changes due to elevated CO2 may further enhance this effect. In spite of the associated uncertainties, it can be stated that projected climate change may significantly benefit wine yield in the Douro Valley.
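A statistical yield model of the GYM type is essentially a multiple regression of annual yield on monthly climate predictors. The sketch below shows the general pattern with statsmodels on invented data; the predictor set and coefficients are illustrative, not the paper's.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical annual predictors for a grapevine yield model:
# March precipitation and May/June temperature and precipitation.
rng = np.random.default_rng(4)
n = 30
mar_precip = rng.gamma(4, 20, n)
mayjun_temp = rng.normal(18, 1.5, n)
mayjun_precip = rng.gamma(3, 15, n)
yield_kg_ha = (3000 + 4 * mar_precip + 120 * (mayjun_temp - 18)
               - 6 * mayjun_precip + rng.normal(0, 250, n))

X = sm.add_constant(np.column_stack([mar_precip, mayjun_temp, mayjun_precip]))
fit = sm.OLS(yield_kg_ha, X).fit()

# R-squared plays the role of the explained variance quoted for the GYM
# (50.4% in the abstract); coefficient signs mirror the described effects.
print(f"R^2 = {fit.rsquared:.3f}")
print(fit.params)
```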
Selection of Drought Tolerant Maize Hybrids Using Path Coefficient Analysis and Selection Index.
Dao, Abdalla; Sanou, Jacob; V S Traore, Edgar; Gracen, Vernon; Danquah, Eric Y
2017-01-01
In drought-prone environments, direct selection for yield is not adequate because of the variable environment and genotype × environment interaction. Therefore, the use of secondary traits in addition to yield has been suggested. The relative usefulness of secondary traits as indirect selection criteria for maize grain yield is determined by the magnitudes of their genetic variance, heritability and genetic correlation with grain yield. Forty-eight testcross hybrids derived from lines with different genetic backgrounds and geographical origins, plus 7 checks, were evaluated under both well-watered and water-stressed conditions over two years for grain yield and secondary traits, to determine the most appropriate secondary traits and to select drought-tolerant hybrids. The study found that the broad-sense heritability of grain yield and ears per plant (EPP) increased under drought stress. Ear aspect (EASP) and ear height (EHT) had larger correlation coefficients and direct effects on grain yield, but in opposite directions, negative and positive respectively. Traits such as EPP, tassel size (TS) and plant recovery (PR) contributed to increased yield via EASP through a large negative indirect effect. Under drought stress, EHT had a positive and high direct effect and a negative indirect effect via plant height on grain yield, indicating that the ratio between ear and plant heights (R-EPH) was associated with grain yield. Path coefficient analysis showed that the traits EPP, TS, PR, EASP and R-EPH were important secondary traits in the present experiment. These traits were used in a selection index to classify hybrids according to their performance under drought. The selection procedure also included a relative decrease in yield (RDY) index. Some secondary traits reported elsewhere as significant selection criteria under drought stress were not confirmed in the present study. This is because the relationship between grain yield and secondary traits can be affected by various factors, including germplasm, environment and the applied statistical analysis. Therefore, different traits and selection procedures should be applied in the selection process of drought-tolerant genotypes for diverse genetic materials and growing conditions.
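Path coefficient analysis decomposes each trait's correlation with yield into a direct effect (a standardized partial regression coefficient) and indirect effects routed through the other traits; the direct effects solve the linear system R_xx · P = r_xy. The sketch below demonstrates this on an invented correlation matrix for three of the traits named above.

```python
import numpy as np

# Hypothetical correlation structure among three secondary traits
# (EPP, EASP, EHT) and their correlations with grain yield.
R_xx = np.array([[ 1.00, -0.45,  0.20],   # trait-trait correlations
                 [-0.45,  1.00, -0.10],
                 [ 0.20, -0.10,  1.00]])
r_xy = np.array([0.55, -0.60, 0.40])      # trait-yield correlations

# Direct effects (path coefficients): solve R_xx @ P = r_xy.
P = np.linalg.solve(R_xx, r_xy)

# The indirect effect of trait i via trait j is r_ij * P_j (j != i);
# direct plus indirect effects recover the total correlation with yield.
for i, name in enumerate(["EPP", "EASP", "EHT"]):
    indirect = sum(R_xx[i, j] * P[j] for j in range(3) if j != i)
    print(f"{name}: direct = {P[i]:+.3f}, indirect = {indirect:+.3f}, "
          f"total r = {P[i] + indirect:+.3f}")
```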
NASA Astrophysics Data System (ADS)
Tien, Hai Minh; Le, Kien Anh; Le, Phung Thi Kim
2017-09-01
Biohydrogen is a sustainable energy resource due to its potentially higher efficiency of conversion to usable power, its high energy efficiency and its non-polluting nature. In this work, experiments were carried out to demonstrate the possibility of generating biohydrogen from cassava starch and to identify the effective factors and optimum conditions. An experimental design was used to investigate the effect of operating temperature (37-43 °C), pH (6-7), and inoculum ratio (6-10 %) on the hydrogen yield, the COD reduction, and the ratio of the volume of hydrogen produced to the COD reduction. The statistical analysis of the experiments indicated that the significant effects on the fermentation yield were the main effects of temperature, pH and inoculum ratio; the interaction effects between them were not significant. The central composite design showed that the polynomial regression models were in good agreement with the experimental results. This result will be applied to enhance the treatment of cassava starch processing wastewater.
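Response surface methodology of this kind fits a full second-order polynomial in the design factors. The following sketch, using hypothetical runs over the stated factor ranges, shows how such a model can be fitted with statsmodels; the response values are simulated, not the paper's data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical central-composite-style runs over temperature (deg C),
# pH and inoculum ratio (%); the response is hydrogen yield.
rng = np.random.default_rng(5)
T = rng.uniform(37, 43, 20)
pH = rng.uniform(6, 7, 20)
inoc = rng.uniform(6, 10, 20)
y = (50 - 0.8 * (T - 40) ** 2 - 15 * (pH - 6.5) ** 2
     - 0.5 * (inoc - 8) ** 2 + rng.normal(0, 1, 20))

df = pd.DataFrame({"T": T, "pH": pH, "inoc": inoc, "y": y})

# Full second-order polynomial: main effects, two-way interactions,
# and pure quadratic terms, as in a standard RSM analysis.
model = smf.ols("y ~ T + pH + inoc + T:pH + T:inoc + pH:inoc"
                " + I(T**2) + I(pH**2) + I(inoc**2)", data=df).fit()
print(model.summary().tables[1])
```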
NASA Astrophysics Data System (ADS)
Li, Jun-Wei; Cao, Jun-Wei
2010-04-01
One challenge in large-scale scientific data analysis is to monitor data in real time in a distributed environment. For the LIGO (Laser Interferometer Gravitational-wave Observatory) project, a dedicated suite of data monitoring tools (DMT) has been developed, yielding good extensibility to new data types and high flexibility in a distributed environment. Several services are provided, including visualization of data information in various forms and file output of monitoring results. In this work, a DMT monitor, OmegaMon, is developed for tracking statistics of gravitational-wave (GW) burst triggers that are generated from a specific GW burst data analysis pipeline, the Omega Pipeline. Such results can provide diagnostic information as a reference for trigger post-processing and interferometer maintenance.
Superstatistics analysis of the ion current distribution function: Met3PbCl influence study.
Miśkiewicz, Janusz; Trela, Zenon; Przestalski, Stanisław; Karcz, Waldemar
2010-09-01
A novel analysis of ion current time series is proposed. It is shown that higher (second, third and fourth) statistical moments of the ion current probability distribution function (PDF) can yield new information about ion channel properties. The method is illustrated on a two-state model where the PDFs of the component states are given by normal distributions. The proposed method was applied to the analysis of the SV cation channels of the vacuolar membrane of Beta vulgaris and the influence of trimethyllead chloride (Met(3)PbCl) on the ion current probability distribution. Ion currents were measured by the patch-clamp technique. It was shown that Met(3)PbCl influences the variance of the open-state ion current but does not alter the PDF of the closed-state ion current. Incorporation of higher statistical moments into the standard investigation of ion channel properties is proposed.
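The core computation, higher moments of the recorded current's distribution, is straightforward; the sketch below simulates a two-state current and reports variance, skewness, and excess kurtosis with scipy. The state means, spreads, and dwell proportions are invented.

```python
import numpy as np
from scipy import stats

# Two-state ion-current model: closed- and open-state currents are each
# normally distributed; the recorded trace mixes the two states.
rng = np.random.default_rng(6)
closed = rng.normal(0.0, 0.2, 7000)    # closed-state current (pA)
open_ = rng.normal(5.0, 0.6, 3000)     # open-state current (pA)
trace = np.concatenate([closed, open_])

# Higher statistical moments of the full-current PDF carry information
# beyond the mean and variance usually reported for ion channels.
print(f"variance = {np.var(trace):.3f}")
print(f"skewness = {stats.skew(trace):.3f}")
print(f"kurtosis = {stats.kurtosis(trace):.3f}")  # excess kurtosis
```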
Landsat analysis for uranium exploration in Northeast Turkey
Lee, Keenan
1983-01-01
No uranium deposits are known in the Trabzon, Turkey region, and consequently, exploration criteria have not been defined. Nonetheless, by analogy with uranium deposits studied elsewhere, exploration guides are suggested to include dense concentrations of linear features, lineaments -- especially with northwest trend, acidic plutonic rocks, and alteration indicated by limonite. A suite of digitally processed images of a single Landsat scene served as the image base for mapping 3,376 linear features. Analysis of the linear feature data yielded two statistically significant trends, which in turn defined two sets of strong lineaments. Color composite images were used to map acidic plutonic rocks and areas of surficial limonitic materials. The Landsat interpretation yielded a map of these exploration guides that may be used to evaluate relative uranium potential. One area in particular shows a high coincidence of favorable indicators.
ERIC Educational Resources Information Center
Martuza, Victor R.; Engel, John D.
Results from classical power analysis (Brewer, 1972) suggest that a researcher should not set α = p (when p is less than α) in a posteriori fashion when a study yields statistically significant results, because of a resulting decrease in power. The purpose of the present report is to use Bayesian theory in examining the validity of this…
SHAREv2: fluctuations and a comprehensive treatment of decay feed-down
NASA Astrophysics Data System (ADS)
Torrieri, G.; Jeon, S.; Letessier, J.; Rafelski, J.
2006-11-01
This is the user's manual for SHARE version 2. SHARE [G. Torrieri, S. Steinke, W. Broniowski, W. Florkowski, J. Letessier, J. Rafelski, Comput. Phys. Comm. 167 (2005) 229] (Statistical Hadronization with Resonances) is a collection of programs designed for the statistical analysis of particle production in relativistic heavy-ion collisions. While the structure of the program remains similar to v1.x, v2 provides several new features, such as evaluation of statistical fluctuations of particle yields, and greater versatility, in particular regarding decay feed-down and input/output structure. This article describes all the new features, with emphasis on statistical fluctuations. Program summary — Title of program: SHAREv2. Catalogue identifier: ADVD_v2_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADVD_v2_0. Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland. Computer: PC, Pentium III, 512 MB RAM; not hardware dependent. Operating system: Linux (RedHat 6.1, 7.2, FEDORA, etc.); not system dependent. Programming language: FORTRAN77. Size of the package: 167 KB directory, without libraries (see http://wwwasdoc.web.cern.ch/wwwasdoc/minuit/minmain.html, http://wwwasd.web.cern.ch/wwwasd/cernlib.html for details on library requirements). Number of lines in distributed program, including test data, etc.: 26 101. Number of bytes in distributed program, including test data, etc.: 170 346. Distribution format: tar.gzip file. Computer: Any computer with an f77 compiler. Nature of the physical problem: Event-by-event fluctuations have been recognized as the physical observable capable of constraining particle production models. Therefore, consideration of event-by-event fluctuations is required for a decisive falsification or constraining of (variants of) particle production models based on (grand-, micro-) canonical statistical mechanics phase space, the so-called statistical hadronization models (SHM). As in the case of particle yields, to properly compare model calculations to data it is necessary to consistently take into account resonance decays. However, event-by-event fluctuations are more sensitive than particle yields to experimental acceptance issues, and a range of techniques needs to be implemented to extract 'physical' fluctuations from an experimental event-by-event measurement. Method of solving the problem: The techniques used within the SHARE suite of programs [G. Torrieri, S. Steinke, W. Broniowski, W. Florkowski, J. Letessier, J. Rafelski, Comput. Phys. Comm. 167 (2005) 229; SHAREv1] are updated and extended to fluctuations. A full particle data-table, decay tree, and set of experimental feed-down coefficients are provided. Unlike SHAREv1.x, experimental acceptance feed-down coefficients can be entered for any resonance decay. SHAREv2 can calculate yields, fluctuations, and bulk properties of the fireball from provided thermal parameters; alternatively, parameters can be obtained from fits to experimental data, via the MINUIT fitting algorithm [F. James, M. Roos, Comput. Phys. Comm. 10 (1975) 343]. Fits can also be analyzed for significance, parameter and data point sensitivity. Averages and fluctuations at freeze-out of both the stable particles and the hadronic resonances are set according to a statistical prescription, calculated via a series of Bessel functions, using CERN library programs. We also have the option of including finite particle widths of the resonances.
A χ² minimization algorithm, also from the CERN library programs, is used to perform and analyze the fit. Please see SHAREv1 for more details on these. Purpose: The vast amount of high-quality soft hadron production data, from experiments running at the SPS and RHIC, in the past at the AGS, and in the near future at the LHC, offers the opportunity for statistical particle production model falsification. This task has turned out to be difficult when considering solely particle yields, addressed in the context of SHAREv1.x. For this reason physical conditions at freeze-out remain contested. Inclusion in the analysis of event-by-event fluctuations appears to resolve this issue. Similarly, a thorough analysis including both fluctuations and average multiplicities gives a way to explore the presence and strength of interactions following hadronization (when hadrons form), ending with thermal freeze-out (when all interactions cease). SHAREv2 with fluctuations will also help determine which statistical ensemble (if any), e.g., canonical or grand-canonical, is more physically appropriate for analyzing a given system. Together with resonances, fluctuations can also be used for a direct estimate of the extent to which the system re-interacts between chemical and thermal freeze-out. We hope and expect that SHAREv2 will contribute to deciding whether any of the statistical hadronization model variants has a genuine physical connection to hadron particle production. Computation time survey: In the FORTRAN version, we encounter computation times of up to seconds for evaluation of particle yields. These rise by up to a factor of 300 in the process of minimization, and by a further factor of a few when χ²/N profiles and contours with chemical non-equilibrium are requested. Summary of new features (w.r.t. SHAREv1.x) — Fluctuations: In addition to particle yields, ratios and bulk quantities, SHAREv2 can calculate, fit and analyze statistical fluctuations of particles and particle ratios. Decays: SHAREv2 has the flexibility to account for any experimental method of allowing for decay feed-downs to the particle yields. Charm flavor: Charmed particles have been added to the decay tree, allowing as an option the study of statistical hadronization of J/ψ, χ, D, etc. Quark chemistry: Chemical non-equilibrium yields for both u and d flavors, as opposed to generically light quarks q, are considered; η–η′ mixing, etc., are properly dealt with, and chemical non-equilibrium can be studied for each flavor separately. Misc: Many new commands and features have been introduced and added to the basic user interface; for example, it is possible to study combinations of particles and their ratios, and to combine all the input files into one file. SHARE compatibility and manual: This write-up is an update and extension of SHAREv1. The user should consult SHAREv1 regarding the principles of the user interface and for all particle-yield-related physics and program instructions, other than the parameter additions and minor changes described here. SHAREv2 is downward compatible with respect to the changes of the user interface, offering the user of SHAREv1 computer-generated revised input files compatible with SHAREv2.
A proposed metabolic strategy for monitoring disease progression in Alzheimer's disease.
Greenberg, Nicola; Grassano, Antonio; Thambisetty, Madhav; Lovestone, Simon; Legido-Quigley, Cristina
2009-04-01
A specific, sensitive and essentially non-invasive assay to diagnose and monitor Alzheimer's disease (AD) would be valuable to both clinicians and medical researchers. The aim of this study was to perform a metabonomic statistical analysis on plasma fingerprints. The objectives were to investigate novel biomarkers indicative of AD, to consider the role of bile acids as AD biomarkers, and to consider whether mild cognitive impairment (MCI) is a separate disease from AD. Samples were analysed by ultraperformance liquid chromatography-MS and the resulting data sets were interpreted using soft independent modelling of class analogy statistical analysis methods. PCA models did not show any grouping of subjects by disease state. Partial least-squares discriminant analysis (PLS-DA) models yielded class separation for AD. However, as with earlier studies, model validation revealed a predictive power of Q² < 0.5, indicating their unsuitability for predicting disease state. Three bile acids were extracted from the data and quantified; up-regulation was observed in MCI and AD patients. PLS-DA did not support MCI being considered a separate disease from AD, with MCI patients' metabolic profiles being significantly closer to those of AD patients than to controls. This study suggests that further investigation into the lipid fraction of the metabolome may yield useful biomarkers for AD, and that metabolomic profiles could be used to predict disease state in a clinical setting.
Bai, Xue; Zheng, Zhuqing; Liu, Bin; Ji, Xiaoyang; Bai, Yongsheng; Zhang, Wenguang
2016-08-22
The objective of this research was to investigate the variation of gene expression in the blood transcriptome profile of Chinese Holstein cows associated with milk yield traits. We used RNA-seq to generate the bovine transcriptome from the blood of 23 lactating Chinese Holstein cows with extremely high and low milk yields. A total of 100 differentially expressed genes (DEGs) (p < 0.05, FDR < 0.05) were revealed between the high and low groups. Gene ontology (GO) analysis demonstrated that the 100 DEGs were enriched in specific biological processes with regard to defense response, immune response, inflammatory response, icosanoid metabolic process, and fatty acid metabolic process (p < 0.05). KEGG pathway analysis of the 100 DEGs revealed that the most statistically significant metabolic pathway was the Toll-like receptor signaling pathway (p < 0.05). The expression levels of four selected DEGs were analyzed by qRT-PCR, and the results indicated that the expression patterns were consistent with the deep sequencing results from RNA-Seq. Furthermore, alternative splicing analysis of the 100 DEGs demonstrated different splicing patterns between high and low yielders: the alternative 3' splice site was the major splicing pattern detected in high yielders, whereas in low yielders the major type was exon skipping. This study provides a non-invasive method to identify DEGs in cattle blood using RNA-seq for milk yield. The 100 DEGs revealed between Holstein cows with extremely high and low milk yields, and the immunological pathways involved, are likely related to the milk yield trait. Finally, this study allowed us to explore associations between immune traits and production traits related to milk production.
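DEG calls of the form "p < 0.05, FDR < 0.05" typically rest on Benjamini-Hochberg adjustment of per-gene p-values. A minimal sketch with statsmodels, on fabricated p-values rather than the study's data, is shown below.

```python
import numpy as np
from statsmodels.stats.multitest import multipletests

# Hypothetical per-gene p-values from a differential-expression test
# (most genes null, a few truly differential).
rng = np.random.default_rng(7)
pvals = np.concatenate([rng.uniform(0, 1, 950),        # null genes
                        rng.uniform(0, 1e-4, 50)])     # true DEGs

# Benjamini-Hochberg control of the false discovery rate at 5%,
# the usual criterion behind "p < 0.05, FDR < 0.05" DEG lists.
reject, qvals, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
print(f"genes called differentially expressed: {reject.sum()}")
```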
Guidelines for Genome-Scale Analysis of Biological Rhythms.
Hughes, Michael E; Abruzzi, Katherine C; Allada, Ravi; Anafi, Ron; Arpat, Alaaddin Bulak; Asher, Gad; Baldi, Pierre; de Bekker, Charissa; Bell-Pedersen, Deborah; Blau, Justin; Brown, Steve; Ceriani, M Fernanda; Chen, Zheng; Chiu, Joanna C; Cox, Juergen; Crowell, Alexander M; DeBruyne, Jason P; Dijk, Derk-Jan; DiTacchio, Luciano; Doyle, Francis J; Duffield, Giles E; Dunlap, Jay C; Eckel-Mahan, Kristin; Esser, Karyn A; FitzGerald, Garret A; Forger, Daniel B; Francey, Lauren J; Fu, Ying-Hui; Gachon, Frédéric; Gatfield, David; de Goede, Paul; Golden, Susan S; Green, Carla; Harer, John; Harmer, Stacey; Haspel, Jeff; Hastings, Michael H; Herzel, Hanspeter; Herzog, Erik D; Hoffmann, Christy; Hong, Christian; Hughey, Jacob J; Hurley, Jennifer M; de la Iglesia, Horacio O; Johnson, Carl; Kay, Steve A; Koike, Nobuya; Kornacker, Karl; Kramer, Achim; Lamia, Katja; Leise, Tanya; Lewis, Scott A; Li, Jiajia; Li, Xiaodong; Liu, Andrew C; Loros, Jennifer J; Martino, Tami A; Menet, Jerome S; Merrow, Martha; Millar, Andrew J; Mockler, Todd; Naef, Felix; Nagoshi, Emi; Nitabach, Michael N; Olmedo, Maria; Nusinow, Dmitri A; Ptáček, Louis J; Rand, David; Reddy, Akhilesh B; Robles, Maria S; Roenneberg, Till; Rosbash, Michael; Ruben, Marc D; Rund, Samuel S C; Sancar, Aziz; Sassone-Corsi, Paolo; Sehgal, Amita; Sherrill-Mix, Scott; Skene, Debra J; Storch, Kai-Florian; Takahashi, Joseph S; Ueda, Hiroki R; Wang, Han; Weitz, Charles; Westermark, Pål O; Wijnen, Herman; Xu, Ying; Wu, Gang; Yoo, Seung-Hee; Young, Michael; Zhang, Eric Erquan; Zielinski, Tomasz; Hogenesch, John B
2017-10-01
Genome biology approaches have made enormous contributions to our understanding of biological rhythms, particularly in identifying outputs of the clock, including RNAs, proteins, and metabolites, whose abundance oscillates throughout the day. These methods hold significant promise for future discovery, particularly when combined with computational modeling. However, genome-scale experiments are costly and laborious, yielding "big data" that are conceptually and statistically difficult to analyze. There is no obvious consensus regarding design or analysis. Here we discuss the relevant technical considerations to generate reproducible, statistically sound, and broadly useful genome-scale data. Rather than suggest a set of rigid rules, we aim to codify principles by which investigators, reviewers, and readers of the primary literature can evaluate the suitability of different experimental designs for measuring different aspects of biological rhythms. We introduce CircaInSilico, a web-based application for generating synthetic genome biology data to benchmark statistical methods for studying biological rhythms. Finally, we discuss several unmet analytical needs, including applications to clinical medicine, and suggest productive avenues to address them.
A Flexible Approach for the Statistical Visualization of Ensemble Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Potter, K.; Wilson, A.; Bremer, P.
2009-09-29
Scientists are increasingly moving towards ensemble data sets to explore relationships present in dynamic systems. Ensemble data sets combine spatio-temporal simulation results generated using multiple numerical models, sampled input conditions and perturbed parameters. While ensemble data sets are a powerful tool for mitigating uncertainty, they pose significant visualization and analysis challenges due to their complexity. We present a collection of overview and statistical displays linked through a high level of interactivity to provide a framework for gaining key scientific insight into the distribution of the simulation results as well as the uncertainty associated with the data. In contrast to methods that present large amounts of diverse information in a single display, we argue that combining multiple linked statistical displays yields a clearer presentation of the data and facilitates a greater level of visual data analysis. We demonstrate this approach using driving problems from climate modeling and meteorology and discuss generalizations to other fields.
Payne, Courtney E; Wolfrum, Edward J
2015-01-01
Obtaining accurate chemical composition and reactivity (measures of carbohydrate release and yield) information for biomass feedstocks in a timely manner is necessary for the commercialization of biofuels. Our objective was to use near-infrared (NIR) spectroscopy and partial least squares (PLS) multivariate analysis to develop calibration models to predict the feedstock composition and the release and yield of soluble carbohydrates generated by a bench-scale dilute acid pretreatment and enzymatic hydrolysis assay. Major feedstocks included in the calibration models are corn stover, sorghum, switchgrass, perennial cool season grasses, rice straw, and miscanthus. We present individual model statistics to demonstrate model performance and validation samples to more accurately measure predictive quality of the models. The PLS-2 model for composition predicts glucan, xylan, lignin, and ash (wt%) with uncertainties similar to primary measurement methods. A PLS-2 model was developed to predict glucose and xylose release following pretreatment and enzymatic hydrolysis. An additional PLS-2 model was developed to predict glucan and xylan yield. PLS-1 models were developed to predict the sum of glucose/glucan and xylose/xylan for release and yield (grams per gram). The release and yield models have higher uncertainties than the primary methods used to develop the models. It is possible to build effective multispecies feedstock models for composition, as well as carbohydrate release and yield. The model for composition is useful for predicting glucan, xylan, lignin, and ash with good uncertainties. The release and yield models have higher uncertainties; however, these models are useful for rapidly screening sample populations to identify unusual samples.
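A PLS calibration of the type described projects high-dimensional, collinear spectra onto a few latent components that covary with the reference measurements. The sketch below does this with scikit-learn on synthetic stand-in spectra; the dimensions, component count, and response are assumptions.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

# Synthetic stand-in for NIR data: 120 samples x 200 wavelengths, with
# composition (e.g., glucan wt%) driven by a few latent spectral factors.
rng = np.random.default_rng(8)
latent = rng.normal(size=(120, 3))
spectra = latent @ rng.normal(size=(3, 200)) + 0.05 * rng.normal(size=(120, 200))
glucan = 35 + 4 * latent[:, 0] - 2 * latent[:, 1] + rng.normal(0, 0.5, 120)

X_tr, X_te, y_tr, y_te = train_test_split(spectra, glucan, random_state=0)

# PLS finds latent components that covary with the response, which is why
# it suits highly collinear spectroscopic predictors.
pls = PLSRegression(n_components=3).fit(X_tr, y_tr)
print(f"validation R^2 = {pls.score(X_te, y_te):.3f}")
```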
NASA Astrophysics Data System (ADS)
Lin, Shu; Wang, Rui; Xia, Ning; Li, Yongdong; Liu, Chunliang
2018-01-01
Statistical multipactor theories are critical prediction approaches for multipactor breakdown determination. However, these approaches still require a trade-off between calculation efficiency and accuracy. This paper presents an improved stationary statistical theory for efficient threshold analysis of two-surface multipactor. A general integral equation over the distribution function of the electron emission phase, with both single-sided and double-sided impacts considered, is formulated. The modeling results indicate that the improved stationary statistical theory not only attains the same accuracy of multipactor threshold calculation as the nonstationary statistical theory, but also achieves high calculation efficiency. Using this improved stationary statistical theory, the total time consumed in calculating the full multipactor susceptibility zones of parallel plates can be decreased by as much as a factor of four relative to the nonstationary statistical theory. The results also show that the effect of single-sided impacts is indispensable for accurate multipactor prediction in coaxial lines, and is more significant for higher-order multipactor. Finally, the influence of secondary emission yield (SEY) properties on the multipactor threshold is investigated. It is observed that the first crossover energy and the energy range between the first crossover and the SEY maximum both play a significant role in determining the multipactor threshold, which agrees with numerical simulation results in the literature.
DOE Office of Scientific and Technical Information (OSTI.GOV)
del Amo Sanchez, P.; Lees, J.P.; Poireau, V.
2011-08-19
We report the measurement of the Cabibbo-Kobayashi-Maskawa CP-violating angle γ through a Dalitz plot analysis of neutral D meson decays to K0S π+π− and K0S K+K− produced in the processes B∓ → DK∓, B∓ → D*K∓ with D* → Dπ0, Dγ, and B∓ → DK*∓ with K*∓ → K0S π∓, using 468 million BB̄ pairs collected by the BABAR detector at the PEP-II asymmetric-energy e+e− collider at SLAC. We measure γ = (68 ± 14 ± 4 ± 3)° (modulo 180°), where the first error is statistical, the second is the experimental systematic uncertainty and the third reflects the uncertainty in the description of the neutral D decay amplitudes. This result is inconsistent with γ = 0 (no direct CP violation) with a significance of 3.5 standard deviations.
Lizana, Carolina; Wentworth, Mark; Martinez, Juan P; Villegas, Daniel; Meneses, Rodrigo; Murchie, Erik H; Pastenes, Claudio; Lercari, Bartolomeo; Vernieri, Paulo; Horton, Peter; Pinto, Manuel
2006-01-01
The yield of 24 commercial varieties and accessions of common bean (Phaseolus vulgaris) has been determined at different sites in Chile and Bolivia. Statistical analysis was performed in order to characterize whether a particular variety was more or less stable in yield under different environmental conditions. Amongst these, two varieties have been identified for more detailed study: one variety has a higher than average yield under unstressed conditions but is strongly affected by stress, and another has a reduced yield under unstressed conditions but is less affected by stress. The contrasting rate of abscission of the reproductive organs under drought stress was clearly consistent with these differences. The more tolerant genotype shows a great deal of plasticity at the biochemical and cellular level when exposed to drought stress, in terms of stomatal conductance, photosynthetic rate, abscisic acid synthesis, and resistance to photoinhibition. By contrast, the former lacks such plasticity, but shows an enhanced tendency for a morphological response, the movement of leaves, which appears to be its principal response to drought stress.
NASA Astrophysics Data System (ADS)
Arunachalam, M. S.; Obili, Manjula; Srimurali, M.
2016-07-01
The long-term variation of surface ozone, NO2, temperature, relative humidity and crop yield over thirteen districts of Andhra Pradesh (AP) has been studied with the help of OMI, MODIS, AIRS, ERA-Interim re-analysis and Directorate of Economics and Statistics (DES) of AP datasets. An inter-comparison of crop yield loss estimates was made according to exposure metrics such as AOT40 (accumulated ozone exposure over a threshold of 40 ppb) and the non-linear variation of surface temperature, for twenty and eighteen crop varieties of the two major growing seasons, kharif (April-September) and rabi (October-March), respectively. The study was carried out to establish a new crop-yield-exposure relationship for different crop cultivars of AP. Ozone and temperature show correlation coefficients of 0.66 and 0.87 with relative humidity, and 0.72 and 0.80 with NO2. Alleviation of high surface ozone results in greater food security, improves the economy, and thereby reduces the induced warming of the troposphere caused by ozone.
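AOT40 has a simple operational definition: the sum over daylight growing-season hours of the hourly ozone concentration in excess of 40 ppb. A short sketch, using a fabricated hourly ozone series, is given below.

```python
import numpy as np

def aot40(hourly_o3_ppb, daylight_mask):
    """AOT40: sum of hourly ozone exceedances above 40 ppb,
    accumulated over daylight hours of the growing season (ppb h)."""
    excess = np.clip(hourly_o3_ppb - 40.0, 0.0, None)
    return float(np.sum(excess[daylight_mask]))

# One synthetic growing-season month of hourly ozone (ppb) with a
# simple diurnal cycle plus noise.
rng = np.random.default_rng(9)
hours = 24 * 30
o3 = (35 + 15 * np.sin(np.linspace(0, 30 * 2 * np.pi, hours))
      + rng.normal(0, 5, hours))
daylight = (np.arange(hours) % 24 >= 8) & (np.arange(hours) % 24 < 20)

print(f"AOT40 = {aot40(o3, daylight):.0f} ppb h")
```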
Performance of Blind Source Separation Algorithms for FMRI Analysis using a Group ICA Method
Correa, Nicolle; Adali, Tülay; Calhoun, Vince D.
2007-01-01
Independent component analysis (ICA) is a popular blind source separation (BSS) technique that has proven to be promising for the analysis of functional magnetic resonance imaging (fMRI) data. A number of ICA approaches have been used for fMRI data analysis, and even more ICA algorithms exist; however, the impact of using different algorithms on the results is largely unexplored. In this paper, we study the performance of four major classes of algorithms for spatial ICA, namely information maximization, maximization of non-Gaussianity, joint diagonalization of cross-cumulant matrices, and second-order correlation based methods, when they are applied to fMRI data from subjects performing a visuo-motor task. We use a group ICA method to study the variability among different ICA algorithms and propose several analysis techniques to evaluate their performance. We compare how different ICA algorithms estimate activations in expected neuronal areas. The results demonstrate that the ICA algorithms using higher-order statistical information prove to be quite consistent for fMRI data analysis. Infomax, FastICA, and JADE all yield reliable results, each having its strengths in specific areas. EVD, an algorithm using second-order statistics, does not perform reliably for fMRI data. Additionally, for the iterative ICA algorithms, it is important to investigate the variability of the estimates from different runs. We test the consistency of the iterative algorithms, Infomax and FastICA, by running the algorithm a number of times with different initializations and note that they yield consistent results over these multiple runs. Our results greatly improve our confidence in the consistency of ICA for fMRI data analysis. PMID:17540281
Performance of blind source separation algorithms for fMRI analysis using a group ICA method.
Correa, Nicolle; Adali, Tülay; Calhoun, Vince D
2007-06-01
Independent component analysis (ICA) is a popular blind source separation technique that has proven to be promising for the analysis of functional magnetic resonance imaging (fMRI) data. A number of ICA approaches have been used for fMRI data analysis, and even more ICA algorithms exist; however, the impact of using different algorithms on the results is largely unexplored. In this paper, we study the performance of four major classes of algorithms for spatial ICA, namely, information maximization, maximization of non-Gaussianity, joint diagonalization of cross-cumulant matrices and second-order correlation-based methods, when they are applied to fMRI data from subjects performing a visuo-motor task. We use a group ICA method to study variability among different ICA algorithms, and we propose several analysis techniques to evaluate their performance. We compare how different ICA algorithms estimate activations in expected neuronal areas. The results demonstrate that the ICA algorithms using higher-order statistical information prove to be quite consistent for fMRI data analysis. Infomax, FastICA and joint approximate diagonalization of eigenmatrices (JADE) all yield reliable results, with each having its strengths in specific areas. Eigenvalue decomposition (EVD), an algorithm using second-order statistics, does not perform reliably for fMRI data. Additionally, for iterative ICA algorithms, it is important to investigate the variability of estimates from different runs. We test the consistency of the iterative algorithms Infomax and FastICA by running the algorithm a number of times with different initializations, and we note that they yield consistent results over these multiple runs. Our results greatly improve our confidence in the consistency of ICA for fMRI data analysis.
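The consistency check described, rerunning an iterative algorithm from different initializations, is easy to reproduce in outline. The sketch below applies scikit-learn's FastICA to synthetic mixtures with several seeds; the sources and mixing matrix are invented and the fMRI-specific preprocessing is omitted.

```python
import numpy as np
from sklearn.decomposition import FastICA

# Mix three synthetic source signals (stand-ins for spatial components)
# with a random mixing matrix, as in the standard BSS setup.
rng = np.random.default_rng(10)
t = np.linspace(0, 8, 2000)
sources = np.stack([np.sin(2 * t),
                    np.sign(np.sin(3 * t)),
                    rng.laplace(size=t.size)], axis=1)
mixed = sources @ rng.normal(size=(3, 3)).T

# FastICA recovers independent components by maximizing non-Gaussianity;
# running with different seeds probes the run-to-run consistency
# discussed in the paper.
for seed in (0, 1, 2):
    ica = FastICA(n_components=3, random_state=seed, max_iter=1000)
    est = ica.fit_transform(mixed)
    # Correlate each estimated component with its best-matching source.
    corr = np.abs(np.corrcoef(est.T, sources.T)[:3, 3:]).max(axis=1)
    print(f"seed {seed}: best-match |correlations| = {np.round(corr, 3)}")
```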
NASA Astrophysics Data System (ADS)
Egiyan, H.; Langheinrich, J.; Gothe, R. W.; Graham, L.; Holtrop, M.; Lu, H.; Mattione, P.; Mutchler, G.; Park, K.; Smith, E. S.; Stepanyan, S.; Zhao, Z. W.; Adhikari, K. P.; Aghasyan, M.; Anghinolfi, M.; Baghdasaryan, H.; Ball, J.; Baltzell, N. A.; Battaglieri, M.; Bedlinskiy, I.; Bennett, R. P.; Biselli, A. S.; Bookwalter, C.; Branford, D.; Briscoe, W. J.; Brooks, W. K.; Burkert, V. D.; Carman, D. S.; Celentano, A.; Chandavar, S.; Contalbrigo, M.; D'Angelo, A.; Daniel, A.; Dashyan, N.; de Vita, R.; de Sanctis, E.; Deur, A.; Dey, B.; Dickson, R.; Djalali, C.; Doughty, D.; Dupre, R.; El Alaoui, A.; El Fassi, L.; Eugenio, P.; Fedotov, G.; Fegan, S.; Fradi, A.; Gabrielyan, M. Y.; Gevorgyan, N.; Gilfoyle, G. P.; Giovanetti, K. L.; Girod, F. X.; Goetz, J. T.; Gohn, W.; Golovatch, E.; Griffioen, K. A.; Guidal, M.; Guler, N.; Guo, L.; Gyurjyan, V.; Hafidi, K.; Hakobyan, H.; Hanretty, C.; Heddle, D.; Hicks, K.; Ilieva, Y.; Ireland, D. G.; Ishkhanov, B. S.; Jo, H. S.; Joo, K.; Khetarpal, P.; Kim, A.; Kim, W.; Klein, A.; Klein, F. J.; Kubarovsky, V.; Kuleshov, S. V.; Livingston, K.; MacGregor, I. J. D.; Mao, Y.; Mayer, M.; McKinnon, B.; Mokeev, V.; Munevar, E.; Nadel-Turonski, P.; Ni, A.; Niculescu, G.; Ostrovidov, A. I.; Paolone, M.; Pappalardo, L.; Paremuzyan, R.; Park, S.; Pasyuk, E.; Anefalos Pereira, S.; Phelps, E.; Pogorelko, O.; Pozdniakov, S.; Price, J. W.; Procureur, S.; Protopopescu, D.; Raue, B. A.; Ricco, G.; Rimal, D.; Ripani, M.; Ritchie, B. G.; Rosner, G.; Rossi, P.; Sabatié, F.; Saini, M. S.; Salgado, C.; Schott, D.; Schumacher, R. A.; Seder, E.; Seraydaryan, H.; Sharabian, Y. G.; Smith, G. D.; Sober, D. I.; Stepanyan, S. S.; Strauch, S.; Taiuti, M.; Tang, W.; Taylor, C. E.; Tedeschi, D. J.; Ungaro, M.; Voutier, E.; Watts, D. P.; Weinstein, L. B.; Weygand, D. P.; Wood, M. H.; Zachariou, N.; Zana, L.; Zhao, B.
2012-01-01
We searched for the Φ⁻⁻(1860) pentaquark in the photoproduction process off the deuteron in the Ξ⁻π⁻ decay channel using CLAS. The invariant-mass spectrum of the Ξ⁻π⁻ system does not indicate any statistically significant enhancement near the reported mass M = 1.860 GeV. The statistical analysis of the sideband-subtracted mass spectrum yields a 90%-confidence-level upper limit of 0.7 nb for the photoproduction cross section of Φ⁻⁻(1860), with a consecutive decay into Ξ⁻π⁻, in the photon-energy range 4.5 GeV
Wirz, Stefan; Klaschik, Eberhard
2005-01-01
This study assessed the efficacy of laxative use for the treatment of constipation in patients receiving opioid therapy, with special attention to polyethylene glycol 3350/electrolyte solution (PEG-ES). Computerized data from 206 patients were analyzed using descriptive statistics. Subgroups were analyzed using confirmatory statistics. Constipation occurred in 42.7 percent of patients. Laxatives were administered to 74.3 percent of these patients using a standardized step scheme, with good results in 78.4 percent. As a therapy for constipation, the combined administration of PEG-ES, sodium picosulphate, and liquid paraffin proved most effective, although the difference did not reach statistical significance. Early use of PEG-ES within a step scheme holds promise for the treatment of opioid-related constipation in palliative care patients, although further investigation is warranted.
NASA Astrophysics Data System (ADS)
Martucci, G.; Carniel, S.; Chiggiato, J.; Sclavo, M.; Lionello, P.; Galati, M. B.
2010-06-01
The study is a statistical analysis of sea-state time series derived using the wave model WAM forced by the ERA-40 dataset in selected areas near the Italian coasts. For the period 1 January 1958 to 31 December 1999 the analysis yields: (i) a negative trend in the annual- and winter-averaged sea-state heights; (ii) a turning point in the late 1980s in the annual-averaged trend of sea-state heights at a site in the Northern Adriatic Sea; (iii) the overall absence of a significant trend in the annual-averaged mean durations of sea states over thresholds; (iv) an assessment of the extreme values on a time scale of a thousand years. The analysis uses two methods to obtain samples of extremes from the independent sea states: the r-largest annual maxima and the peak-over-threshold. The two methods show statistical differences in retrieving the return values and, more generally, in describing the significant wave field. The r-largest annual maxima method provides more reliable predictions of the extreme values, especially for small return periods (<100 years). Finally, the study statistically proves the existence of decadal negative trends in the significant wave heights, thereby conveying useful information on the wave climatology of the Italian seas during the second half of the 20th century.
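The peak-over-threshold method referenced here models exceedances above a high threshold with a generalized Pareto distribution and converts the fit into N-year return levels. A minimal sketch on a synthetic wave-height record follows; the threshold choice and record length are assumptions, and declustering of dependent sea states is omitted.

```python
import numpy as np
from scipy.stats import genpareto

# Synthetic significant-wave-height record (m), 6-hourly for ~40 years.
rng = np.random.default_rng(11)
hs = rng.gamma(shape=2.0, scale=0.8, size=40 * 365 * 4)

# Peak-over-threshold: model exceedances above a high threshold with a
# generalized Pareto distribution.
u = np.quantile(hs, 0.99)
exceed = hs[hs > u] - u
xi, loc, sigma = genpareto.fit(exceed, floc=0.0)

# N-year return level: the value exceeded on average once in N years.
events_per_year = exceed.size / 40.0
for n_years in (50, 100, 1000):
    m = n_years * events_per_year   # expected exceedances in N years
    level = u + genpareto.ppf(1 - 1 / m, xi, loc=0.0, scale=sigma)
    print(f"{n_years:5d}-yr return level ~ {level:.2f} m")
```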
NASA Astrophysics Data System (ADS)
Andersson, C. David; Hillgren, J. Mikael; Lindgren, Cecilia; Qian, Weixing; Akfur, Christine; Berg, Lotta; Ekström, Fredrik; Linusson, Anna
2015-03-01
Scientific disciplines such as medicinal and environmental chemistry, pharmacology, and toxicology deal with questions related to the effects small organic compounds exert on biological targets and the compounds' physicochemical properties responsible for these effects. A common strategy in this endeavor is to establish structure-activity relationships (SARs). The aim of this work was to illustrate the benefits of performing a statistical molecular design (SMD) and a proper statistical analysis of the molecules' properties before SAR and quantitative structure-activity relationship (QSAR) analysis. Our SMD, followed by synthesis, yielded a set of inhibitors of the enzyme acetylcholinesterase (AChE) that had very few inherent dependencies between the substructures in the molecules. If such dependencies exist, they cause severe errors in SAR interpretation and in predictions by QSAR models, and leave a set of molecules less suitable for future decision-making. In our study, the SAR and QSAR models could show which molecular substructures and physicochemical features were advantageous for AChE inhibition. Finally, the QSAR model was used to predict the inhibition of AChE by an external prediction set of molecules. The accuracy of these predictions was assessed by statistical significance tests and by comparison with simple but relevant reference models.
Scenario analysis of fertilizer management practices for N2O mitigation from corn systems in Canada.
Abalos, Diego; Smith, Ward N; Grant, Brian B; Drury, Craig F; MacKell, Sarah; Wagner-Riddle, Claudia
2016-12-15
Effective management of nitrogen (N) fertilizer application by farmers provides great potential for reducing emissions of the potent greenhouse gas nitrous oxide (N2O). However, such potential is rarely achieved because our understanding of what practices (or combination of practices) lead to N2O reductions without compromising crop yields remains far from complete. Using scenario analysis with the process-based model DNDC, this study explored the effects of nine fertilizer practices on N2O emissions and crop yields from two corn production systems in Canada. The scenarios differed in: timing of fertilizer application, fertilizer rate, number of applications, fertilizer type, method of application and use of nitrification/urease inhibitors. Statistical analysis showed that during the initial calibration and validation stages the simulated results had no significant total error or bias compared to measured values, yet grain yield estimations warrant further model improvement. Sidedress fertilizer applications reduced yield-scaled N2O emissions by c. 60% compared to fall fertilization. Nitrification inhibitors further reduced yield-scaled N2O emissions by c. 10%; urease inhibitors had no effect on either N2O emissions or crop productivity. The combined adoption of split fertilizer application with inhibitors at a rate 10% lower than the conventional application rate (i.e. 150 kg N ha−1) was successful, but the benefits were lower than those achieved with a single fertilization at sidedress. Our study provides a comprehensive assessment of fertilizer management practices that enables policy development regarding N2O mitigation from agricultural soils in Canada. Copyright © 2016 Elsevier B.V. All rights reserved.
Strategy For Yield Control And Enhancement In VLSI Wafer Manufacturing
NASA Astrophysics Data System (ADS)
Neilson, B.; Rickey, D.; Bane, R. P.
1988-01-01
In most fully utilized integrated circuit (IC) production facilities, profit is very closely linked with yield. In even the most controlled manufacturing environments, defects due to foreign material are still a major contributor to yield loss. Ideally, an IC manufacturer will have ample engineering resources to address any problem that arises. In the real world, staffing limitations require that some tasks be left undone and potential benefits left unrealized. Therefore, it is important to prioritize problems in a manner that will give the maximum benefit to the manufacturer. When offered a smorgasbord of problems to solve, most people (engineers included) will start with what is most interesting or most comfortable to work on. By providing a system that accurately predicts the impact of a wide variety of defect types, a rational method of prioritizing engineering effort can be established. To that effect, a program was developed to determine and rank the major yield detractors in a mixed analog/digital FET manufacturing line. The two classical methods of determining yield detractors are chip failure analysis and defect monitoring on drop-in test die. Both of these methods have shortcomings: 1) chip failure analysis is painstaking and very time-consuming, so the sample size is very small; 2) drop-in test die are usually designed for device parametric analysis rather than defect analysis, and providing enough wafer real estate to do meaningful defect analysis would render the wafer worthless for production. To avoid these problems, a defect monitor was designed that provided enough area to detect defects at the same rate as, or better than, the NMOS product die whose yield was to be optimized. The defect monitor was comprehensive and electrically testable using such equipment as the Prometrix LM25 and other digital testers. This enabled the quick accumulation of data that could be handled statistically and mapped individually. By scaling the defect densities found on the monitors to the known sensitivities of the product wafer, the defect types were ranked by defect limiting yield. (Limiting yield is the resultant product yield if there were no failure mechanisms other than the type being considered.) These results were then compared to the product failure analysis results to verify that the monitor was finding the same types of defects, in the same proportion, that were troubling our product. Finally, the major defect types were isolated and reduced using the short-loop capability of the monitor.
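The parenthetical definition of limiting yield corresponds to the standard Poisson yield model, Y = exp(−D·A) for defect density D and critical area A. The sketch below ranks hypothetical defect types by limiting yield in this way; all densities and the die area are invented.

```python
import numpy as np

def limiting_yield(defect_density_per_cm2, die_area_cm2):
    """Poisson yield model: the product yield that would result if this
    defect type were the only failure mechanism."""
    return np.exp(-defect_density_per_cm2 * die_area_cm2)

# Hypothetical monitor results: density (defects/cm^2) per defect type,
# already scaled by the product die's sensitivity to each type.
die_area = 0.5  # cm^2, illustrative
defects = {"metal shorts": 0.8, "gate oxide pinholes": 0.3,
           "contact opens": 0.5, "particles (foreign material)": 1.2}

# Rank defect types by limiting yield: the lowest value is the defect
# type costing the most product, i.e. the best use of engineering effort.
ranked = sorted(defects.items(), key=lambda kv: limiting_yield(kv[1], die_area))
for name, d in ranked:
    print(f"{name:30s} limiting yield = {limiting_yield(d, die_area):.1%}")
```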
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lees, J. P.; Poireau, V.; Tisserand, V.
We report an analysis of charmless hadronic decays of charged B mesons to the final state K⁺π⁰π⁰, using a data sample of (470.9 ± 2.8)×10⁶ BB̄ events collected with the BABAR detector at the Υ(4S) resonance. We observe an excess of signal events, with a significance above 10 standard deviations including systematic uncertainties, and measure the branching fraction and CP asymmetry to be B(B⁺ → K⁺π⁰π⁰) = (16.2 ± 1.2 ± 1.5)×10⁻⁶ and A_CP(B⁺ → K⁺π⁰π⁰) = −0.06 ± 0.06 ± 0.04, where the uncertainties are statistical and systematic, respectively. Additionally, we study the contributions of the B⁺ → K*(892)⁺π⁰, B⁺ → f₀(980)K⁺, and B⁺ → χ_c0K⁺ quasi-two-body decays. We report the world's best measurements of the branching fraction and CP asymmetry of the B⁺ → K⁺π⁰π⁰ and B⁺ → K*(892)⁺π⁰ channels.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tan, Zeli; Leung, L. Ruby; Li, Hongyi
Although sediment yield (SY) from water erosion is ubiquitous and its environmental consequences are well recognized, its impacts on the global carbon cycle remain largely uncertain. This knowledge gap is partly due to the lack of soil erosion modeling in Earth System Models (ESMs), which are important tools used to understand the global carbon cycle and explore its changes. This study analyzed sediment and particulate organic carbon yield (CY) data from 1081 and 38 small catchments (0.1-200 km²), respectively, in different environments across the globe. Using multiple statistical analysis techniques, we explored environmental factors and hydrological processes important for SY and CY modeling in ESMs. Our results show clear correlations of high SY with traditional agriculture, seismicity and heavy storms, as well as strong correlations between SY and annual peak runoff. These highlight the potential limitation of SY models that represent only interrill and rill erosion, because shallow overland flow and rill flow have limited transport capacity, due to their hydraulic geometry, to produce high SY. Further, our results suggest that SY modeling in ESMs should be implemented at the event scale to capture the catastrophic mass transport during episodic events. Several environmental factors such as seismicity and land management that are often not considered in current catchment-scale SY models can be important in controlling global SY. Our analyses show that SY is likely the primary control on CY in small catchments, and a statistically significant empirical relationship is established to calculate SY and CY jointly in ESMs.
Statistical theory and methodology for remote sensing data analysis with special emphasis on LACIE
NASA Technical Reports Server (NTRS)
Odell, P. L.
1975-01-01
Crop proportion estimators for determining crop acreage through the use of remote sensing were evaluated. Several studies of these estimators were conducted, including an empirical comparison of the different estimators (using actual data) and an empirical study of the sensitivity (robustness) of the class of mixture estimators. The effect of missing data upon crop classification procedures is discussed in detail including a simulation of the missing data effect. The final problem addressed is that of taking yield data (bushels per acre) gathered at several yield stations and extrapolating these values over some specified large region. Computer programs developed in support of some of these activities are described.
Application of spatial Poisson process models to air mass thunderstorm rainfall
NASA Technical Reports Server (NTRS)
Eagleson, P. S.; Fennessy, N. M.; Wang, Qinliang; Rodriguez-Iturbe, I.
1987-01-01
Eight years of summer storm rainfall observations from 93 stations in and around the 154 sq km Walnut Gulch catchment of the Agricultural Research Service, U.S. Department of Agriculture, in Arizona are processed to yield the total station depths of 428 storms. Statistical analysis of these random fields yields the first two moments, the spatial correlation and variance functions, and the spatial distribution of total rainfall for each storm. The absolute and relative worth of three Poisson models are evaluated by comparing their prediction of the spatial distribution of storm rainfall with observations from the second half of the sample. The effect of interstorm parameter variation is examined.
Structure of turbulent non-premixed flames modeled with two-step chemistry
NASA Technical Reports Server (NTRS)
Chen, J. H.; Mahalingam, S.; Puri, I. K.; Vervisch, L.
1992-01-01
Direct numerical simulations of turbulent diffusion flames modeled with finite-rate, two-step chemistry, A + B → I, A + I → P, were carried out. A detailed analysis of the turbulent flame structure reveals the complex nature of the penetration of various reactive species across two reaction zones in mixture fraction space. Due to this two-zone structure, these flames were found to be robust, resisting extinction over the parameter ranges investigated. As in single-step computations, the mixture fraction dissipation rate and the mixture fraction were found to be statistically correlated. Simulations involving unequal molecular diffusivities suggest that the small-scale mixing process and, hence, the turbulent flame structure are sensitive to the Schmidt number.
Spatial analysis on future housing markets: economic development and housing implications.
Liu, Xin; Wang, Lizhe
2014-01-01
A coupled projection method combining formal modelling and other statistical techniques was developed to delineate the relationship between economic and social drivers for net new housing allocations. Using the example of employment growth in Tyne and Wear, UK, until 2016, the empirical analysis yields housing projections at the macro- and microspatial levels (e.g., region to subregion to elected ward levels). The results have important implications for the strategic planning of locations for housing and employment, demonstrating both intuitively and quantitatively how local economic developments affect housing demand.
Status and results from the OPERA experiment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ariga, Tomoko
2011-10-06
The OPERA experiment is aiming at the first direct detection of neutrino oscillations in appearance mode through the study of the ν_μ → ν_τ channel. The OPERA detector is placed in the CNGS long-baseline ν_μ beam, 730 km away from the neutrino source. The analysis of a sub-sample of the data taken in the 2008-2009 runs has been completed. After a brief description of the beam and the experimental setup, we report on event analysis and on a first candidate event, its background estimation and statistical significance.
On Statistical Analysis of Neuroimages with Imperfect Registration
Kim, Won Hwa; Ravi, Sathya N.; Johnson, Sterling C.; Okonkwo, Ozioma C.; Singh, Vikas
2016-01-01
A variety of studies in neuroscience/neuroimaging seek to perform statistical inference on the acquired brain image scans for diagnosis as well as understanding the pathological manifestation of diseases. To do so, an important first step is to register (or co-register) all of the image data into a common coordinate system. This permits meaningful comparison of the intensities at each voxel across groups (e.g., diseased versus healthy) to evaluate the effects of the disease and/or use machine learning algorithms in a subsequent step. But errors in the underlying registration make this problematic: they either decrease the statistical power or make the follow-up inference tasks less effective/accurate. In this paper, we derive a novel algorithm which offers immunity to local errors in the underlying deformation field obtained from registration procedures. By deriving a deformation-invariant representation of the image, the downstream analysis can be made more robust, as if one had access to a (hypothetical) far superior registration procedure. Our algorithm is based on recent work on the scattering transform. Using this as a starting point, we show how results from harmonic analysis (especially non-Euclidean wavelets) yield strategies for designing deformation- and additive-noise-invariant representations of large 3-D brain image volumes. We present a set of results on synthetic and real brain images where we achieve robust statistical analysis even in the presence of substantial deformation errors; here, standard analysis procedures significantly underperform and fail to identify the true signal. PMID:27042168
The Harm Done to Reproducibility by the Culture of Null Hypothesis Significance Testing.
Lash, Timothy L
2017-09-15
In the last few years, stakeholders in the scientific community have raised alarms about a perceived lack of reproducibility of scientific results. In reaction, guidelines for journals have been promulgated and grant applicants have been asked to address the rigor and reproducibility of their proposed projects. Neither solution addresses a primary culprit, which is the culture of null hypothesis significance testing that dominates statistical analysis and inference. In an innovative research enterprise, selection of results for further evaluation based on null hypothesis significance testing is doomed to yield a low proportion of reproducible results and a high proportion of effects that are initially overestimated. In addition, the culture of null hypothesis significance testing discourages quantitative adjustments to account for systematic errors and quantitative incorporation of prior information. These strategies would otherwise improve reproducibility and have not been previously proposed in the widely cited literature on this topic. Without discarding the culture of null hypothesis significance testing and implementing these alternative methods for statistical analysis and inference, all other strategies for improving reproducibility will yield marginal gains at best. © The Author(s) 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Kirgiz, Irina A; Calloway, Cassandra
2017-04-01
Tape lifting and FTA paper scraping methods were directly compared to traditional double swabbing for collecting touch DNA from car steering wheels (n = 70 cars). Touch DNA was collected from the left or right side of each steering wheel (randomized) using two sterile cotton swabs, while the other side was sampled using water-soluble tape or FTA paper cards. DNA was extracted and quantified in duplicate using qPCR. Quantifiable amounts of DNA were detected for 100% of the samples collected (n = 140), independent of the method. However, the DNA collection yield was dependent on the collection method. A statistically significant difference in DNA yield was observed between the FTA scraping and double swabbing methods (p = 0.0051), with FTA paper collecting a two-fold higher amount. Statistical analysis showed no significant difference in DNA yields between the double swabbing and tape lifting techniques (p = 0.21). Based on the DNA concentration required for 1 ng input, 47% of the samples collected using FTA paper would be expected to yield a short tandem repeat (STR) profile, compared to 30% and 23% using double swabbing or tape, respectively. Further, 55% and 77% of the samples collected using double swabbing or tape, respectively, did not yield a high enough DNA concentration for the 0.5 ng of DNA input recommended for conventional STR kits and would be expected to result in a partial or no profile, compared to 35% of the samples collected using FTA paper. STR analysis was conducted for a subset of the more concentrated samples to confirm that the DNA collected from the steering wheel was from the driver. Thirty-two samples were selected with DNA amounts of at least 1 ng total DNA (100 pg/μl when concentrated, if required). A mixed STR profile was observed for 26 samples (88%), and the last driver was the major DNA contributor for 29 samples (94%). For one sample, the last driver was the minor DNA contributor. A full STR profile of the last driver was observed for 21 samples (69%) and a partial profile for nine samples (25%); STR analysis failed for two samples collected using tape (6%). In conclusion, we show that the FTA paper scraping method has the potential to collect higher DNA yields than conventional methods from touch DNA evidence deposited on non-porous surfaces often encountered in criminal cases. Copyright © 2017 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.
The statistics of identifying differentially expressed genes in Expresso and TM4: a comparison
Sioson, Allan A; Mane, Shrinivasrao P; Li, Pinghua; Sha, Wei; Heath, Lenwood S; Bohnert, Hans J; Grene, Ruth
2006-01-01
Background Analysis of DNA microarray data takes as input spot intensity measurements from scanner software and returns differential expression of genes between two conditions, together with a statistical significance assessment. This process typically consists of two steps: data normalization and identification of differentially expressed genes through statistical analysis. The Expresso microarray experiment management system implements these steps with a two-stage, log-linear ANOVA mixed model technique, tailored to individual experimental designs. The complement of tools in TM4, on the other hand, is based on a number of preset design choices that limit its flexibility. In the TM4 microarray analysis suite, normalization, filter, and analysis methods form an analysis pipeline. TM4 computes integrated intensity values (IIV) from the average intensities and spot pixel counts returned by the scanner software as input to its normalization steps. By contrast, Expresso can use either IIV data or median intensity values (MIV). Here, we compare Expresso and TM4 analysis of two experiments and assess the results against qRT-PCR data. Results The Expresso analysis using MIV data consistently identifies more genes as differentially expressed, when compared to Expresso analysis with IIV data. The typical TM4 normalization and filtering pipeline corrects systematic intensity-specific bias on a per-microarray basis. Subsequent statistical analysis with Expresso or a TM4 t-test can effectively identify differentially expressed genes. The best agreement with qRT-PCR data is obtained through the use of Expresso analysis and MIV data. Conclusion The results of this research are of practical value to biologists who analyze microarray data sets. The TM4 normalization and filtering pipeline corrects microarray-specific systematic bias and complements the normalization stage in Expresso analysis. The results of Expresso using MIV data have the best agreement with qRT-PCR results. In one experiment, MIV is a better choice than IIV as input to data normalization and statistical analysis methods, as it yields a greater number of statistically significant differentially expressed genes; TM4 does not support the choice of MIV input data. Overall, the more flexible and extensive statistical models of Expresso achieve more accurate analytical results, when judged by the yardstick of qRT-PCR data, in the context of an experimental design of modest complexity. PMID:16626497
Furken, C; Hoedemaker, M
2014-01-01
The effects of rumen-protected choline (RPC) on energy metabolism and milk production in dairy cows were analyzed. Two hundred and ninety-eight primiparous and multiparous cows of a high-producing dairy herd (mean daily milk yield: 32 l) were randomly assigned to control or treatment groups and were fed 0 or 15 g RPC, respectively (corresponding to 0 and 60 g/d ReaShure®, respectively), from 21 days before expected calving to 21 days postpartum (p. p.). Blood metabolites were determined for either all cows (glucose, β-hydroxybutyrate [BHB]) or randomly selected cows (insulin, insulin-like growth factor-1 [IGF-1], non-esterified fatty acids [NEFA]) during the periparturient period. An index for insulin sensitivity (RQUICKI) was calculated, and milk production data (dairy herd improvement tests, 100-day, 305-day and milk peak yield, colostrum quality) were analyzed. In the statistical analysis, a distinction was made between the feeding groups and between parities, and their interactions were analyzed. With the exception of a lower 305-day milk yield in the treatment group (p < 0.05), the evaluated variables did not show statistically significant differences between the feeding groups, and no interactions could be found. In comparison to heifers, multiparous cows had fewer cases of subclinical ketosis ante partum (a. p.) and p. p. (OR a. p.: 0.178; OR p. p.: 0.310), more of them were above the threshold for somatic cell counts (OR 2.584-3.298), and their milk yields were higher (p < 0.05). Supplementing RPC did not affect energy metabolism or milk production in this herd. Further research should focus on this topic in other dairy herds.
Octet baryon masses and sigma terms from an SU(3) chiral extrapolation
NASA Astrophysics Data System (ADS)
Young, R. D.; Thomas, A. W.
2010-01-01
We report an analysis of the impressive new lattice simulation results for octet baryon masses in 2+1-flavor QCD. The analysis is based on a low-order expansion about the chiral SU(3) limit in which the symmetry breaking arises from terms linear in the quark masses plus the variation of the Goldstone boson masses in the leading chiral loops. The baryon masses evaluated at the physical light-quark masses are in remarkable agreement with the experimental values, with a model dependence considerably smaller than the rather small statistical uncertainty. From the mass formulas one can evaluate the sigma commutators for all octet baryons. This yields an accurate value for the pion-nucleon sigma commutator. It also yields the first determination of the strangeness sigma term based on 2+1-flavor lattice QCD and, in general, the sigma commutators provide a resolution to the difficult issue of fine-tuning the strange-quark mass.
Ganasegeran, Kurubaran; Selvaraj, Kamaraj; Rashid, Abdul
2017-08-01
The six-item Confusion, Hubbub and Order Scale (CHAOS-6) has been validated as a reliable tool to measure levels of household disorder. We aimed to investigate the goodness of fit and reliability of a new Malay version of the CHAOS-6. The original English version of the CHAOS-6 underwent forward-backward translation into the Malay language. The finalised Malay version was administered to 105 myocardial infarction survivors in a Malaysian cardiac health facility. We performed confirmatory factor analyses (CFAs) using structural equation modelling. A path diagram and fit statistics were yielded to determine the Malay version's validity. Composite reliability was tested to determine the scale's reliability. All 105 myocardial infarction survivors participated in the study. The CFA yielded a six-item, one-factor model with excellent fit statistics. Composite reliability for the single-factor CHAOS-6 was 0.65, confirming that the scale is reliable for Malay speakers. The Malay version of the CHAOS-6 was reliable and showed the best fit statistics for our study sample. We thus offer a simple, brief, validated, reliable and novel instrument to measure chaos, the Skala Kecelaruan, Keriuhan & Tertib Terubahsuai (CHAOS-6), for the Malaysian population.
Rothfeld, Alex; Pawlak, Amanda; Liebler, Stephenie A H; Morris, Michael; Paci, James M
2018-04-01
Patellar tendon repair with braided polyethylene suture alone is subject to knot slippage and failure. Several techniques to augment the primary repair have been described. Purpose/Hypothesis: The purpose was to evaluate a novel patellar tendon repair technique augmented with a knotless suture anchor internal brace with suture tape (SAIB). The hypothesis was that this technique would be biomechanically superior to a nonaugmented repair and equivalent to a standard augmentation with an 18-gauge steel wire. Controlled laboratory study. Midsubstance patellar tendon tears were created in 32 human cadaveric knees. Two comparison groups were created. Group 1 compared #2 supersuture repair without augmentation to #2 supersuture repair with SAIB augmentation. Group 2 compared #2 supersuture repair with an 18-gauge stainless steel cerclage wire augmentation to #2 supersuture repair with SAIB augmentation. The specimens were potted and biomechanically loaded on a materials testing machine. Yield load, maximum load, mode of failure, plastic displacement, elastic displacement, and total displacement were calculated for each sample. Standard statistical analysis was performed. There was a statistically significant increase in the mean ± SD yield load and maximum load in the SAIB augmentation group compared with supersuture alone (mean yield load: 646 ± 202 N vs 229 ± 60 N; mean maximum load: 868 ± 162 N vs 365 ± 54 N; P < .001). Group 2 showed no statistically significant differences between the augmented repairs (mean yield load: 495 ± 213 N vs 566 ± 172 N; P = .476; mean maximum load: 737 ± 210 N vs 697 ± 130 N; P = .721). Patellar tendon repair augmented with SAIB is biomechanically superior to repair without augmentation and is equivalent to repair with augmentation with an 18-gauge stainless steel cerclage wire. This novel patellar tendon repair augmentation is equivalent to standard 18-gauge wire augmentation at time zero. It does not require a second surgery for removal, and it is biomechanically superior to primary repair alone.
Measurement of self-evaluative motives: a shopping scenario.
Wajda, Theresa A; Kolbe, Richard; Hu, Michael Y; Cui, Annie Peng
2008-08-01
To develop measures of consumers' self-evaluative motives of Self-verification, Self-enhancement, and Self-improvement within the context of a mall shopping environment, an initial set of 49 items was generated by conducting three focus-group sessions. These items were subsequently converted into shopping-dependent motive statements. 250 undergraduate college students responded on a 7-point scale to each statement as it related to the acquisition of recent personal shopping goods. An exploratory factor analysis yielded five factors, accounting for 57.7% of the variance, three of which corresponded to the Self-verification motive (five items), the Self-enhancement motive (three items), and the Self-improvement motive (six items). These 14 items, along with 9 reconstructed items, gave 23 items that were retained and subjected to additional testing. In a final round of data collection, 169 college students provided data for exploratory factor analysis, and 11 items were carried forward to confirmatory factor analysis. Analysis indicated that the 11-item scale adequately captured measures of the three self-evaluative motives. However, further data reduction produced a 9-item scale with marked improvement in statistical fit over the 11-item scale.
NASA Astrophysics Data System (ADS)
Gibon, François; Pellarin, Thierry; Alhassane, Agali; Traoré, Seydou; Baron, Christian
2017-04-01
West Africa is highly vulnerable, especially in terms of food sustainability. Agriculture there is mainly rainfed, so the high variability of the rainy season strongly impacts crop production, which is driven by soil water availability. To monitor this water availability, classical methods are based on daily precipitation measurements. However, the raingauge network in Africa suffers from poor density (about one gauge per 10,000 km²). Alternatively, real-time satellite-derived precipitation estimates can be used, but they are known to suffer from large uncertainties, which produce significant errors in crop yield estimations. The present study proposes to use root-zone soil moisture rather than precipitation to evaluate crop yield variations. First, a local analysis of the spatiotemporal impact of water deficit on millet crop production in Niger was carried out, using in-situ soil moisture measurements (AMMA-CATCH/OZCAR (French Critical Zone exploration network)) and an in-situ millet yield survey. Crop yield measurements were obtained for 10 villages located in the Niamey region from 2005 to 2012. The mean production (over 8 years) is 690 kg/ha, ranging from 381 to 872 kg/ha during this period. Various statistical relationships based on soil moisture estimates were tested, and the most promising one (R>0.9) linked the 30-cm soil moisture anomalies from mid-August to mid-September (the grain-filling period) to the crop yield anomalies. Based on this local study, it was proposed to derive regional statistical relationships using 30-cm soil moisture maps over West Africa. The selected approach was to use a simple hydrological model, the Antecedent Precipitation Index (API), forced by real-time satellite-based precipitation (CMORPH, PERSIANN, TRMM3B42). To reduce uncertainties related to the quality of real-time rainfall satellite products, SMOS soil moisture measurements were assimilated into the API model through a particle filter algorithm. The obtained soil moisture anomalies were then compared to 17 years of crop yield estimates from the FAOSTAT database (1998-2014). Results showed that the 30-cm soil moisture anomalies explained 89% of the crop yield variation in Niger, 72% in Burkina Faso, 82% in Mali and 84% in Senegal.
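As a rough illustration of the approach described above, the sketch below couples the Antecedent Precipitation Index recursion (API_t = k·API_{t-1} + P_t) with a basic particle-filter update; the decay constant, noise levels, unit scaling, and observation values are assumptions made for the sketch, not the study's calibrated settings.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 1000          # particles
k = 0.9           # assumed daily API decay constant
obs_sigma = 0.03  # assumed SMOS soil-moisture error (m3/m3)

particles = np.full(N, 0.15)   # initial 30-cm soil-moisture proxy
weights = np.ones(N) / N

def api_step(api, rain_mm):
    """One day of the API recursion, API_t = k*API_{t-1} + P_t, with
    multiplicative rain noise standing in for satellite-precipitation
    uncertainty. The 0.001 factor is an assumed mm -> m3/m3 scaling."""
    noisy_rain = rain_mm * rng.lognormal(mean=0.0, sigma=0.3, size=api.shape)
    return k * api + 0.001 * noisy_rain

def assimilate(particles, weights, smos_obs):
    """Particle-filter update: reweight by the Gaussian likelihood of the
    SMOS observation, then resample."""
    lik = np.exp(-0.5 * ((particles - smos_obs) / obs_sigma) ** 2)
    weights = weights * lik
    weights /= weights.sum()
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.ones(len(particles)) / len(particles)

# Three example days: (rain in mm, SMOS observation or None if no overpass).
for day, (rain, smos) in enumerate([(12.0, 0.18), (0.0, None), (25.0, 0.22)]):
    particles = api_step(particles, rain)
    if smos is not None:
        particles, weights = assimilate(particles, weights, smos)
    print(day, particles.mean())
```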
Payne, Courtney E.; Wolfrum, Edward J.
2015-03-12
Obtaining accurate chemical composition and reactivity (measures of carbohydrate release and yield) information for biomass feedstocks in a timely manner is necessary for the commercialization of biofuels. Our objective was to use near-infrared (NIR) spectroscopy and partial least squares (PLS) multivariate analysis to develop calibration models to predict the feedstock composition and the release and yield of soluble carbohydrates generated by a bench-scale dilute acid pretreatment and enzymatic hydrolysis assay. Major feedstocks included in the calibration models are corn stover, sorghum, switchgrass, perennial cool season grasses, rice straw, and miscanthus. We present individual model statistics to demonstrate model performance and validation samples to more accurately measure the predictive quality of the models. The PLS-2 model for composition predicts glucan, xylan, lignin, and ash (wt%) with uncertainties similar to primary measurement methods. A PLS-2 model was developed to predict glucose and xylose release following pretreatment and enzymatic hydrolysis. An additional PLS-2 model was developed to predict glucan and xylan yield. PLS-1 models were developed to predict the sum of glucose/glucan and xylose/xylan for release and yield (grams per gram). The release and yield models have higher uncertainties than the primary methods used to develop the models. In conclusion, it is possible to build effective multispecies feedstock models for composition, as well as carbohydrate release and yield. The model for composition is useful for predicting glucan, xylan, lignin, and ash with good uncertainties. The release and yield models have higher uncertainties; however, these models are useful for rapidly screening sample populations to identify unusual samples.
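A calibration of this kind can be prototyped with off-the-shelf PLS tools. The sketch below fits a PLS-2 model predicting four constituents at once from spectra, using scikit-learn's PLSRegression on synthetic stand-in data; the matrix sizes, component count, and noise levels are arbitrary assumptions, not the study's settings.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

# Synthetic stand-ins: 200 NIR spectra (500 wavelengths) and four
# compositional reference values (glucan, xylan, lignin, ash, wt%).
X = rng.normal(size=(200, 500))
true_B = rng.normal(size=(500, 4)) * 0.05
Y = X @ true_B + rng.normal(scale=0.5, size=(200, 4))

X_cal, X_val, Y_cal, Y_val = train_test_split(X, Y, test_size=0.25,
                                              random_state=0)

# PLS-2: one model predicting all four constituents simultaneously.
pls = PLSRegression(n_components=10)
pls.fit(X_cal, Y_cal)

# Root-mean-square error of prediction on the held-out validation set.
Y_hat = pls.predict(X_val)
rmsep = np.sqrt(((Y_val - Y_hat) ** 2).mean(axis=0))
print("RMSEP per constituent:", rmsep)
```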
Daniel, Charles C.
1990-01-01
A statistical analysis of data from wells drilled into the crystalline rocks of the Piedmont and Blue Ridge provinces of North Carolina verified and refined previously proposed criteria for the siting of wells to obtain greater than average yields. An opportunity to test the criteria was provided by the expansion of the town of Cary's municipal ground-water system. Three criteria were used: type of rock, thickness of saturated regolith based upon topography, and presence of fractures and joints based upon drainage lineations. A conceptual model of the local hydrogeologic system was developed to guide the selection of the most favorable well sites, and on the basis of the model, six type sites were determined. Eleven of 12 test wells that were located on the basis of type sites yielded from slightly above average to as much as six times the average yield to be expected from particular rock types as reported in the literature. Only one well drilled at a type site had a less than average yield. One well not located at any of the type sites produced little water. Long-term testing and monitoring after the wells were put into production showed that an 18-hour-on, 6-hour-off pumping cycle was much more effective in terms of total production, reduced head loss, and less drawdown than a 5-day-on and 2-day-off cycle. It was also observed that long-term yields by the production wells were about 75 percent of those predicted on the basis of 24-hour pumping tests and only about 60 percent of the driller's reported yields. Cost analysis showed that, by using criteria-selected well sites, a cost-effective well system can be developed that will provide water at an equivalent or lower cost than a surface-water supply. The analysis showed that the system would be cost effective if only one high-yield well were obtained out of every four drilled.
A novel approach to identify genes that determine grain protein deviation in cereals.
Mosleth, Ellen F; Wan, Yongfang; Lysenko, Artem; Chope, Gemma A; Penson, Simon P; Shewry, Peter R; Hawkesford, Malcolm J
2015-06-01
Grain yield and protein content were determined for six wheat cultivars grown over 3 years at multiple sites and at multiple nitrogen (N) fertilizer inputs. Although grain protein content was negatively correlated with yield, some grain samples had higher protein contents than expected based on their yields, a trait referred to as grain protein deviation (GPD). We used novel statistical approaches to identify gene transcripts significantly related to GPD across environments. The yield and protein content were initially adjusted for nitrogen fertilizer inputs and then adjusted for yield (to remove the negative correlation with protein content), resulting in a parameter termed corrected GPD. Significant genetic variation in corrected GPD was observed for six cultivars grown over a range of environmental conditions (a total of 584 samples). Gene transcript profiles were determined in a subset of 161 samples of developing grain to identify transcripts contributing to GPD. Principal component analysis (PCA), analysis of variance (ANOVA) and means of scores regression (MSR) were used to identify individual principal components (PCs) correlating with GPD alone. Scores of the selected PCs, which were significantly related to GPD and protein content but not to the yield and significantly affected by cultivar, were identified as reflecting a multivariate pattern of gene expression related to genetic variation in GPD. Transcripts with consistent variation along the selected PCs were identified by an approach hereby called one-block means of scores regression (one-block MSR). © 2014 The Authors. Plant Biotechnology Journal published by Society for Experimental Biology and The Association of Applied Biologists and John Wiley & Sons Ltd.
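The PC-selection step described above can be sketched as follows: compute principal components of the transcript profiles, then keep components whose scores correlate with GPD but not with yield. All data here are synthetic placeholders, and the simple correlation screen below stands in for the paper's ANOVA/MSR machinery.

```python
import numpy as np
from sklearn.decomposition import PCA
from scipy.stats import pearsonr

rng = np.random.default_rng(2)

n_samples, n_transcripts = 161, 5000   # sample count taken from the abstract
X = rng.normal(size=(n_samples, n_transcripts))   # transcript profiles
gpd = rng.normal(size=n_samples)                  # corrected GPD per sample
grain_yield = rng.normal(size=n_samples)          # grain yield per sample

pca = PCA(n_components=20)
scores = pca.fit_transform(X)

# Keep PCs whose scores relate to GPD but not to yield (alpha = 0.05).
selected = []
for j in range(scores.shape[1]):
    _, p_gpd = pearsonr(scores[:, j], gpd)
    _, p_yld = pearsonr(scores[:, j], grain_yield)
    if p_gpd < 0.05 and p_yld >= 0.05:
        selected.append(j)
print("PCs tracking GPD independently of yield:", selected)
```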
Correlations between the modelled potato crop yield and the general atmospheric circulation
NASA Astrophysics Data System (ADS)
Sepp, Mait; Saue, Triin
2012-07-01
Biology-related indicators do not usually depend on just one meteorological element but on a combination of several weather indicators. One way to establish such integral indicators is to classify the general atmospheric circulation into a small number of circulation types. The aim of the present study is to analyse connections between the general atmospheric circulation and potato crop yield in Estonia. Meteorologically possible yield (MPY), calculated with the model POMOD, is used to characterise potato crop yield. Data from three meteorological stations and the biological parameters of two potato varieties were applied in the model, and 73 different classifications of atmospheric circulation from catalogue 1.2 of COST 733, domain 05, are used to qualify circulation conditions. Correlation analysis showed that each classification contains at least one circulation type with at least one statistically significant (99%) correlation with potato crop yield, whether in Kuressaare, Tallinn or Tartu. However, no classifications with circulation types correlating with MPY in all three stations at the same time were revealed. Circulation types inducing a decrease in the potato crop yield are more clearly represented. Clear differences occurred between the observed geographical locations as well as between the seasons: judged by the number of significant circulation types, summer and Kuressaare stand out. Of the potato varieties, the late variety 'Anti' is more influenced by circulation. Analysis of MSLP maps of the circulation types revealed that the seaside stations (Tallinn, Kuressaare) suffer from the negative effects of anti-cyclonic conditions (drought), while Tartu suffers from cyclonic activity (excessive water).
An overview of meta-analysis for clinicians.
Lee, Young Ho
2018-03-01
The number of medical studies being published is increasing exponentially, and clinicians must routinely process large amounts of new information. Moreover, the results of individual studies are often insufficient to provide confident answers, as they are not consistently reproducible. A meta-analysis is a statistical method for combining the results of different studies on the same topic, and it may resolve conflicts among studies. Meta-analysis is being used increasingly and plays an important role in medical research. This review introduces the basic concepts, steps, advantages, and caveats of meta-analysis, to help clinicians understand it in clinical practice and research. A major advantage of a meta-analysis is that it produces a precise estimate of the effect size, with considerably increased statistical power, which is important when the power of the primary study is limited because of a small sample size. A meta-analysis may yield conclusive results when individual studies are inconclusive. Furthermore, meta-analyses investigate the source of variation and different effects among subgroups. In summary, a meta-analysis is an objective, quantitative method that provides less biased estimates on a specific topic. Understanding how to conduct a meta-analysis aids clinicians in the process of making clinical decisions.
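A minimal illustration of the pooling step at the heart of a meta-analysis is the DerSimonian-Laird random-effects estimator sketched below; the per-study effect sizes and variances are hypothetical.

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Random-effects pooling via the DerSimonian-Laird method: estimate
    the between-study variance tau^2 from the heterogeneity statistic Q,
    then combine studies with inverse-variance weights 1/(v_i + tau^2)."""
    effects = np.asarray(effects, dtype=float)
    v = np.asarray(variances, dtype=float)
    w = 1.0 / v
    fixed = np.sum(w * effects) / np.sum(w)     # fixed-effect estimate
    Q = np.sum(w * (effects - fixed) ** 2)      # heterogeneity statistic
    df = len(effects) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (Q - df) / c)               # between-study variance
    w_re = 1.0 / (v + tau2)
    pooled = np.sum(w_re * effects) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return pooled, se, tau2

# Hypothetical per-study effect sizes and variances:
pooled, se, tau2 = dersimonian_laird([0.4, 0.2, 0.55], [0.02, 0.03, 0.05])
print(f"pooled = {pooled:.3f} +/- {1.96*se:.3f} (tau^2 = {tau2:.3f})")
```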
Considerations in the statistical analysis of clinical trials in periodontitis.
Imrey, P B
1986-05-01
Adult periodontitis has been described as a chronic infectious process exhibiting sporadic, acute exacerbations which cause quantal, localized losses of dental attachment. Many analytic problems of periodontal trials are similar to those of other chronic diseases. However, the episodic, localized, infrequent, and relatively unpredictable behavior of exacerbations, coupled with measurement error difficulties, causes some specific problems. Considerable controversy exists as to the proper selection and treatment of multiple-site data from the same patient in group comparisons for epidemiologic or therapeutic evaluative purposes. This paper comments, with varying degrees of emphasis, on several issues pertinent to the analysis of periodontal trials. Considerable attention is given to the ways in which measurement variability may distort analytic results. Statistical treatments of multiple-site data for descriptive summaries are distinguished from treatments for formal statistical inference to validate therapeutic effects. Evidence suggesting that sites behave independently is contested. For inferential analyses directed at therapeutic or preventive effects, analytic models based on site independence are deemed unsatisfactory. Methods of summarization that may yield more powerful analyses than all-site mean scores, while retaining appropriate treatment of inter-site associations, are suggested. Brief comments and opinions on an assortment of other issues in clinical trial analysis are offered.
Steep discounting of delayed monetary and food rewards in obesity: a meta-analysis.
Amlung, M; Petker, T; Jackson, J; Balodis, I; MacKillop, J
2016-08-01
An increasing number of studies have investigated delay discounting (DD) in relation to obesity, but with mixed findings. This meta-analysis synthesized the literature on the relationship between monetary and food DD and obesity, with three objectives: (1) to characterize the relationship between DD and obesity in both case-control comparisons and continuous designs; (2) to examine potential moderators, including case-control v. continuous design, money v. food rewards, sample sex distribution, and sample age (under v. over 18 years); and (3) to evaluate publication bias. From 134 candidate articles, 39 independent investigations yielded 29 case-control and 30 continuous comparisons (total n = 10 278). Random-effects meta-analysis was conducted using Cohen's d as the effect size. Publication bias was evaluated using fail-safe N, Begg-Mazumdar and Egger tests, meta-regression of publication year and effect size, and imputation of missing studies. The primary analysis revealed a medium effect size across studies that was highly statistically significant (d = 0.43, p < 10⁻¹⁴). None of the moderators examined yielded statistically significant differences, although notably larger effect sizes were found for studies with case-control designs, food rewards and child/adolescent samples. Limited evidence of publication bias was present, although the Begg-Mazumdar test and meta-regression suggested a slightly diminishing effect size over time. Steep DD of food and money appears to be a robust feature of obesity that is relatively consistent across the DD assessment methodologies and study designs examined. These findings are discussed in the context of research on DD in drug addiction, the neural bases of DD in obesity, and potential clinical applications.
NASA Astrophysics Data System (ADS)
Zhang, J.; Ives, A. R.; Turner, M. G.; Kucharik, C. J.
2017-12-01
Previous studies have identified global agricultural regions where "stagnation" of long-term crop yield increases has occurred. These studies have used a variety of simple statistical methods that often ignore important aspects of time series regression modeling. These methods can lead to differing and contradictory results, which creates uncertainty regarding food security given rapid global population growth. Here, we present a new statistical framework incorporating time series-based algorithms into standard regression models to quantify spatiotemporal yield trends of US maize, soybean, and winter wheat from 1970 to 2016. Our primary goal was to quantify spatial differences in yield trends for these three crops using USDA county-level data. This information was used to identify regions experiencing the largest changes in the rate of yield increases over time, and to determine whether abrupt shifts in the rate of yield increases have occurred. Although crop yields continue to increase in most maize-, soybean-, and winter wheat-growing areas, yield increases have stagnated in some key agricultural regions during the most recent 15 to 16 years: maize-growing areas, except for the northern Great Plains, have shown a significant trend towards smaller annual yield increases; soybean has maintained consistent long-term yield gains in the northern Great Plains, the Midwest, and the southeast US, but has experienced a shift to smaller annual increases in other regions; winter wheat maintained a moderate annual increase in eastern South Dakota and eastern US locations, but showed a decline in the magnitude of annual increases across the central Great Plains and western US regions. Our results suggest that there were abrupt shifts in the rate of annual yield increases in a variety of US regions among the three crops. The framework presented here can be broadly applied to additional yield trend analyses for different crops and regions of the Earth.
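One simple version of such a time series-based trend test is a continuous piecewise-linear fit with an estimated breakpoint, sketched below on synthetic county-level yields; this is an illustrative stand-in under assumed numbers, not the authors' exact framework.

```python
import numpy as np

rng = np.random.default_rng(3)

years = np.arange(1970, 2017)
# Synthetic county yield series (t/ha): steady gains that flatten after ~2000.
y = (6.0 + 0.12 * (years - 1970)
     - 0.08 * np.clip(years - 2000, 0, None)
     + rng.normal(scale=0.4, size=years.size))

def fit_piecewise(years, y, brk):
    """Continuous two-segment linear trend with a slope change at brk."""
    X = np.column_stack([np.ones_like(years),
                         years - years[0],
                         np.clip(years - brk, 0, None)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    sse = np.sum((y - X @ beta) ** 2)
    return beta, sse

# Scan candidate breakpoints and keep the best-fitting one.
candidates = range(1985, 2011)
best = min(candidates, key=lambda b: fit_piecewise(years, y, b)[1])
beta, _ = fit_piecewise(years, y, best)
print(f"breakpoint ~{best}: slope {beta[1]:.3f} t/ha/yr before, "
      f"{beta[1] + beta[2]:.3f} after")
```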
Markov chain Monte Carlo estimation of quantum states
NASA Astrophysics Data System (ADS)
Diguglielmo, James; Messenger, Chris; Fiurášek, Jaromír; Hage, Boris; Samblowski, Aiko; Schmidt, Tabea; Schnabel, Roman
2009-03-01
We apply a Bayesian data analysis scheme known as Markov chain Monte Carlo to the tomographic reconstruction of quantum states. This method yields a vector, known as the Markov chain, which contains the full statistical information concerning all reconstruction parameters, including their statistical correlations, with no a priori assumptions as to the form of the distribution from which it has been obtained. From this vector we can derive, e.g., the marginal distributions and uncertainties of all model parameters, and also of other quantities such as the purity of the reconstructed state. We demonstrate the utility of this scheme by reconstructing the Wigner function of phase-diffused squeezed states. These states possess non-Gaussian statistics and therefore represent a nontrivial case of tomographic reconstruction. We compare our results to those obtained through pure maximum-likelihood and Fisher information approaches.
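The core of such a scheme is an ordinary Metropolis-Hastings sampler whose chain carries the full posterior information. The toy sketch below reconstructs the posterior of a single Gaussian mean rather than a Wigner function, so the data, likelihood, and proposal width are all stand-ins for the tomographic ones.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy data in place of tomographic measurements: Gaussian samples whose
# mean we want a posterior for.
data = rng.normal(loc=0.3, scale=1.0, size=200)

def log_posterior(theta):
    # Flat prior; Gaussian likelihood with known unit variance.
    return -0.5 * np.sum((data - theta) ** 2)

# Metropolis-Hastings: the resulting chain encodes marginals,
# uncertainties, and correlations of the model parameters.
chain = np.empty(20000)
theta = 0.0
for i in range(chain.size):
    proposal = theta + rng.normal(scale=0.1)
    if np.log(rng.uniform()) < log_posterior(proposal) - log_posterior(theta):
        theta = proposal
    chain[i] = theta

burned = chain[5000:]   # discard burn-in
print(f"posterior mean {burned.mean():.3f} +/- {burned.std():.3f}")
```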
Jackson, George S.; Hillegonds, Darren J.; Muzikar, Paul; Goehring, Brent
2013-01-01
A ⁴¹Ca interlaboratory comparison between Lawrence Livermore National Laboratory (LLNL) and the Purdue Rare Isotope Laboratory (PRIME Lab) has been completed. Analysis of the ratios assayed by accelerator mass spectrometry (AMS) shows that there is no statistically significant difference between the laboratories' results. Further, Bayesian analysis shows that the uncertainties reported by both facilities are correct, with the possibility of a slight underestimation by one laboratory. Finally, the chemistry procedures used by the two facilities to produce CaF2 for the cesium sputter ion source are robust and do not yield any significant differences in the final result. PMID:24179312
NASA Astrophysics Data System (ADS)
Martucci, G.; Carniel, S.; Chiggiato, J.; Sclavo, M.; Lionello, P.; Galati, M. B.
2009-09-01
The study is a statistical analysis of sea-state time series derived using the wave model WAM forced by the ERA-40 dataset in selected areas near the Italian coasts. For the period 1 January 1958 to 31 December 1999, the analysis yields: (i) the existence of a negative trend in the annual- and winter-averaged sea-state heights; (ii) the existence of a turning point in the late 1970s in the annual-averaged trend of sea-state heights at a site in the Northern Adriatic Sea; (iii) the overall absence of a significant trend in the annual-averaged mean durations of sea states over thresholds; (iv) the assessment of the extreme values on a time scale of a thousand years. The analysis uses two methods to obtain samples of extremes from the independent sea states: the r-largest annual maxima and the peak-over-threshold. The two methods show statistical differences in retrieving the return values and, more generally, in describing the significant wave field. The study shows the existence of decadal negative trends in the significant wave heights and thereby conveys useful information on the wave climatology of the Italian seas during the second half of the 20th century.
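The peak-over-threshold method mentioned above can be sketched as follows: fit a generalized Pareto distribution to the excesses over a high threshold and invert it for return levels. The wave-height series, threshold choice, and independence assumption below are illustrative only; real analyses decluster storms first.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(5)

# Synthetic significant-wave-height series (m), one value per sea state.
hs = rng.gamma(shape=2.0, scale=0.8, size=20000)

threshold = np.quantile(hs, 0.98)
excesses = hs[hs > threshold] - threshold

# Peak-over-threshold: fit a generalized Pareto to the threshold excesses.
c, loc, scale = genpareto.fit(excesses, floc=0.0)

# T-year return level, assuming independent exceedances spread evenly over
# an assumed 42-year record (a simplification for the sketch).
events_per_year = len(excesses) / 42.0
def return_level(T_years):
    p = 1.0 / (T_years * events_per_year)
    return threshold + genpareto.ppf(1.0 - p, c, loc=0.0, scale=scale)

print(f"100-year Hs ~ {return_level(100):.2f} m, "
      f"1000-year Hs ~ {return_level(1000):.2f} m")
```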
Equilibrium statistical-thermal models in high-energy physics
NASA Astrophysics Data System (ADS)
Tawfik, Abdel Nasser
2014-05-01
We review some recent highlights from the applications of statistical-thermal models to different experimental measurements and to lattice QCD thermodynamics that have been made during the last decade. We start with a short review of the historical milestones on the path to constructing statistical-thermal models for heavy-ion physics. We discovered that Heinz Koppe had formulated, in 1948, an almost complete recipe for the statistical-thermal models. In 1950, Enrico Fermi generalized this statistical approach: he started with a general cross-section formula and inserted into it simplifying assumptions about the matrix element of the interaction process that likely reflect many features of high-energy reactions dominated by the density in the phase space of final states. In 1964, Hagedorn systematically analyzed high-energy phenomena using all the tools of statistical physics and introduced the concept of a limiting temperature based on the statistical bootstrap model. It turns out quite often that many-particle systems can be studied with the help of statistical-thermal methods. The analysis of yield multiplicities in high-energy collisions gives overwhelming evidence for chemical equilibrium in the final state. The strange particles might be an exception, as they are suppressed at lower beam energies; however, their relative yields fulfill statistical equilibrium as well. We review the equilibrium statistical-thermal models for particle production, fluctuations and collective flow in heavy-ion experiments. We also review their reproduction of the lattice QCD thermodynamics at vanishing and finite chemical potential. During the last decade, five conditions have been suggested to describe the universal behavior of the chemical freeze-out parameters. The higher-order moments of multiplicity have also been discussed; they offer deep insights into particle production and critical fluctuations, and we therefore use them to describe the freeze-out parameters and to suggest the location of the QCD critical endpoint. Various extensions have been proposed in order to take into consideration possible deviations from the ideal hadron gas. We highlight various types of interactions, dissipative properties and location dependences (spatial rapidity). Furthermore, we review three models combining hadronic with partonic phases: the quasi-particle model, the linear sigma model with Polyakov potentials, and the compressible bag model.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Amo Sanchez, P. del; Lees, J. P.; Poireau, V.
Using the entire sample of 467×10⁶ Υ(4S) → BB̄ decays collected with the BABAR detector at the PEP-II asymmetric-energy B factory at the SLAC National Accelerator Laboratory, we perform an analysis of B± → DK± decays, using decay modes in which the neutral D meson decays to either CP eigenstates or non-CP eigenstates. We measure the partial decay rate charge asymmetries for CP-even and CP-odd D final states to be A_CP+ = 0.25 ± 0.06 ± 0.02 and A_CP− = −0.09 ± 0.07 ± 0.02, respectively, where the first error is statistical and the second systematic. The parameter A_CP+ is different from zero with a significance of 3.6 standard deviations, constituting evidence for direct CP violation. We also measure the ratios of the charge-averaged B partial decay rates in CP and non-CP decays, R_CP+ = 1.18 ± 0.09 ± 0.05 and R_CP− = 1.07 ± 0.08 ± 0.04. We infer frequentist confidence intervals for the angle γ of the unitarity triangle, for the strong phase difference δ_B, and for the amplitude ratio r_B, which are related to the B⁻ → DK⁻ decay amplitude by r_B e^{i(δ_B−γ)} = A(B⁻ → D̄⁰K⁻)/A(B⁻ → D⁰K⁻). Including statistical and systematic uncertainties, we obtain 0.24
Statistics-based model for prediction of chemical biosynthesis yield from Saccharomyces cerevisiae
2011-01-01
Background The robustness of Saccharomyces cerevisiae in facilitating industrial-scale production of ethanol extends its utilization as a platform to synthesize other metabolites. Metabolic engineering strategies, typically via pathway overexpression and deletion, continue to play a key role in optimizing the conversion efficiency of substrates into the desired products. However, chemical production titer or yield remains difficult to predict based on reaction stoichiometry and mass balance. We sampled a large space of data on chemical production from S. cerevisiae and developed a statistics-based model to calculate production yield using input variables that represent the number of enzymatic steps in the key biosynthetic pathway of interest, metabolic modifications, cultivation modes, nutrition and oxygen availability. Results Based on the production data of about 40 chemicals produced from S. cerevisiae, the metabolic engineering methods, nutrient supplementation, and fermentation conditions described therein, we generated mathematical models with numerical and categorical variables to predict production yield. Statistically, the models showed that: 1. Chemical production from central metabolic precursors decreased exponentially with increasing number of enzymatic steps for biosynthesis (>30% loss of yield per enzymatic step, P-value = 0); 2. Categorical variables of gene overexpression and knockout improved product yield by two- to fourfold (P-value < 0.1); 3. Addition of a notable amount of intermediate precursors or nutrients improved product yield by over fivefold (P-value < 0.05); 4. Performing the cultivation in a well-controlled bioreactor enhanced the yield of product threefold (P-value < 0.05); 5. The contribution of oxygen to product yield was not statistically significant. Yield calculations for various chemicals using the linear model were in fairly good agreement with the experimental values. The model generally underestimated ethanol production compared to other chemicals, which supports the notion that the metabolism of Saccharomyces cerevisiae has historically evolved for robust alcohol fermentation. Conclusions We generated simple mathematical models for first-order approximation of chemical production yield from S. cerevisiae. These linear models provide empirical insights into the effects of strain engineering and cultivation conditions on biosynthetic efficiency. These models may not only provide guidelines for metabolic engineers seeking to synthesize desired products, but may also be useful for comparing biosynthesis performance among different research papers. PMID:21689458
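The first-order model described above is essentially an ordinary least-squares fit of log-yield on the numeric and categorical predictors. The sketch below fits such a model to synthetic data generated to be roughly consistent with the reported effect sizes (about 30% yield loss per enzymatic step); all numbers are placeholders, not the paper's dataset.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)

# Hypothetical rows: (enzymatic steps, overexpression?, knockout?,
# precursor fed?, bioreactor?) -> product yield (log scale).
n = 40
steps = rng.integers(1, 12, size=n)
overexp = rng.integers(0, 2, size=n)
knockout = rng.integers(0, 2, size=n)
precursor = rng.integers(0, 2, size=n)
bioreactor = rng.integers(0, 2, size=n)

# Synthetic log-yields: ~30% loss per extra step, multiplicative gains
# from the categorical interventions, plus noise.
log_yield = (np.log(0.7) * steps + 0.9 * overexp + 0.9 * knockout
             + 1.6 * precursor + 1.1 * bioreactor
             + rng.normal(scale=0.3, size=n))

X = sm.add_constant(np.column_stack([steps, overexp, knockout,
                                     precursor, bioreactor]))
model = sm.OLS(log_yield, X).fit()
print(model.params)   # coefficient on steps should recover ~log(0.7)
```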
A multivariate model and statistical method for validating tree grade lumber yield equations
Donald W. Seegrist
1975-01-01
Lumber yields within lumber grades can be described by a multivariate linear model. A method for validating lumber yield prediction equations when there are several tree grades is presented. The method is based on multivariate simultaneous test procedures.
Egiyan, H.; Langheinrich, J.; Gothe, R. W.; ...
2012-01-30
We searched for the Φ⁻⁻(1860) pentaquark in the photoproduction process off the deuteron in the Ξ⁻π⁻ decay channel using CLAS. The invariant-mass spectrum of the Ξ⁻π⁻ system does not indicate any statistically significant enhancement near the reported mass M=1.860 GeV. The statistical analysis of the sideband-subtracted mass spectrum yields a 90%-confidence-level upper limit of 0.7 nb for the photoproduction cross section of Φ⁻⁻(1860), with a consecutive decay into Ξ⁻π⁻, in the photon-energy range 4.5 GeV < Eγ < 5.5 GeV.
Öztürk, Hande; Noyan, I. Cevdet
2017-08-24
A rigorous study of sampling and intensity statistics applicable to a powder diffraction experiment as a function of crystallite size is presented. Our analysis yields approximate equations for the expected value, variance and standard deviation of both the number of diffracting grains and the corresponding diffracted intensity for a given Bragg peak. The classical formalism published in 1948 by Alexander, Klug & Kummer [J. Appl. Phys. (1948), 19, 742-753] appears here as a special case, limited to large crystallite sizes. It is observed that both the Lorentz probability expression and the statistics equations used in the classical formalism are inapplicable for nanocrystalline powder samples.
Siddiqua, Shaila; Mamun, Abdullah Al; Enayetul Babar, Sheikh Md
2015-01-01
Renewable biodiesels are needed as an alternative to petroleum-derived transport fuels, which contribute to global warming and are of limited availability. Algal biomass is a potential source of renewable energy that can be converted into fuels such as biodiesel. This study introduces an integrated method for producing biodiesel from Chara vulgaris algae collected from the coastal region of Bangladesh. A Box-Behnken design based on response surface methodology (RSM) was used as the statistical tool to optimize three variables for predicting the best-performing conditions (calorific value and yield) of algae biodiesel. The three production-condition parameters were chloroform volume (X1), sodium chloride concentration (X2) and temperature (X3). Optimal conditions were estimated with the aid of statistical regression analysis and surface plots. The optimal biodiesel production condition for 12 g of dry algae biomass was found to be 198 ml chloroform with 0.75% sodium chloride at 65°C, where the calorific value of the biodiesel was 9255.106 kcal/kg and the yield 3.6 ml.
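A minimal sketch of the RSM step, assuming a standard 15-run, three-factor Box-Behnken design in coded units; the response values are invented placeholders, not the study's measurements:

```python
# Quadratic response-surface fit for a three-factor Box-Behnken design.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

# 15-run Box-Behnken design in coded units (-1, 0, +1)
design = np.array([
    [-1, -1, 0], [1, -1, 0], [-1, 1, 0], [1, 1, 0],
    [-1, 0, -1], [1, 0, -1], [-1, 0, 1], [1, 0, 1],
    [0, -1, -1], [0, 1, -1], [0, -1, 1], [0, 1, 1],
    [0, 0, 0], [0, 0, 0], [0, 0, 0],
])
yield_ml = np.array([2.1, 2.8, 2.4, 3.1, 1.9, 2.6, 2.7, 3.4,
                     2.2, 2.5, 2.9, 3.2, 3.5, 3.6, 3.4])  # hypothetical

quad = PolynomialFeatures(degree=2, include_bias=False)
X = quad.fit_transform(design)            # linear, interaction, squared terms
model = LinearRegression().fit(X, yield_ml)
print(dict(zip(quad.get_feature_names_out(["X1", "X2", "X3"]),
               model.coef_.round(3))))
# The fitted surface can then be maximised (e.g. on a grid) to locate
# the predicted optimum of the three production variables.
```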
Callan, Richard S; Palladino, Christie L; Furness, Alan R; Bundy, Emily L; Ange, Brittany L
2014-10-01
Recent efforts have been directed towards utilizing CAD/CAM technology in the education of future dentists. The purpose of this pilot study was to investigate the feasibility of implementing CAD/CAM technology in instruction on preparing a tooth for restoration. Students at one dental school were assigned access to CAD/CAM technology vs. traditional preparation methods in a randomized, crossover design. In a convenience sample of a second-year class, seventy-six of the seventy-nine students volunteered to participate, for a response rate of 96 percent. Two analyses were performed on this pilot data: a primary effectiveness analysis comparing students' competency exam scores by intervention group (intention-to-treat analysis) and a secondary efficacy analysis comparing competency exam scores among students who reported using CAD/CAM versus those who did not. The effectiveness analysis showed no difference in outcomes by intervention group assignment. While student survey results indicated interest in utilizing the technology, the actual utilization rate was much less than one might anticipate, yielding a sample size that limited statistical power. The secondary analysis demonstrated higher mean competency exam scores for students reporting use of CAD/CAM compared to those who did not use the technology, but these results did not reach statistical significance (p=0.075). Prior research has investigated the efficacy of CAD/CAM in a controlled educational trial, but this study adds to the literature by investigating student use of CAD/CAM in a real-world, self-study fashion. Further studies should investigate ways in which to increase student utilization of CAD/CAM and whether or not increased utilization, with a larger sample size, would yield significant outcomes.
Climate Variability and Sugarcane Yield in Louisiana.
NASA Astrophysics Data System (ADS)
Greenland, David
2005-11-01
This paper seeks to understand the role that climate variability plays in the annual yield of sugarcane in Louisiana. Unique features of sugarcane growth in Louisiana and nonclimatic, yield-influencing factors make this goal an interesting and challenging one. Several methods of seeking and establishing the relations between yield and climate variables are employed. First, yield-climate relations were investigated at a single research station where crop variety and growing conditions could be held constant and yield relations could be established between a predominant older crop variety and a newer one. Interviews with crop experts and a literature survey were used to identify potential climatic factors that control yield. A statistical analysis was performed using statewide yield data from the American Sugar Cane League from 1963 to 2002 and a climate database. Yield values for later years were adjusted downward to form an adjusted yield dataset. The climate database was principally constructed from daily and monthly values of maximum and minimum temperature and daily and monthly total precipitation for six cooperative weather-reporting stations representative of the area of sugarcane production. The influence of 74 different, though not independent, climate-related variables on sugarcane yield was investigated. The fact that a climate signal exists is demonstrated by comparing mean values of the climate variables corresponding to the upper and lower third of adjusted yield values. Most of these mean-value differences show an intuitively plausible difference between the high- and low-yield years. For 13 of the variables, the difference between means for years in the upper and lower third of annual yield values is statistically significant at or above the 90% level. A correlation matrix was used to identify the variables with the largest influence on annual yield. Four variables [called here critical climatic variables (CCV)], mean maximum August temperature, mean minimum February temperature, soil water surplus between April and September, and occurrence of autumn (fall) hurricanes, were built into a model to simulate adjusted yield values. The CCV model simulates the yield value with an RMSE of 5.1 t ha⁻¹. The mean of the adjusted yield data over the study period was 60.4 t ha⁻¹, with values for the highest and lowest years being 73.1 and 50.6 t ha⁻¹, respectively, and a standard deviation of 5.9 t ha⁻¹. Presumably because of the almost constant high water table and soil water availability, higher precipitation totals, which are inversely related to radiation and temperature, tend to have a negative effect on yields. Past trends in the values of the critical climatic variables and general projections of future climate suggest that, with respect to the climatic environment and as long as land drainage is continued and maintained, future levels of sugarcane yield will rise in Louisiana.
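The screening logic described here (tercile comparison plus a correlation matrix) can be sketched as follows; the yield and climate arrays are simulated stand-ins for the Louisiana records:

```python
# Compare climate-variable means between high- and low-yield years and
# rank variables by their correlation with yield.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
years = 40
yield_t_ha = rng.normal(60.4, 5.9, years)              # adjusted yield
climate = rng.normal(size=(years, 5))                  # 5 candidate variables

order = np.argsort(yield_t_ha)
low, high = order[:years // 3], order[-(years // 3):]  # lower/upper thirds

for j in range(climate.shape[1]):
    t, p = stats.ttest_ind(climate[high, j], climate[low, j])
    r, _ = stats.pearsonr(climate[:, j], yield_t_ha)
    print(f"var {j}: high-low t={t:.2f} (p={p:.2f}), r with yield={r:.2f}")
# Variables significant at the 90% level and strongly correlated with
# yield would be candidates for the critical-climatic-variable model.
```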
Development of a Cadaveric Model for Arthrocentesis.
MacIver, Melissa A; Johnson, Matthew
2015-01-01
This article reports the development of a novel cadaveric model for future use in teaching arthrocentesis. In the clinical setting, animal safety is essential and practice is thus limited. The objectives of the study were to develop a model and compare it to an unmodified cadaver, injecting one of two types of fluid to increase yield. The two fluids injected, mineral oil (MO) and hypertonic saline (HS), were compared to determine any difference in yield. Lastly, aspiration immediately after injection (T1) was compared with aspiration three hours after injection (T2) to determine any effect on diagnostic yield. Joints used included the stifle, elbow, and carpus in eight medium dog cadavers. Arthrocentesis was performed before injection (control) and yield measured. Test joints were injected with MO or HS and yield measured after range of motion (T1) and three hours post injection to simulate lab preparation (T2). Both models had statistically significantly higher yields than the unmodified cadaver in all joints at T1 and T2 (p<.05), with the exception of the HS T2 carpus. T2 aspiration had a statistically significantly lower yield than T1 for the HS carpus, HS elbow, and MO carpus. Overall, irrespective of fluid volume or type, percent yield was lower at T2 than at T1. No statistically significant difference was seen between HS and MO in most joints, with the exception of the MO T1 stifle and HS T2 elbow. Within the time frame assessed, both models were acceptable. However, the HS arthrocentesis model proved more appropriate for student trials because of the difficult aspirations with MO.
Analysis of S-box in Image Encryption Using Root Mean Square Error Method
NASA Astrophysics Data System (ADS)
Hussain, Iqtadar; Shah, Tariq; Gondal, Muhammad Asif; Mahmood, Hasan
2012-07-01
The use of substitution boxes (S-boxes) in encryption applications has proven to be an effective nonlinear component in creating confusion and randomness. The S-box is evolving and many variants appear in the literature, including the advanced encryption standard (AES) S-box, affine power affine (APA) S-box, Skipjack S-box, Gray S-box, Lui J S-box, residue prime number S-box, Xyi S-box, and S8 S-box. These S-boxes have algebraic and statistical properties which distinguish them from each other in terms of encryption strength. In some circumstances, the parameters from algebraic and statistical analysis yield results which do not provide clear evidence for distinguishing an S-box for application to a particular set of data. In image encryption applications, the use of S-boxes needs special care because the visual analysis and perception of a viewer can sometimes identify artifacts embedded in the image. In addition to the existing algebraic and statistical analyses already used for image encryption applications, we propose an application of the root mean square error technique, which further elaborates the results and enables the analyst to vividly distinguish between the performances of various S-boxes. While the use of root mean square error analysis in statistics has proven effective in determining the difference between original and processed data, its use in image encryption has shown promising results in estimating the strength of the encryption method. In this paper, we show the application of root mean square error analysis to S-box image encryption. The parameters from this analysis are used in determining the strength of S-boxes.
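A minimal sketch of the proposed measure, assuming the RMSE is computed pixel-wise between the plain image and its encrypted counterpart (both random placeholders here):

```python
# Root-mean-square error between a plain and an encrypted image: a larger
# RMSE indicates a stronger scrambling effect of the S-box.
import numpy as np

def rmse(a: np.ndarray, b: np.ndarray) -> float:
    diff = a.astype(np.float64) - b.astype(np.float64)
    return float(np.sqrt(np.mean(diff ** 2)))

rng = np.random.default_rng(1)
plain = rng.integers(0, 256, size=(256, 256), dtype=np.uint8)   # placeholder
cipher = rng.integers(0, 256, size=(256, 256), dtype=np.uint8)  # placeholder
print(f"RMSE = {rmse(plain, cipher):.2f}")  # ~104 for independent uniform images
```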
Analysis of D0 -> K anti-K X Decays
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jessop, Colin P.
2003-06-06
Using data taken with the CLEO II detector, the authors have studied the decays of the D⁰ to K⁺K⁻, K⁰K̄⁰, Kₛ⁰Kₛ⁰, Kₛ⁰Kₛ⁰π⁰, and K⁺K⁻π⁰. They present significantly improved results for B(D⁰ → K⁺K⁻) = (0.454 ± 0.028 ± 0.035)%, B(D⁰ → K⁰K̄⁰) = (0.054 ± 0.012 ± 0.010)% and B(D⁰ → Kₛ⁰Kₛ⁰Kₛ⁰) = (0.074 ± 0.010 ± 0.015)%, where the first errors are statistical and the second errors are the estimate of their systematic uncertainty. They also present a new upper limit B(D⁰ → Kₛ⁰Kₛ⁰π⁰) < 0.059% at the 90% confidence level and the first measurement of B(D⁰ → K⁺K⁻π⁰) = (0.14 ± 0.04)%.
Herle, Pradyumna; Shukla, Lipi; Morrison, Wayne A; Shayan, Ramin
2015-03-01
There is a general consensus among reconstructive surgeons that preoperative radiotherapy is associated with a higher risk of flap failure and complications in head and neck surgery. Opinion is also divided regarding the effects of radiation dose on free flap outcomes and the timing of preoperative radiation to minimize adverse outcomes. Our meta-analysis attempts to address these issues. A systematic review of the literature was conducted in accordance with the PRISMA protocol. Data were combined using the STATA 12 and Open Meta-Analyst software programmes. Twenty-four studies were included, comparing 2842 flaps performed in irradiated fields and 3491 flaps performed in non-irradiated fields. Meta-analysis yielded statistically significant risk ratios for flap failure (RR 1.48, P = 0.004), complications (RR 1.84, P < 0.001), reoperation (RR 2.06, P < 0.001) and fistula (RR 2.05, P < 0.001). Mean radiation dose demonstrated a trend towards increased risk of flap failure, but this was not statistically significant. On subgroup analysis, flaps exposed to >60 Gy of radiation had a higher, though not statistically significant, risk of flap failure (RR 1.61, P = 0.145). Preoperative radiation is associated with a statistically significant increased risk of flap complications, failure and fistula. Preoperative radiation in excess of 60 Gy represents a potential risk factor for increased flap loss and should be avoided where possible. © 2014 Royal Australasian College of Surgeons.
Analysis of the Einstein sample of early-type galaxies
NASA Technical Reports Server (NTRS)
Eskridge, Paul B.; Fabbiano, Giuseppina
1993-01-01
The EINSTEIN galaxy catalog contains X-ray data for 148 early-type (E and S0) galaxies. A detailed analysis of the global properties of this sample is presented. By comparing the X-ray properties with other tracers of the ISM, as well as with observables related to the stellar dynamics and populations of the sample, we expect to determine more clearly the physical relationships that govern the evolution of early-type galaxies. Previous studies with smaller samples have explored the relationships between X-ray luminosity (L_X) and luminosities in other bands. Using our larger sample and the statistical techniques of survival analysis, a number of these earlier analyses were repeated. For our full sample, a strong statistical correlation is found between L_X and L_B (the probability that the null hypothesis is upheld is P < 10⁻⁴) from a variety of rank correlation tests. Regressions with several algorithms yield consistent results.
NASA Astrophysics Data System (ADS)
Noik, V. James; Mohd Tuah, P.
2015-04-01
Plastic fragments and particles are an emerging environmental contaminant and pollutant gaining scientific attention in recent decades because of their potential threats to biota. This study aims to elucidate the presence, abundance and temporal change of plastic fragments and particles on two selected beaches, Santubong and Trombol in Kuching, at two sampling times. Morphological and polymer identification assessments of the recovered plastics were also conducted. Overall statistical comparison revealed that the abundance of plastic fragments/debris did not differ significantly between the two sampling stations (p>0.05). Likewise, statistical analysis of the temporal changes in abundance yielded no significant difference for most of the sampling sites at each station, except STB-S2. Morphological studies revealed that the physical features of the plastic fragments and debris were diverse in shape, size, color and surface fatigue. FTIR fingerprinting analysis showed that polypropylene and polyethylene were the dominant plastic polymer debris on both beaches.
Parasites as valuable stock markers for fisheries in Australasia, East Asia and the Pacific Islands.
Lester, R J G; Moore, B R
2015-01-01
Over 30 studies in Australasia, East Asia and the Pacific Islands region have collected and analysed parasite data to determine the ranges of individual fish, many leading to conclusions about stock delineation. Parasites used as biological tags have included both those known to have long residence times in the fish and those thought to be relatively transient. In many cases the parasitological conclusions have been supported by other methods, especially analysis of the chemical constituents of otoliths and, to a lesser extent, genetic data. In analysing parasite data, authors have applied multiple different statistical methodologies, including summary statistics and univariate and multivariate approaches. Recently, a growing number of researchers have found non-parametric methods, such as analysis of similarities and cluster analysis, to be valuable. Future studies into the residence times, life cycles and geographical distributions of parasites, together with more robust analytical methods, will yield a great deal of important information to clarify stock structures in the area.
Damage detection of engine bladed-disks using multivariate statistical analysis
NASA Astrophysics Data System (ADS)
Fang, X.; Tang, J.
2006-03-01
The timely detection of damage in aero-engine bladed disks is an extremely important and challenging research topic. Bladed disks have high modal density and, in particular, their vibration responses are subject to significant uncertainties due to manufacturing tolerance (blade-to-blade difference, or mistuning), operating-condition change and sensor noise. In this study, we present a new methodology for the on-line damage detection of engine bladed disks using their vibratory responses during spin-up or spin-down operations, which can be measured by the blade-tip-timing sensing technique. We apply a principal component analysis (PCA)-based approach for data compression, feature extraction and denoising. Non-model-based damage detection is achieved by analyzing the change between response features of the healthy structure and those of the damaged one. We facilitate this comparison by incorporating Hotelling's T² statistic, which yields a damage declaration with a given confidence level. The effectiveness of the method is demonstrated by case studies.
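A hedged sketch of the detection pipeline, PCA on baseline responses followed by Hotelling's T² on a new observation; all matrices are simulated placeholders rather than blade-tip-timing data:

```python
# PCA feature extraction plus a Hotelling's T^2 statistic for damage
# declaration, in the spirit of the method above.
import numpy as np

rng = np.random.default_rng(2)
baseline = rng.normal(size=(200, 12))     # healthy-structure response features
new = rng.normal(size=12) + 0.5           # one new observation to test

# PCA on the baseline data via SVD of the centered matrix
mean = baseline.mean(axis=0)
Xc = baseline - mean
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 3                                     # retained principal components
var = s[:k] ** 2 / (len(baseline) - 1)    # variance of each retained PC

# Hotelling's T^2 of the new observation in the PC subspace
t_new = (new - mean) @ Vt[:k].T
T2 = float(np.sum(t_new ** 2 / var))
print(f"T^2 = {T2:.2f}")                  # compare to an F-based control limit
```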
Production and Decay of {xi}{sub c}{sup 0} at BABAR
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aubert, B.; Barate, R.; Boutigny, D.
Using 116.1 fb⁻¹ of data collected by the BABAR detector, we present an analysis of Ξc⁰ production in B decays and from the cc̄ continuum, with the Ξc⁰ decaying into Ω⁻K⁺ and Ξ⁻π⁺ final states. We measure the ratio of branching fractions B(Ξc⁰ → Ω⁻K⁺)/B(Ξc⁰ → Ξ⁻π⁺) to be 0.294 ± 0.018 ± 0.016, where the first uncertainty is statistical and the second is systematic. The Ξc⁰ momentum spectrum is measured on and 40 MeV below the Υ(4S) resonance. From these spectra the branching-fraction product B(B → Ξc⁰X)×B(Ξc⁰ → Ξ⁻π⁺) is measured to be (2.11 ± 0.19 ± 0.25)×10⁻⁴, and the cross-section product σ(e⁺e⁻ → Ξc⁰X)×B(Ξc⁰ → Ξ⁻π⁺) from the continuum is measured to be (388 ± 39 ± 41) fb at a center-of-mass energy of 10.58 GeV.
Implicit Wiener series analysis of epileptic seizure recordings.
Barbero, Alvaro; Franz, Matthias; van Drongelen, Wim; Dorronsoro, José R; Schölkopf, Bernhard; Grosse-Wentrup, Moritz
2009-01-01
Implicit Wiener series are a powerful tool for building Volterra representations of time series with any degree of non-linearity. A natural question is then whether higher-order representations yield more useful models. In this work we study this question for channel relationships in ECoG recordings of epileptic seizures, considering whether quadratic representations yield more accurate classifiers than linear ones. To do so, we first show how to derive statistical information on the Volterra coefficient distribution and how to construct seizure classification patterns from that information. As our results illustrate, a quadratic model seems to provide no advantage over a linear one. Nevertheless, we also show that the interpretability of the implicit Wiener series provides insights into the inter-channel relationships of the recordings.
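The linear-versus-quadratic comparison can be caricatured with explicit polynomial features in place of the implicit Wiener machinery; this sketch does not implement Wiener series, and the data are synthetic:

```python
# Toy comparison of linear vs. quadratic representations for a binary
# classification task, standing in for the linear/quadratic Volterra
# comparison above.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures, StandardScaler

rng = np.random.default_rng(3)
X = rng.normal(size=(300, 4))            # surrogate channel features
y = (X[:, 0] * X[:, 1] > 0).astype(int)  # label with quadratic structure

for degree in (1, 2):
    clf = make_pipeline(PolynomialFeatures(degree), StandardScaler(),
                        LogisticRegression(max_iter=1000))
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"degree {degree}: CV accuracy = {acc:.2f}")
# Whether degree 2 helps depends on the data; for the seizure recordings
# studied above, the quadratic model offered no advantage.
```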
Mohammed, Faraz; Manohar, Vidya; Jose, Maji; Thapasum, Arishiya Fairozekhan; Mohamed, Shamaz; Shamaz, Bibi Halima; D'Souza, Neevan
2015-03-01
The purpose of this study was to estimate copper levels in the saliva of patients with oral submucous fibrosis (OSF) and in different areca nut products, and to examine their correlation with different histological grades of OSF. The study comprised 60 individuals: 30 OSF patients and 30 non-OSF individuals. Unstimulated whole saliva was collected, and copper analysis was performed using a colorimetric method. The commercial areca nut products used by the patients were acquired and subjected to copper analysis by atomic absorption spectrophotometry. Oral biopsies were performed on OSF patients for histopathological correlation. The mean salivary copper level was 27.023 μg/dl in OSF patients compared with 8.393 μg/dl in non-OSF individuals (P < 0.005). The mean copper content of the different areca nut products was 13.313 ppm (P < 0.005). Comparison of the copper content of the areca nut products with the salivary copper levels of OSF patients showed a negative correlation (P < 0.853). Comparison of salivary copper levels between histological grades of OSF yielded a statistically significant association between grades I and III (P < 0.005) and grades II and III (P < 0.019). Comparison of the copper content of areca nut products with histological grades of OSF yielded a weak negative statistical correlation (r = -0.116). Despite the high copper content of areca nut products, the observations yielded a negative correlation with histological grades of OSF. This further raises doubt about copper in areca nut as an etiological factor for this crippling disease. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Mehdi, Syed Riaz; Al Dahmash, Badr Abdullah
2013-07-01
Riyadh and the central province fall in a zone of moderate hemoglobinopathy prevalence in Saudi Arabia. However, it has been observed that physicians working in Saudi Arabia invariably refer all cases of anemia for hemoglobin electrophoresis (HE). The present work was carried out to study the yield of HE in Riyadh and the investigative practices of the physicians requesting it. The study was carried out in the hospitals of King Saud University from 2009 to 2011 in order to assess the yield of HE in referred cases of clinical anemia. A total of 1073 cases, divided into two groups of males and females, underwent complete blood count and red blood cell morphology. Cellulose acetate HE was performed and all positive results were reconfirmed by high-performance liquid chromatography (HPLC). The results were analyzed for the type of hemoglobinopathy. For statistical analysis, Statistical Package for the Social Sciences version 15 (SPSS Inc., Chicago, IL, USA) was used. A total of 405 male and 668 female blood samples were included in the present study. 116 (28.5%) males and 167 (25%) females showed an abnormal pattern on HE. The incidence of beta thalassemia trait was higher in females, while sickle cell trait was seen predominantly in males. Red cell indices were reduced considerably in thalassemias but were unaffected in sickle cell disorders, except those with a concurrent alpha trait. The total yield of HE was 26.6%, which was much less than expected. Physicians are advised to rule out iron deficiency and other common causes of anemia before investigating cases for hemoglobinopathies, which employs the time-consuming and expensive tests of HE and HPLC.
Jumbri, Khairulazhar; Al-Haniff Rozy, Mohd Fahruddin; Ashari, Siti Efliza; Mohamad, Rosfarizan; Basri, Mahiran; Fard Masoumi, Hamid Reza
2015-01-01
Kojic acid is widely used to inhibit the browning effect of tyrosinase in the cosmetic and food industries. In this work, synthesis of kojic monooleate ester (KMO) was carried out by lipase-catalysed esterification of kojic acid and oleic acid in a solvent-free system. Response surface methodology (RSM) based on a central composite rotatable design (CCRD) was used to optimise the most important reaction variables (enzyme amount, reaction temperature, substrate molar ratio, and reaction time) with immobilised lipase from Candida antarctica (Novozym 435) as the biocatalyst. The RSM data indicated that reaction temperature was less significant than the other factors for the production of the KMO ester. Using this statistical analysis, a quadratic model was developed to correlate the preparation variables to the response (reaction yield). The optimum conditions for the enzymatic synthesis of KMO were: an enzyme amount of 2.0 wt%, a reaction temperature of 83.69°C, a substrate molar ratio of 1:2.37 (mmole kojic acid:oleic acid) and a reaction time of 300.0 min. Under these conditions, the actual yield obtained was 42.09%, which compares well with the maximum predicted value of 44.46%. Under the optimal conditions, Novozym 435 could be reused for 5 cycles of KMO production with a percentage yield of at least 40%. The results demonstrated that statistical analysis using RSM can be used efficiently to optimise production of the KMO ester. Moreover, the optimum conditions obtained can be applied to scale up the process and minimise cost.
Stupák, Ivan; Pavloková, Sylvie; Vysloužil, Jakub; Dohnal, Jiří; Čulen, Martin
2017-11-23
Biorelevant dissolution instruments represent an important tool for pharmaceutical research and development. These instruments are designed to simulate the dissolution of drug formulations under conditions most closely mimicking the gastrointestinal tract. In this work, we focused on the optimization of dissolution compartments/vessels for an updated version of the biorelevant dissolution apparatus Golem v2. We designed eight compartments of uniform size but different inner geometry. The dissolution performance of the compartments was tested using immediate-release caffeine tablets and evaluated by standard statistical methods and principal component analysis. Based on two phases of dissolution testing (using 250 and 100 mL of dissolution medium), we selected the two compartment types yielding the highest measurement reproducibility. We also confirmed a statistically significant effect of agitation rate and dissolution volume on the extent of drug dissolved and on measurement reproducibility.
Statistical analysis of target acquisition sensor modeling experiments
NASA Astrophysics Data System (ADS)
Deaver, Dawne M.; Moyer, Steve
2015-05-01
The U.S. Army RDECOM CERDEC NVESD Modeling and Simulation Division is charged with the development and advancement of military target acquisition models to estimate expected soldier performance when using all types of imaging sensors. Two elements of sensor modeling are (1) laboratory-based psychophysical experiments used to measure task performance and calibrate the various models and (2) field-based experiments used to verify the model estimates for specific sensors. In both types of experiments, it is common practice to control or measure environmental, sensor, and target physical parameters in order to minimize uncertainty of the physics based modeling. Predicting the minimum number of test subjects required to calibrate or validate the model should be, but is not always, done during test planning. The objective of this analysis is to develop guidelines for test planners which recommend the number and types of test samples required to yield a statistically significant result.
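A minimal sketch of the test-planning calculation implied above, using a standard two-sample power analysis; the assumed effect size is a hypothetical planning input:

```python
# How many observers per group are needed to detect a given difference
# in task performance with 80% power at alpha = 0.05?
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n_per_group = analysis.solve_power(effect_size=0.5,  # assumed Cohen's d
                                   alpha=0.05, power=0.8)
print(f"{n_per_group:.0f} subjects per group")       # ~64 for these inputs
```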
Specification of ISS Plasma Environment Variability
NASA Technical Reports Server (NTRS)
Minow, Joseph I.; Neergaard, Linda F.; Bui, Them H.; Mikatarian, Ronald R.; Barsamian, H.; Koontz, Steven L.
2004-01-01
Quantifying spacecraft charging risks and associated hazards for the International Space Station (ISS) requires a plasma environment specification for the natural variability of ionospheric temperature (Te) and density (Ne). Empirical ionospheric specification and forecast models such as the International Reference Ionosphere (IRI) model typically provide only long-term (seasonal) mean Te and Ne values for the low Earth orbit environment. This paper describes a statistical analysis of historical low Earth orbit plasma measurements from the AE-C, AE-D, and DE-2 satellites, used to model the deviations of observed Ne and Te values from IRI-2001 estimates at each data point, providing a statistical basis for characterizing the plasma environment's departures from the IRI model output. Applying the deviation model to the IRI-2001 output yields a method for estimating extreme environments for ISS spacecraft charging analysis.
Performance analysis of different tuning rules for an isothermal CSTR using integrated EPC and SPC
NASA Astrophysics Data System (ADS)
Roslan, A. H.; Karim, S. F. Abd; Hamzah, N.
2018-03-01
This paper demonstrates the integration of Engineering Process Control (EPC) and Statistical Process Control (SPC) for the control of product concentration in an isothermal CSTR. The objectives of this study are to evaluate the performance of the Ziegler-Nichols (Z-N), Direct Synthesis (DS) and Internal Model Control (IMC) tuning methods and to determine the most effective method for this process. The simulation model was obtained from past literature and re-constructed in SIMULINK MATLAB to evaluate the process response. Additionally, process stability, capability and normality were analyzed using Process Capability Sixpack reports in Minitab. Based on the results, DS displays the best response, having the smallest rise time, settling time, overshoot, undershoot, integral time absolute error (ITAE) and integral square error (ISE). The statistical analysis likewise identifies DS as the best tuning method, as it exhibits the highest process stability and capability.
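The two error integrals used to score the tuning methods can be computed directly from a step-response error trace; the trace below is a placeholder, not the CSTR model:

```python
# ISE = integral of e(t)^2 dt and ITAE = integral of t*|e(t)| dt, the
# performance indices used above to compare Z-N, DS and IMC tuning.
import numpy as np

t = np.linspace(0.0, 20.0, 2001)
error = np.exp(-0.5 * t) * np.cos(2.0 * t)  # placeholder e(t) = setpoint - output

ise = np.trapz(error ** 2, t)
itae = np.trapz(t * np.abs(error), t)
print(f"ISE = {ise:.3f}, ITAE = {itae:.3f}")
# Smaller values of both integrals indicate a faster, better-damped loop,
# which is the basis on which DS outperformed the other methods here.
```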
New insights into old methods for identifying causal rare variants.
Wang, Haitian; Huang, Chien-Hsun; Lo, Shaw-Hwa; Zheng, Tian; Hu, Inchi
2011-11-29
The advance of high-throughput next-generation sequencing technology makes the analysis of rare variants possible. However, the investigation of rare variants in data sets of unrelated individuals faces the challenge of low power, and most methods circumvent the difficulty by using various collapsing procedures based on genes, pathways, or gene clusters. We suggest a new way to identify causal rare variants using the F-statistic and sliced inverse regression. The procedure was tested on the data set provided by Genetic Analysis Workshop 17 (GAW17). After preliminary data reduction, we ranked markers according to their F-statistic values. Top-ranked markers were then subjected to sliced inverse regression, and those with higher absolute coefficients in the most significant sliced inverse regression direction were selected. The procedure yields good false discovery rates for the GAW17 data and thus is a promising method for future studies of rare variants.
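A hedged sketch of the two-stage screen, F-statistic ranking followed by sliced inverse regression; genotypes and trait are simulated, not GAW17 data:

```python
# Stage 1: rank markers by a one-way-ANOVA F statistic against the trait.
# Stage 2: run sliced inverse regression (SIR) on the top-ranked markers.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
n, p = 400, 50
G = rng.integers(0, 3, size=(n, p)).astype(float)   # genotype dosages 0/1/2
y = 0.8 * G[:, 3] - 0.6 * G[:, 7] + rng.normal(size=n)

# F statistic of the trait grouped by genotype, per marker
F = np.array([stats.f_oneway(*(y[G[:, j] == g] for g in (0, 1, 2)))[0]
              for j in range(p)])
top = np.argsort(F)[::-1][:10]

# SIR: whiten X, slice y, eigen-decompose the weighted covariance of
# slice means; the leading eigenvector is the main SIR direction.
X = G[:, top]
Xc = X - X.mean(axis=0)
L = np.linalg.cholesky(np.cov(X.T))
Xw = Xc @ np.linalg.inv(L).T                  # whitened: unit covariance
slices = np.array_split(np.argsort(y), 8)
means = np.array([Xw[s].mean(axis=0) for s in slices])
weights = np.array([len(s) for s in slices]) / n
M = (means.T * weights) @ means
eigvals, eigvecs = np.linalg.eigh(M)
beta = eigvecs[:, -1]                         # leading SIR direction
print("top markers reordered by |SIR coefficient|:",
      top[np.argsort(np.abs(beta))[::-1]])
```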
Bayesian inference for joint modelling of longitudinal continuous, binary and ordinal events.
Li, Qiuju; Pan, Jianxin; Belcher, John
2016-12-01
In medical studies, repeated measurements of continuous, binary and ordinal outcomes are routinely collected from the same patient. Instead of modelling each outcome separately, in this study we propose to jointly model the trivariate longitudinal responses, so as to take account of the inherent association between the different outcomes and thus improve statistical inference. This work is motivated by a large cohort study in the North West of England, involving trivariate responses from each patient: Body Mass Index, Depression (Yes/No) ascertained with a cut-off score of not less than 8 on the Hospital Anxiety and Depression Scale, and Pain Interference generated from the Medical Outcomes Study 36-item short-form health survey, with values returned on an ordinal scale of 1-5. There are well-established methods for combined continuous and binary, or even continuous and ordinal responses, but little work has been done on the joint analysis of continuous, binary and ordinal responses. We propose conditional joint random-effects models, which take into account the inherent association between the continuous, binary and ordinal outcomes. Bayesian methods are used to make statistical inferences. Simulation studies show that, by jointly modelling the trivariate outcomes, the standard deviations of the parameter estimates are smaller and much more stable, leading to more efficient parameter estimates and reliable statistical inferences. In the real data analysis, the proposed joint analysis yields a much smaller deviance information criterion value than the separate analysis, and shows other good statistical properties too. © The Author(s) 2014.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Zhenhua; Rose, Adam Z.; Prager, Fynnwin
The state-of-the-art approach to economic consequence analysis (ECA) is computable general equilibrium (CGE) modeling. However, such models contain thousands of equations and cannot readily be incorporated into computerized systems used by policy analysts to yield estimates of the economic impacts of various types of transportation system failures due to natural hazards, human-related attacks or technological accidents. This paper presents a reduced-form approach that simplifies the analytical content of CGE models to make them more transparent and enhance their utilization potential. The reduced-form CGE analysis is conducted by first running simulations one hundred times, varying key parameters, such as the magnitude of the initial shock, duration, location, remediation, and resilience, according to a Latin Hypercube sampling procedure. Statistical analysis is then applied to the resulting "synthetic data" in the form of both ordinary least squares and quantile regression. The analysis yields linear equations that are incorporated into a computerized system and utilized along with Monte Carlo simulation methods for propagating uncertainties in economic consequences. Although our demonstration and discussion focus on aviation system disruptions caused by terrorist attacks, the approach can be applied to a broad range of threat scenarios.
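The reduced-form workflow can be sketched as follows, with a toy response function standing in for the full CGE model; parameter names and bounds are hypothetical:

```python
# Latin Hypercube sampling of key parameters, a stand-in model run, then
# OLS and quantile regression fitted to the synthetic data.
import numpy as np
import statsmodels.api as sm
from scipy.stats import qmc

rng = np.random.default_rng(5)
sampler = qmc.LatinHypercube(d=3, seed=5)
u = sampler.random(n=100)
params = qmc.scale(u, [0.0, 1.0, 0.0], [1.0, 30.0, 1.0])  # shock, duration, resilience

shock, duration, resilience = params.T
# Placeholder "model": economic loss as a noisy function of the inputs
loss = 5.0 * shock + 0.3 * duration - 4.0 * resilience + rng.normal(0, 0.5, 100)

X = sm.add_constant(params)
print(sm.OLS(loss, X).fit().params)            # reduced-form coefficients
print(sm.QuantReg(loss, X).fit(q=0.9).params)  # 90th-percentile response
```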
Mundike, Jhonnah; Collard, François-Xavier; Görgens, Johann F
2017-11-01
Pyrolysis of the invasive non-indigenous plants Lantana camara (LC) and Mimosa pigra (MP) was conducted at milligram scale to optimise the effects of temperature, heating rate and hold time on char yield and higher heating value (HHV). The impact of scaling up to gram scale was also studied, with chromatography used to correlate gas composition with HHV evolution. Statistically significant effects of temperature on char yield and HHV were obtained, while the effects of heating rate and hold time were insignificant. Milligram-scale maximised HHVs were 30.03 MJ kg⁻¹ (525°C) and 31.01 MJ kg⁻¹ (580°C) for LC and MP, respectively. The higher char yields and HHVs for MP were attributed to its higher lignin content. Scaling up promoted secondary char formation, thereby increasing HHVs to 30.82 MJ kg⁻¹ for LC and 31.61 MJ kg⁻¹ for MP. Incondensable gas analysis showed that temperature increases beyond the preferred values caused dehydrogenation that decreased HHV. Similarly, the CO evolution profile explained the differences in optimal HHV temperatures. Copyright © 2017 Elsevier Ltd. All rights reserved.
Effects of different mechanized soil fertilization methods on corn nutrient accumulation and yield
NASA Astrophysics Data System (ADS)
Shi, Qingwen; Bai, Chunming; Wang, Huixin; Wu, Di; Song, Qiaobo; Dong, Zengqi; Gao, Depeng; Dong, Qiping; Cheng, Xin; Zhang, Yahao; Mu, Jiahui; Chen, Qinghong; Liao, Wenqing; Qu, Tianru; Zhang, Chunling; Zhang, Xinyu; Liu, Yifei; Han, Xiaori
2017-05-01
Aim: Experiments on mechanized corn soil fertilization were conducted in the Faku demonstration zone. On this basis, we studied the effects of different mechanized soil fertilization measures on corn nutrient accumulation and yield traits in brown soil regions. We also evaluated and optimized the regulatory effects of mechanized soil fertilization for the purpose of increasing crop yield and improving production efficiency. Method: Based on a survey of soil background values in the demonstration zone, we collected plant samples during different corn growth periods for measurement and statistical analysis. Conclusions: Decomposed cow dung, under mechanical broadcasting, markedly increased the nitrogen and potassium accumulation of corn at the ripe stage. Crushed stalk returning combined with deep tillage markedly increased the phosphorus accumulation of corn plants. Compared with top application, crushed stalk returning combined with deep tillage markedly increased corn thousand kernel weight (TKW). Mechanized broadcasting of granular organic fertilizer and crushed stalk returning combined with deep tillage, when compared with surface application, boosted corn yield in the demonstration zone.
NASA Astrophysics Data System (ADS)
Adriane Ochiai, Mikael; Marrod Cruz, Salvador; Oporto, Louiellyn; de Leon, Rizalinda
2018-03-01
Direct solvothermal liquefaction was used to convert the lignocellulosic biomass of Pennisetum purpureum (Napier grass) using ethanol as the solvent. Liquefaction of Napier grass produced a dark, viscous bio-crude and exhibited promising yields (34.6377% to 48.6267%). The effects of temperature and residence time were statistically significant, with residence time having the greatest positive effect on yield. High bio-crude yields from Napier grass tended to occur as solvothermal temperature and residence time increased and as solids ratio decreased. However, elemental analysis showed that the bio-crude produced needs to undergo deoxygenation (O: 14.25-49.42%) before mixing with petroleum. For the higher heating value (HHV), the parameters examined in the study were statistically insignificant; however, temperature had the greatest positive effect on HHV. High-HHV bio-crude was produced at high temperatures, low solids ratios, and low residence times. The average HHVs obtained (20.0333 MJ/kg to 29.7744 MJ/kg) were all higher than the HHV of the Napier grass sample used in the study (12.9394 MJ/kg). It was also observed that, even though solids ratio had the least effect on both responses, the choice of solids ratio depends on its interaction effects with the other parameters, as these interactions contribute to the observed responses.
NASA Astrophysics Data System (ADS)
Riddin, T. L.; Gericke, M.; Whiteley, C. G.
2006-07-01
A Fusarium oxysporum fungal strain was screened and found to be successful for the intra- and extracellular production of platinum nanoparticles. Nanoparticle formation was observed visually, over time, by the colour of the extracellular solution and/or the fungal biomass turning from yellow to dark brown, and their concentration was determined from the amount of residual hexachloroplatinic acid measured against a standard curve at 456 nm. The extracellular nanoparticles were characterized by transmission electron microscopy. Nanoparticles of varying size (10-100 nm) and shape (hexagons, pentagons, circles, squares, rectangles) were produced both extracellularly and intracellularly by Fusarium oxysporum. The particles precipitate out of solution and bioaccumulate by nucleation either intracellularly, on the cell wall/membrane, or extracellularly in the surrounding medium. The importance of pH, temperature and hexachloroplatinic acid (H₂PtCl₆) concentration for nanoparticle formation was examined through the use of a statistical response surface methodology. Only the extracellular production of nanoparticles proved to be statistically significant, with a concentration yield of 4.85 mg l⁻¹ estimated by a first-order regression model. With a second-order polynomial regression, the predicted yield of nanoparticles increased to 5.66 mg l⁻¹, and backward-elimination regression gave a final model with a yield of 6.59 mg l⁻¹.
Structure and performance of different DRG classification systems for neonatal medicine.
Muldoon, J H
1999-01-01
There are a number of Diagnosis-Related Group (DRG) classification systems that have evolved over the past two decades, each with its own strengths and weaknesses. DRG systems are used for case-mix trending, utilization management and quality improvement, comparative reporting, prospective payment, and price negotiations. For any of these applications it is essential to know the accuracy with which the DRG system classifies patients, specifically for predicting resource use and also mortality. The objective of this study was to assess the adequacy of the three most commonly used DRG systems for neonatal patients: Medicare DRGs, All Patient Diagnosis-Related Groups (AP-DRGs), and All Patient Refined Diagnosis-Related Groups (APR-DRGs). A two-part methodology is used to assess adequacy. The first part is a descriptive analysis that examines the structural characteristics of each system. This provides a framework for understanding the inherent strengths and weaknesses of each system and for interpreting their statistical performance. The second part examines the statistical performance of each system on a large, nationally representative hospital database. The analysis identifies major differences in the structure and statistical performance of the three DRG systems for neonates. The Medicare DRGs are structurally the least developed and yield the poorest overall statistical performance (cost R² = 0.292; mortality R² = 0.083). The APR-DRGs are structurally the most developed and yield the best statistical performance (cost R² = 0.627; mortality R² = 0.416). The AP-DRGs are intermediate between Medicare DRGs and APR-DRGs, although closer to APR-DRGs (cost R² = 0.507; mortality R² = 0.304). An analysis of payment impacts and systematic effects identifies major systematic biases in the Medicare DRGs. At the patient level, there is substantial underpayment for surgical neonates, transferred-in neonates, neonates discharged to home health services, and neonates who die. In contrast, there is substantial overpayment for normal newborns. At the facility level, there is substantial underpayment for freestanding acute children's hospitals and major teaching general hospitals. There is overpayment for other urban general hospitals, but this pattern varies by hospital size. There is very substantial overpayment for other rural hospitals. The AP-DRGs remove the majority of the systematic effects, but significant biases remain. The APR-DRGs remove most of the systematic effects, but some biases remain.
Huncharek, M; Kupelnick, B
2001-01-01
The etiology of epithelial ovarian cancer is unknown. Prior work suggests that high dietary fat intake is associated with an increased risk of this tumor, although this association remains speculative. A meta-analysis was performed to evaluate this suspected relationship. Using previously described methods, a protocol was developed for a meta-analysis examining the association between high vs. low dietary fat intake and the risk of epithelial ovarian cancer. Literature search techniques, study inclusion criteria, and statistical procedures were prospectively defined. Data from observational studies were pooled using a general variance-based meta-analytic method employing confidence intervals (CI) previously described by Greenland. The outcome of interest was a summary relative risk (RRs) reflecting the risk of ovarian cancer associated with high vs. low dietary fat intake. Sensitivity analyses were performed when necessary to evaluate any observed statistical heterogeneity. The literature search yielded 8 observational studies enrolling 6,689 subjects. Data were stratified into three dietary fat intake categories: total fat, animal fat, and saturated fat. Initial tests for statistical homogeneity demonstrated that hospital-based studies accounted for observed heterogeneity possibly because of selection bias. Accounting for this, an RRs was calculated for high vs. low total fat intake, yielding a value of 1.24 (95% CI = 1.07-1.43), a statistically significant result. That is, high total fat intake is associated with a 24% increased risk of ovarian cancer development. The RRs for high saturated fat intake was 1.20 (95% CI = 1.04-1.39), suggesting a 20% increased risk of ovarian cancer among subjects with these dietary habits. High vs. low animal fat diet gave an RRs of 1.70 (95% CI = 1.43-2.03), consistent with a statistically significant 70% increased ovarian cancer risk. High dietary fat intake appears to represent a significant risk factor for the development of ovarian cancer. The magnitude of this risk associated with total fat and saturated fat is rather modest. Ovarian cancer risk associated with high animal fat intake appears significantly greater than that associated with the other types of fat intake studied, although this requires confirmation via larger analyses. Further work is needed to clarify factors that may modify the effects of dietary fat in vivo.
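The general variance-based pooling used here can be sketched by combining study relative risks on the log scale with inverse-variance weights recovered from their confidence intervals; the three studies below are hypothetical:

```python
# Fixed-effect pooling of relative risks from confidence intervals.
import numpy as np

rr = np.array([1.10, 1.35, 1.22])            # study relative risks
lo = np.array([0.85, 1.05, 0.98])            # lower 95% CI bounds
hi = np.array([1.42, 1.74, 1.52])            # upper 95% CI bounds

log_rr = np.log(rr)
se = (np.log(hi) - np.log(lo)) / (2 * 1.96)  # SE recovered from CI width
w = 1.0 / se ** 2                            # inverse-variance weights

pooled = np.sum(w * log_rr) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))
ci = np.exp(pooled + np.array([-1.96, 1.96]) * pooled_se)
print(f"summary RR = {np.exp(pooled):.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")
```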
Modelling drought-related yield losses in Iberia using remote sensing and multiscalar indices
NASA Astrophysics Data System (ADS)
Ribeiro, Andreia F. S.; Russo, Ana; Gouveia, Célia M.; Páscoa, Patrícia
2018-04-01
The response of two rainfed winter cereal yields (wheat and barley) to drought conditions in the Iberian Peninsula (IP) was investigated over a long period (1986-2012). Drought hazard was evaluated based on the multiscalar Standardized Precipitation Evapotranspiration Index (SPEI) and three remote sensing indices, namely the Vegetation Condition Index (VCI), the Temperature Condition Index (TCI), and the Vegetation Health Index (VHI). A correlation analysis between yield and the drought indicators was conducted, and multiple linear regression (MLR) and artificial neural network (ANN) models were established to estimate yield at the regional level. The correlation values suggested that yield is reduced by moisture depletion (low values of VCI) during early spring and by excessively high temperatures (low values of TCI) close to harvest time. Generally, all drought indicators displayed their greatest influence during the stages in which the crop is photosynthetically most active (spring and summer), rather than the earlier part of the plant's life cycle (autumn/winter). Our results suggested that the SPEI is more relevant in the southern sector of the IP, while the remote sensing indices perform rather well in estimating cereal yield in the northern sector. The strength of the statistical relationships found by the MLR and ANN methods is quite similar, with some improvements found for the ANN. A large number of true positives (hits) for the occurrence of yield losses was obtained, with hit rate (HR) values higher than 69%.
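A minimal sketch of the two estimation approaches compared, an MLR and a small ANN predicting yield from drought indicators; the data are simulated, not the Iberian records:

```python
# Compare a multiple linear regression and a small neural network for
# estimating yield anomalies from drought indicators (SPEI, VCI, TCI, VHI).
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(6)
X = rng.normal(size=(270, 4))                     # drought indicators
y = (1.5 * X[:, 0] - 0.8 * X[:, 2]
     + 0.3 * X[:, 0] * X[:, 3]                    # weak nonlinearity
     + rng.normal(0, 0.5, len(X)))                # synthetic yield anomaly

for name, model in [("MLR", LinearRegression()),
                    ("ANN", MLPRegressor(hidden_layer_sizes=(8,),
                                         max_iter=5000, random_state=0))]:
    r2 = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
    print(f"{name}: cross-validated R^2 = {r2:.2f}")
# The interaction term gives the ANN a chance to edge out the MLR,
# mirroring the modest improvements reported in the study.
```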
Teodoro, P E; Torres, F E; Santos, A D; Corrêa, A M; Nascimento, M; Barroso, L M A; Ceccon, G
2016-05-09
The aim of this study was to evaluate the suitability of various statistics as measures of the degree of experimental precision in trials with cowpea (Vigna unguiculata L. Walp.) genotypes. Cowpea genotype yields were evaluated in 29 trials conducted in Brazil between 2005 and 2012. The genotypes were evaluated in a randomized block design with four replications. Ten statistics estimated for each trial were compared using descriptive statistics, Pearson correlations, and path analysis. According to the class limits established, selective accuracy, the F-test value for genotype, heritability, and the coefficient of determination adequately estimated the degree of experimental precision. By these statistics, 86.21% of the trials had adequate experimental precision. Selective accuracy, the F-test value for genotype, heritability, and the coefficient of determination were directly related to each other, and were more suitable than the coefficient of variation and the least significant difference (by the Tukey test) for evaluating experimental precision in trials with cowpea genotypes.
The important but weakening maize yield benefit of grain filling prolongation in the US Midwest.
Zhu, Peng; Jin, Zhenong; Zhuang, Qianlai; Ciais, Philippe; Bernacchi, Carl; Wang, Xuhui; Makowski, David; Lobell, David
2018-06-14
A better understanding of recent crop yield trends is necessary for improving yield and maintaining food security. Several possible mechanisms have been investigated recently to explain the steady growth in maize yield over the US Corn Belt, but a substantial fraction of the increasing trend remains elusive. In this study, trends in the grain filling period (GFP) were identified and their relation to the maize yield increase was analyzed. Using satellite data from 2000 to 2015, an average lengthening of the GFP of 0.37 days per year was found over the region, which probably results from variety renewal. Statistical analysis suggests that a longer GFP accounted for roughly one-quarter (23%) of the yield increase trend by promoting kernel dry matter accumulation, yet had less yield benefit in hotter counties. Both official survey data and crop model simulations estimated a similar contribution of the GFP trend to yield. If the growing degree days that determine the GFP continue to increase at the current rate for the next 50 years, yield reduction will be lessened, with 25% and 18% longer GFPs under Representative Concentration Pathway 2.6 (RCP 2.6) and RCP 6.0, respectively. However, this level of progress is insufficient to offset yield losses in future climates, because drought and heat stress during the GFP will become more prevalent and severe. This study highlights the need to devise multiple effective adaptation strategies to withstand the upcoming challenges to food security.
76 FR 13018 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-09
... statistical surveys that yield quantitative results that can be generalized to the population of study. This... information will not be used for quantitative information collections that are designed to yield reliably... generic mechanisms that are designed to yield quantitative results. Total Burden Estimate for the...
It's all relative: ranking the diversity of aquatic bacterial communities.
Shaw, Allison K; Halpern, Aaron L; Beeson, Karen; Tran, Bao; Venter, J Craig; Martiny, Jennifer B H
2008-09-01
The study of microbial diversity patterns is hampered by the enormous diversity of microbial communities and the lack of resources to sample them exhaustively. For many questions about richness and evenness, however, one only needs to know the relative order of diversity among samples rather than total diversity. We used 16S libraries from the Global Ocean Survey to investigate the ability of 10 diversity statistics (including rarefaction, non-parametric, parametric, curve extrapolation and diversity indices) to assess the relative diversity of six aquatic bacterial communities. Overall, we found that the statistics yielded remarkably similar rankings of the samples for a given sequence similarity cut-off. This correspondence, despite the different underlying assumptions of the statistics, suggests that diversity statistics are a useful tool for ranking samples of microbial diversity. In addition, sequence similarity cut-off influenced the diversity ranking of the samples, demonstrating that diversity statistics can also be used to detect differences in phylogenetic structure among microbial communities. Finally, a subsampling analysis suggests that further sequencing from these particular clone libraries would not have substantially changed the richness rankings of the samples.
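The ranking comparison can be sketched with two simple diversity statistics computed on simulated abundance tables; agreement between rankings is checked with a Spearman correlation:

```python
# Rank simulated communities by Shannon diversity and observed richness,
# then check how well the two rankings agree.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
samples = [rng.multinomial(500, rng.dirichlet(np.ones(k)))
           for k in (20, 50, 100, 200, 400, 800)]     # 6 communities

def shannon(counts: np.ndarray) -> float:
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log(p)).sum())

H = np.array([shannon(c) for c in samples])
S = np.array([(c > 0).sum() for c in samples])        # observed richness
rho, _ = stats.spearmanr(H, S)
print("Shannon ranks: ", stats.rankdata(H))
print("Richness ranks:", stats.rankdata(S))
print(f"Spearman agreement = {rho:.2f}")
```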
NASA Astrophysics Data System (ADS)
Slaski, G.; Ohde, B.
2016-09-01
The article presents the results of a statistical dispersion analysis of the energy and power demand for tractive purposes of a battery electric vehicle. The authors compare data distributions for different values of average speed in two approaches, namely a short and a long period of observation. The short period of observation (generally around several hundred meters) derives from a previously proposed macroscopic energy consumption model based on average speed per road section. This approach yielded high values of the standard deviation and of the coefficient of variation (the ratio of the standard deviation to the mean), around 0.7-1.2. The long period of observation (several kilometers) is similar in length to the standardized speed cycles used in testing vehicle energy consumption and available range. The data were analysed to determine the impact of observation length on the variation of energy and power demand. The analysis was based on a simulation of electric power and energy consumption performed with speed-profile data recorded in the Poznan agglomeration.
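The short-versus-long observation effect can be sketched by computing the coefficient of variation of energy consumption over windows of different lengths; the per-section energy samples are simulated:

```python
# Coefficient of variation of energy consumption over short vs. long
# observation windows.
import numpy as np

rng = np.random.default_rng(8)
energy_per_100m = rng.gamma(shape=2.0, scale=10.0, size=5000)  # Wh per 100 m

def cv_over_windows(x: np.ndarray, window: int) -> float:
    sums = x[: len(x) // window * window].reshape(-1, window).sum(axis=1)
    return float(sums.std() / sums.mean())

for window in (3, 50):   # ~300 m vs. ~5 km of driving
    cv = cv_over_windows(energy_per_100m, window)
    print(f"window {window * 100} m: CV = {cv:.2f}")
# Aggregating over longer sections averages out short-term variability,
# so the CV drops as the observation window grows.
```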
Quantifying the indirect impacts of climate on agriculture: an inter-method comparison
NASA Astrophysics Data System (ADS)
Calvin, Kate; Fisher-Vanden, Karen
2017-11-01
Climate change and increases in CO2 concentration affect the productivity of land, with implications for land use, land cover, and agricultural production. Much of the literature on the effect of climate on agriculture has focused on linking projections of changes in climate to process-based or statistical crop models. However, the changes in productivity have broader economic implications that cannot be quantified in crop models alone. How important are these socio-economic feedbacks to a comprehensive assessment of the impacts of climate change on agriculture? In this paper, we attempt to measure the importance of these interaction effects through an inter-method comparison between process models, statistical models, and integrated assessment models (IAMs). We find the impacts on crop yields vary widely between these three modeling approaches. Yield impacts generated by the IAMs are 20%-40% higher than the yield impacts generated by process-based or statistical crop models, with indirect climate effects adjusting yields by between -12% and +15% (e.g. input substitution and crop switching). The remaining effects are due to technological change.
2009-02-01
data was linearly fit, and the slope yielded the Seebeck coefficient. A small resistor was epoxied to the top of the sample, and the opposite end...space probes in its radioisotope thermoelectric generators (RTGs) and is of current interest to automobile manufacturers to supply additional power... resistivity or conductivity, thermal conductivity, and Seebeck coefficient. These required measurements are demanding, especially the thermal
Research in Theoretical High Energy Nuclear Physics at the University of Arizona
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rafelski, Johann
In the past decade (2004-2015) we addressed the quest for the understanding of how quark confinement works, how it can be dissolved in a limited space-time domain, and what this means: i) for the paradigm of the laws of physics of the present day; and ii) for our understanding of cosmology. The focus of our in-laboratory matter-formation work has been centered on the understanding of the less frequently produced hadronic particles (e.g. strange antibaryons, charmed and beauty hadrons, massive resonances, charmonium, B_c). We have developed a public analysis tool, SHARE (Statistical HAdronization with REsonances), which allows a precise model description of experimental particle yield and fluctuation data. We have developed a charm recombination model to allow for an off-equilibrium rate of charmonium production. We have developed methods and techniques which allowed us to study hadron resonance yield evolution by kinetic theory. We explored entropy, strangeness and charm as signatures of QGP, addressing the wide range of reaction energies spanning AGS, SPS, RHIC and LHC. In analysis of experimental data, we obtained both statistical parameters and physical properties of the hadron source. The following pages present listings of our primary writing on these questions. The abstracts are included in lieu of more detailed discussion of our research accomplishments in each of the publications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, J.
We present the preliminary measurement of CP-violating asymmetries in B⁰ → (ρπ)⁰ → π⁺π⁻π⁰ decays using a time-dependent Dalitz plot analysis. The results are obtained from a data sample of 213 million Υ(4S) → BB̄ decays, collected by the BABAR detector at the PEP-II asymmetric-energy B Factory at SLAC. This analysis extends the narrow-ρ quasi-two-body approximation used in the previous analysis by taking into account the interference between the ρ resonances of the three charges. We measure 16 coefficients of the bilinear form-factor terms occurring in the time-dependent decay rate of the B⁰ meson with the use of a maximum-likelihood fit. We derive the physically relevant quantities from these coefficients. We measure the direct CP-violation parameters A_ρπ = -0.088 ± 0.049 ± 0.013 and C = 0.34 ± 0.11 ± 0.05, where the first errors are statistical and the second systematic. For the mixing-induced CP-violation parameter we find S = -0.10 ± 0.14 ± 0.04, and for the dilution and strong phase shift parameters, respectively, we obtain ΔC = 0.15 ± 0.11 ± 0.03 and ΔS = 0.22 ± 0.15 ± 0.03. For the angle α of the Unitarity Triangle we measure (113 +27/-17 ± 6)°, while only a weak constraint is achieved at the significance level of more than two standard deviations. Finally, for the relative strong phase δ_{+-} between the B⁰ → ρ⁻π⁺ and B⁰ → ρ⁺π⁻ transitions we find (-67 +28/-31 ± 7)°, with a similarly weak constraint at two standard deviations and beyond.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-04
... not statistical surveys that yield quantitative results that can be generalized to the population of... information will not be used for quantitative information collections that are designed to yield reliably... generic mechanisms that are designed to yield quantitative results. No comments were received in response...
77 FR 75498 - Request for Comments on a New Collection
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-20
... statistical surveys that yield quantitative results that can be generalized to the population of study. DATES... surveys that yield quantitative results that can be generalized to the population of study. This feedback... qualitative information will not be used for quantitative information collections that are designed to yield...
What's holding us back? Raising the alfalfa yield bar
USDA-ARS?s Scientific Manuscript database
Measuring yield of commodity crops is easy – weight and moisture content are determined on delivery. Consequently, reports of production or yield for grain crops can be made reliably to the agencies that track crop production, such as the USDA-National Agricultural Statistics Service (NASS). The s...
The alfalfa yield gap: A review of the evidence
USDA-ARS?s Scientific Manuscript database
Knowledge of feasibly attainable crop yields is needed for many purposes, from field-scale management to national policy decisions. For alfalfa (Medicago sativa L.), the most widely used estimates of yield in the US are whole-farm reports from the National Agriculture Statistics Service, which are b...
Streamwise evolution of statistical events and the triple correlation in a model wind turbine array
NASA Astrophysics Data System (ADS)
Viestenz, Kyle; Cal, Raúl Bayoán
2013-11-01
Hot-wire anemometry data, obtained from a wind tunnel experiment containing a 3 × 3 wind turbine array, are used to conditionally average the Reynolds stresses. Nine profiles at the centerline behind the array are analyzed to characterize the turbulent velocity statistics of the wake flow. Quadrant analysis yields statistical events occurring in the wake of the wind farm, where quadrants 2 and 4 produce ejections and sweeps, respectively. A balance between these quadrants is expressed via the ΔSo parameter, which attains a maximum value at the bottom tip and changes sign near the top tip of the rotor. These are then associated with the triple correlation term present in the turbulent kinetic energy equation of the fluctuations. The development of these various quantities is assessed in light of wake remediation and energy transport, and possesses significance for closure models. National Science Foundation: ECCS-1032647.
The log-periodic-AR(1)-GARCH(1,1) model for financial crashes
NASA Astrophysics Data System (ADS)
Gazola, L.; Fernandes, C.; Pizzinga, A.; Riera, R.
2008-02-01
This paper intends to meet recent claims for the attainment of more rigorous statistical methodology within the econophysics literature. To this end, we consider an econometric approach to investigate the outcomes of the log-periodic model of price movements, which has been largely used to forecast financial crashes. In order to accomplish reliable statistical inference for unknown parameters, we incorporate an autoregressive dynamic and a conditional heteroskedasticity structure in the error term of the original model, yielding the log-periodic-AR(1)-GARCH(1,1) model. Both the original and the extended models are fitted to financial indices of the U.S. market, namely the S&P500 and NASDAQ. Our analysis reveals two main points: (i) the log-periodic-AR(1)-GARCH(1,1) model has residuals with better statistical properties and (ii) the estimation of the parameter concerning the time of the financial crash has been improved.
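A rough sketch of the log-periodic price equation this literature fits, ln p(t) = A + B (tc - t)^m [1 + C cos(w ln(tc - t) + phi)], estimated here on synthetic data with the crash time tc held fixed for simplicity; the paper's AR(1)-GARCH(1,1) error structure is not reproduced:

```python
# Fit the log-periodic model to synthetic log-prices; tc is assumed known.
import numpy as np
from scipy.optimize import curve_fit

TC = 1000.0  # assumed crash time (trading days), fixed rather than fitted

def log_periodic(t, a, b, m, c, w, phi):
    dt = TC - t
    return a + b * dt**m * (1.0 + c * np.cos(w * np.log(dt) + phi))

t = np.arange(0.0, 990.0)
rng = np.random.default_rng(0)
true = (7.0, -0.5, 0.4, 0.1, 8.0, 1.0)
y = log_periodic(t, *true) + rng.normal(0, 0.02, t.size)  # synthetic data

p0 = (7.0, -0.4, 0.5, 0.05, 7.0, 0.5)  # starting guess near the truth
params, _ = curve_fit(log_periodic, t, y, p0=p0, maxfev=20000)
print("fitted (A, B, m, C, w, phi):", np.round(params, 3))
```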
Modeling of the reactant conversion rate in a turbulent shear flow
NASA Technical Reports Server (NTRS)
Frankel, S. H.; Madnia, C. K.; Givi, P.
1992-01-01
Results are presented of direct numerical simulations (DNS) of spatially developing shear flows under the influence of infinitely fast chemical reactions of the type A + B → Products. The simulation results are used to construct the compositional structure of the scalar field in a statistical manner. The results of this statistical analysis indicate that the use of a Beta density for the probability density function (PDF) of an appropriate Shvab-Zeldovich mixture fraction provides a very good estimate of the limiting bounds of the reactant conversion rate within the shear layer. This provides a strong justification for the implementation of this density in practical modeling of non-homogeneous turbulent reacting flows. However, the validity of the model cannot be generalized for predictions of higher order statistical quantities. A closed form analytical expression is presented for predicting the maximum rate of reactant conversion in non-homogeneous reacting turbulence.
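The presumed-Beta-PDF idea referenced above amounts to matching the first two moments of the mixture fraction; a minimal sketch with invented moment values:

```python
# Moment-match a Beta PDF to the mean and variance of a conserved scalar Z.
# For Beta(a, b): mean m = a/(a+b), var v = m(1-m)/(a+b+1), which inverts to
# a = m*(m*(1-m)/v - 1) and b = a*(1-m)/m.  Values below are illustrative.
from scipy.stats import beta

z_mean, z_var = 0.4, 0.03  # hypothetical moments of the mixture fraction
a = z_mean * (z_mean * (1 - z_mean) / z_var - 1.0)
b = a * (1 - z_mean) / z_mean
print(f"Beta parameters: a = {a:.2f}, b = {b:.2f}")
print("P(Z <= Z_st = 0.3) =", beta.cdf(0.3, a, b))  # notional stoichiometric Z
```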
Mathematical and statistical models for determining the crop load in grapevine
NASA Astrophysics Data System (ADS)
Alina, Dobrei; Alin, Dobrei; Eleonora, Nistor; Teodor, Cristea; Marius, Boldea; Florin, Sala
2016-06-01
Ensuring a balance between vine crop load and vegetative growth is a dynamic process, so it is necessary to develop models describing this relationship. This study analyzed the interrelationship between crop load and specific growth parameters (viable buds - VB, dead (frost-injured) buds - DB, total shoot growth - TSG, one-year-old wood - MSG) in two grapevine varieties: the Muscat Ottonel cultivar for wine and the Victoria cultivar for fresh grapes. In both varieties, the interrelationships between bud number and the vegetative growth parameters were described by statistically significant polynomial functions. Using regression analysis, it was possible to develop predictive models for one-year-old wood (MSG), an important parameter for the yield and quality of wine grape production, with statistically significant results (R² = 0.884, p < 0.001, F = 45.957 for the Muscat Ottonel cultivar and R² = 0.893, p = 0.001, F = 49.886 for the Victoria cultivar).
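A schematic version of such a fit (with invented data points, not the study's measurements): a second-degree polynomial relating viable bud number to one-year-old wood, with R² computed explicitly:

```python
# Hypothetical data: viable buds (VB) per vine vs. one-year-old wood (MSG).
import numpy as np

buds = np.array([18, 22, 26, 30, 34, 38, 42])
msg = np.array([0.9, 1.2, 1.6, 1.9, 2.0, 1.9, 1.7])  # kg/vine, invented

coeffs = np.polyfit(buds, msg, deg=2)        # second-degree polynomial
pred = np.polyval(coeffs, buds)
ss_res = np.sum((msg - pred) ** 2)
ss_tot = np.sum((msg - msg.mean()) ** 2)
print("coefficients:", np.round(coeffs, 4),
      "R^2 =", round(1 - ss_res / ss_tot, 3))
```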
Guillaume, Bryan; Wang, Changqing; Poh, Joann; Shen, Mo Jun; Ong, Mei Lyn; Tan, Pei Fang; Karnani, Neerja; Meaney, Michael; Qiu, Anqi
2018-06-01
Statistical inference on neuroimaging data is often conducted using a mass-univariate model, equivalent to fitting a linear model at every voxel with a known set of covariates. Due to the large number of linear models, it is challenging to check if the selection of covariates is appropriate and to modify this selection adequately. The use of standard diagnostics, such as residual plotting, is clearly not practical for neuroimaging data. However, the selection of covariates is crucial for linear regression to ensure valid statistical inference. In particular, the mean model of regression needs to be reasonably well specified. Unfortunately, this issue is often overlooked in the field of neuroimaging. This study aims to adopt the existing Confounder Adjusted Testing and Estimation (CATE) approach and to extend it for use with neuroimaging data. We propose a modification of CATE that can yield valid statistical inferences using Principal Component Analysis (PCA) estimators instead of Maximum Likelihood (ML) estimators. We then propose a non-parametric hypothesis testing procedure that can improve upon parametric testing. Monte Carlo simulations show that the modification of CATE allows for more accurate modelling of neuroimaging data and can in turn yield a better control of False Positive Rate (FPR) and Family-Wise Error Rate (FWER). We demonstrate its application to an Epigenome-Wide Association Study (EWAS) on neonatal brain imaging and umbilical cord DNA methylation data obtained as part of a longitudinal cohort study. Software for this CATE study is freely available at http://www.bioeng.nus.edu.sg/cfa/Imaging_Genetics2.html. Copyright © 2018 The Author(s). Published by Elsevier Inc. All rights reserved.
Vavrek, Darcy A; Sharma, Rajiv; Haas, Mitchell
2014-06-01
The purpose of this analysis is to report the incremental costs and benefits of different doses of spinal manipulative therapy (SMT) in patients with chronic low back pain (LBP). We randomized 400 patients with chronic LBP to receive a dose of 0, 6, 12, or 18 sessions of SMT. Participants were scheduled for 18 visits for 6 weeks and received SMT or light massage control from a doctor of chiropractic. Societal costs in the year after study enrollment were estimated using patient reports of health care use and lost productivity. The main health outcomes were the number of pain-free days and disability-free days. Multiple regression was performed on outcomes and log-transformed cost data. Lost productivity accounts for most societal costs of chronic LBP. Cost of treatment and lost productivity ranged from $3398 for 12 SMT sessions to $3815 for 0 SMT sessions with no statistically significant differences between groups. Baseline patient characteristics related to increase in costs were greater age (P = .03), greater disability (P = .01), lower quality-adjusted life year scores (P = .01), and higher costs in the period preceding enrollment (P < .01). Pain-free and disability-free days were greater for all SMT doses compared with control, but only SMT 12 yielded a statistically significant benefit of 22.9 pain-free days (P = .03) and 19.8 disability-free days (P = .04). No statistically significant group differences in quality-adjusted life years were noted. A dose of 12 SMT sessions yielded a modest benefit in pain-free and disability-free days. Care of chronic LBP with SMT did not increase the costs of treatment plus lost productivity. Copyright © 2014 National University of Health Sciences. Published by Mosby, Inc. All rights reserved.
Statistical significance of trace evidence matches using independent physicochemical measurements
NASA Astrophysics Data System (ADS)
Almirall, Jose R.; Cole, Michael; Furton, Kenneth G.; Gettinby, George
1997-02-01
A statistical approach to the significance of glass evidence is proposed using independent physicochemical measurements and chemometrics. Traditional interpretation of the significance of trace evidence matches or exclusions relies on qualitative descriptors such as 'indistinguishable from,' 'consistent with,' 'similar to' etc. By performing physical and chemical measurements which are independent of one another, the significance of object exclusions or matches can be evaluated statistically. One of the problems with this approach is that the human brain is excellent at recognizing and classifying patterns and shapes but performs less well when an object is represented by a numerical list of attributes. Chemometrics can be employed to group similar objects using clustering algorithms and provide statistical significance in a quantitative manner. This approach is enhanced when population databases exist or can be created and the data in question can be evaluated given these databases. Since the selection of the variables used and their pre-processing can greatly influence the outcome, several different methods could be employed in order to obtain a more complete picture of the information contained in the data. Presently, we report on the analysis of glass samples using refractive index measurements and the quantitative analysis of the concentrations of the metals Mg, Al, Ca, Fe, Mn, Ba, Sr, Ti and Zr. The extension of this general approach to fiber and paint comparisons also is discussed. This statistical approach should not replace the current interpretative approaches to trace evidence matches or exclusions but rather yields an additional quantitative measure. The lack of sufficient general population databases containing the needed physicochemical measurements and the potential for confusion arising from statistical analysis currently hamper this approach, and ways of overcoming these obstacles are presented.
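As a hedged sketch of the chemometric grouping step described above (fabricated fragment data and generic Ward clustering, not the authors' exact algorithms): standardize the refractive index and elemental concentrations, then cluster the fragments.

```python
# Cluster glass fragments on standardized physicochemical measurements.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.stats import zscore

# columns: RI, Mg, Al, Ca, Fe (hypothetical measurements for 6 fragments)
X = np.array([
    [1.5183, 3.6, 0.9, 8.6, 0.10],
    [1.5184, 3.5, 0.9, 8.7, 0.11],
    [1.5221, 0.1, 1.8, 9.9, 0.35],
    [1.5222, 0.1, 1.9, 9.8, 0.34],
    [1.5183, 3.7, 1.0, 8.5, 0.10],
    [1.5220, 0.2, 1.8, 9.9, 0.36],
])
Z = zscore(X, axis=0)  # put all variables on a common scale
labels = fcluster(linkage(Z, method="ward"), t=2, criterion="maxclust")
print("cluster assignment per fragment:", labels)
```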
Bansal, Ravi; Peterson, Bradley S
2018-06-01
Identifying regional effects of interest in MRI datasets usually entails testing a priori hypotheses across many thousands of brain voxels, requiring control of false positives across these multiple hypothesis tests. Recent studies have suggested that parametric statistical methods may have incorrectly modeled functional MRI data, thereby leading to higher false positive rates than their nominal rates. Nonparametric methods for statistical inference when conducting multiple statistical tests, in contrast, are thought to produce false positives at the nominal rate, which has thus led to the suggestion that previously reported studies should reanalyze their fMRI data using nonparametric tools. To understand better why parametric methods may yield excessive false positives, we assessed their performance when applied both to simulated datasets of 1D, 2D, and 3D Gaussian Random Fields (GRFs) and to 710 real-world, resting-state fMRI datasets. We showed that both the simulated 2D and 3D GRFs and the real-world data contain a small percentage (<6%) of very large clusters (on average 60 times larger than the average cluster size), which were not present in 1D GRFs. These unexpectedly large clusters were deemed statistically significant using parametric methods, leading to empirical familywise error rates (FWERs) as high as 65%: the high empirical FWERs were not a consequence of parametric methods failing to model spatial smoothness accurately, but rather of these very large clusters that are inherently present in smooth, high-dimensional random fields. In fact, when discounting these very large clusters, the empirical FWER for parametric methods was 3.24%. Furthermore, even an empirical FWER of 65% would yield on average less than one of those very large clusters in each brain-wide analysis. Nonparametric methods, in contrast, estimated distributions from those large clusters, and therefore, by construction, rejected the large clusters as false positives at the nominal FWERs. Those rejected clusters were outlying values in the distribution of cluster size but cannot be distinguished from true positive findings without further analyses, including assessing whether fMRI signal in those regions correlates with other clinical, behavioral, or cognitive measures. Rejecting the large clusters, however, significantly reduced the statistical power of nonparametric methods in detecting true findings compared with parametric methods, which would have detected most true findings that are essential for making valid biological inferences in MRI data. Parametric analyses, in contrast, detected most true findings while generating relatively few false positives: on average, less than one of those very large clusters would be deemed a true finding in each brain-wide analysis. We therefore recommend the continued use of parametric methods that model nonstationary smoothness for cluster-level, familywise control of false positives, particularly when using a Cluster Defining Threshold of 2.5 or higher, and subsequently assessing rigorously the biological plausibility of the findings, even for large clusters. Finally, because nonparametric methods yielded a large reduction in statistical power to detect true positive findings, we conclude that the modest reduction in false positive findings that nonparametric analyses afford does not warrant a re-analysis of previously published fMRI studies using nonparametric techniques. Copyright © 2018 Elsevier Inc. All rights reserved.
Capillary fluctuations of surface steps: An atomistic simulation study for the model Cu(111) system
NASA Astrophysics Data System (ADS)
Freitas, Rodrigo; Frolov, Timofey; Asta, Mark
2017-10-01
Molecular dynamics (MD) simulations are employed to investigate the capillary fluctuations of steps on the surface of a model metal system. The fluctuation spectrum, characterized by the wave number (k) dependence of the mean squared capillary-wave amplitudes and associated relaxation times, is calculated for ⟨110⟩ and ⟨112⟩ steps on the {111} surface of elemental copper near the melting temperature of the classical potential model considered. Step stiffnesses are derived from the MD results, yielding values from the largest system sizes of (37 ± 1) meV/Å for the different line orientations, implying that the stiffness is isotropic within the statistical precision of the calculations. The fluctuation lifetimes are found to vary by approximately four orders of magnitude over the range of wave numbers investigated, displaying a k dependence consistent with kinetics governed by step-edge mediated diffusion. The values for step stiffness derived from these simulations are compared to step free energies for the same system and temperature obtained in a recent MD-based thermodynamic-integration (TI) study [Freitas, Frolov, and Asta, Phys. Rev. B 95, 155444 (2017), 10.1103/PhysRevB.95.155444]. Results from the capillary-fluctuation analysis and TI calculations yield statistically significant differences that are discussed within the framework of statistical-mechanical theories for configurational contributions to step free energies.
A fast and efficient method for device level layout analysis
NASA Astrophysics Data System (ADS)
Dong, YaoQi; Zou, Elaine; Pang, Jenny; Huang, Lucas; Yang, Legender; Zhang, Chunlei; Du, Chunshan; Hu, Xinyi; Wan, Qijian
2017-03-01
There is an increasing demand for device-level layout analysis, especially as technology advances. The analysis studies standard cells by extracting and classifying critical-dimension parameters. Several parameters are extracted, such as channel width, channel length, gate-to-active distance, and active-to-adjacent-active distance; for 14 nm technology, additional parameters are of interest. On the one hand, these parameters are very important for studying standard cell structures and for SPICE model development, with the goal of improving standard cell manufacturing yield and optimizing circuit performance; on the other hand, a full-chip device statistics analysis can provide useful information for diagnosing yield issues. Device analysis is thus essential for standard cell customization and enhancement and for manufacturability failure diagnosis. A traditional parasitic parameter extraction tool like Calibre xRC is powerful, but it is not sufficient for this device-level layout analysis application, as engineers would like to review, classify and filter the data more easily. This paper presents a fast and efficient method based on Calibre equation-based DRC (eqDRC). Equation-based DRC extends traditional DRC technology to provide a flexible programmable modeling engine, which allows the end user to define grouped multi-dimensional feature measurements using flexible mathematical expressions. This paper demonstrates how such an engine and its programming language can be used to implement critical device parameter extraction. The device parameters are extracted and stored in a DFM database which can be processed by Calibre YieldServer. YieldServer is data-processing software that lets engineers query, manipulate, modify, and create data in a DFM database. These parameters, known as properties in the eqDRC language, can be annotated back to the layout for easy review. Calibre DesignRev can create an HTML-formatted report of the results displayed in Calibre RVE, which makes it easy to share results among groups. This method has been proven in use by the SMIC PDE and SPICE teams.
A spin column-free approach to sodium hydroxide-based glycan permethylation.
Hu, Yueming; Borges, Chad R
2017-07-24
Glycan permethylation was introduced as a tool to facilitate the study of glycans in 1903. Since that time, permethylation procedures have been continually modified to improve permethylation efficiency and qualitative applicability. Typically, however, either laborious preparation steps or cumbersome and uneconomical spin columns have been needed to obtain decent permethylation yields on small glycan samples. Here we describe a spin column-free (SCF) glycan permethylation procedure that is applicable to both O- and N-linked glycans and can be employed upstream to intact glycan analysis by MALDI-MS, ESI-MS, or glycan linkage analysis by GC-MS. The SCF procedure involves neutralization of NaOH beads by acidified phosphate buffer, which eliminates the risk of glycan oxidative degradation and avoids the use of spin columns. Optimization of the new permethylation procedure provided high permethylation efficiency for both hexose (>98%) and HexNAc (>99%) residues-yields which were comparable to (or better than) those of some widely-used spin column-based procedures. A light vs. heavy labelling approach was employed to compare intact glycan yields from a popular spin-column based approach to the SCF approach. Recovery of intact N-glycans was significantly better with the SCF procedure (p < 0.05), but overall yield of O-glycans was similar or slightly diminished (p < 0.05 for tetrasaccharides or smaller). When the SCF procedure was employed upstream to hydrolysis, reduction and acetylation for glycan linkage analysis of pooled glycans from unfractionated blood plasma, analytical reproducibility was on par with that from previous spin column-based "glycan node" analysis results. When applied to blood plasma samples from stage III-IV breast cancer patients (n = 20) and age-matched controls (n = 20), the SCF procedure facilitated identification of three glycan nodes with significantly different distributions between the cases and controls (ROC c-statistics > 0.75; p < 0.01). In summary, the SCF permethylation procedure expedites and economizes both intact glycan analysis and linkage analysis of glycans from whole biospecimens.
A spin column-free approach to sodium hydroxide-based glycan permethylation†
Hu, Yueming; Borges, Chad R.
2018-01-01
Glycan permethylation was introduced as a tool to facilitate the study of glycans in 1903. Since that time, permethylation procedures have been continually modified to improve permethylation efficiency and qualitative applicability. Typically, however, either laborious preparation steps or cumbersome and uneconomical spin columns have been needed to obtain decent permethylation yields on small glycan samples. Here we describe a spin column-free (SCF) glycan permethylation procedure that is applicable to both O- and N-linked glycans and can be employed upstream to intact glycan analysis by MALDI-MS, ESI-MS, or glycan linkage analysis by GC-MS. The SCF procedure involves neutralization of NaOH beads by acidified phosphate buffer, which eliminates the risk of glycan oxidative degradation and avoids the use of spin columns. Optimization of the new permethylation procedure provided high permethylation efficiency for both hexose (>98%) and HexNAc (>99%) residues—yields which were comparable to (or better than) those of some widely-used spin column-based procedures. A light vs. heavy labelling approach was employed to compare intact glycan yields from a popular spin-column based approach to the SCF approach. Recovery of intact N-glycans was significantly better with the SCF procedure (p < 0.05), but overall yield of O-glycans was similar or slightly diminished (p < 0.05 for tetrasaccharides or smaller). When the SCF procedure was employed upstream to hydrolysis, reduction and acetylation for glycan linkage analysis of pooled glycans from unfractionated blood plasma, analytical reproducibility was on par with that from previous spin column-based “glycan node” analysis results. When applied to blood plasma samples from stage III–IV breast cancer patients (n = 20) and age-matched controls (n = 20), the SCF procedure facilitated identification of three glycan nodes with significantly different distributions between the cases and controls (ROC c-statistics > 0.75; p < 0.01). In summary, the SCF permethylation procedure expedites and economizes both intact glycan analysis and linkage analysis of glycans from whole biospecimens. PMID:28635997
Generalization of Entropy Based Divergence Measures for Symbolic Sequence Analysis
Ré, Miguel A.; Azad, Rajeev K.
2014-01-01
Entropy based measures have been frequently used in symbolic sequence analysis. A symmetrized and smoothed form of Kullback-Leibler divergence or relative entropy, the Jensen-Shannon divergence (JSD), is of particular interest because of its sharing properties with families of other divergence measures and its interpretability in different domains including statistical physics, information theory and mathematical statistics. The uniqueness and versatility of this measure arise because of a number of attributes including generalization to any number of probability distributions and association of weights to the distributions. Furthermore, its entropic formulation allows its generalization in different statistical frameworks, such as, non-extensive Tsallis statistics and higher order Markovian statistics. We revisit these generalizations and propose a new generalization of JSD in the integrated Tsallis and Markovian statistical framework. We show that this generalization can be interpreted in terms of mutual information. We also investigate the performance of different JSD generalizations in deconstructing chimeric DNA sequences assembled from bacterial genomes including that of E. coli, S. enterica typhi, Y. pestis and H. influenzae. Our results show that the JSD generalizations bring in more pronounced improvements when the sequences being compared are from phylogenetically proximal organisms, which are often difficult to distinguish because of their compositional similarity. While small but noticeable improvements were observed with the Tsallis statistical JSD generalization, relatively large improvements were observed with the Markovian generalization. In contrast, the proposed Tsallis-Markovian generalization yielded more pronounced improvements relative to the Tsallis and Markovian generalizations, specifically when the sequences being compared arose from phylogenetically proximal organisms. PMID:24728338
Generalization of entropy based divergence measures for symbolic sequence analysis.
Ré, Miguel A; Azad, Rajeev K
2014-01-01
Entropy based measures have been frequently used in symbolic sequence analysis. A symmetrized and smoothed form of Kullback-Leibler divergence or relative entropy, the Jensen-Shannon divergence (JSD), is of particular interest because of its sharing properties with families of other divergence measures and its interpretability in different domains including statistical physics, information theory and mathematical statistics. The uniqueness and versatility of this measure arise because of a number of attributes including generalization to any number of probability distributions and association of weights to the distributions. Furthermore, its entropic formulation allows its generalization in different statistical frameworks, such as, non-extensive Tsallis statistics and higher order Markovian statistics. We revisit these generalizations and propose a new generalization of JSD in the integrated Tsallis and Markovian statistical framework. We show that this generalization can be interpreted in terms of mutual information. We also investigate the performance of different JSD generalizations in deconstructing chimeric DNA sequences assembled from bacterial genomes including that of E. coli, S. enterica typhi, Y. pestis and H. influenzae. Our results show that the JSD generalizations bring in more pronounced improvements when the sequences being compared are from phylogenetically proximal organisms, which are often difficult to distinguish because of their compositional similarity. While small but noticeable improvements were observed with the Tsallis statistical JSD generalization, relatively large improvements were observed with the Markovian generalization. In contrast, the proposed Tsallis-Markovian generalization yielded more pronounced improvements relative to the Tsallis and Markovian generalizations, specifically when the sequences being compared arose from phylogenetically proximal organisms.
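For reference, the base (Shannon) Jensen-Shannon divergence that the paper generalizes can be written in a few lines; equal weights are assumed here and the Tsallis/Markovian extensions are not shown:

```python
# JSD(P, Q) = H(w1*P + w2*Q) - w1*H(P) - w2*H(Q), with Shannon entropy H.
import numpy as np

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def jsd(p, q, w1=0.5, w2=0.5):
    m = w1 * p + w2 * q
    return entropy(m) - w1 * entropy(p) - w2 * entropy(q)

p = np.array([0.30, 0.20, 0.20, 0.30])  # e.g. nucleotide frequencies, seq 1
q = np.array([0.10, 0.40, 0.40, 0.10])  # seq 2
print("JSD(p, q) =", round(jsd(p, q), 4), "bits")
```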
NASA Astrophysics Data System (ADS)
Guadagnini, A.; Riva, M.; Dell'Oca, A.
2017-12-01
We propose to ground sensitivity of uncertain parameters of environmental models on a set of indices based on the main (statistical) moments, i.e., mean, variance, skewness and kurtosis, of the probability density function (pdf) of a target model output. This enables us to perform Global Sensitivity Analysis (GSA) of a model in terms of multiple statistical moments and yields a quantification of the impact of model parameters on features driving the shape of the pdf of model output. Our GSA approach includes the possibility of being coupled with the construction of a reduced complexity model that allows approximating the full model response at a reduced computational cost. We demonstrate our approach through a variety of test cases. These include a commonly used analytical benchmark, a simplified model representing pumping in a coastal aquifer, a laboratory-scale tracer experiment, and the migration of fracturing fluid through a naturally fractured reservoir (source) to reach an overlying formation (target). Our strategy allows discriminating the relative importance of model parameters to the four statistical moments considered. We also provide an appraisal of the error associated with the evaluation of our sensitivity metrics by replacing the original system model through the selected surrogate model. Our results suggest that one might need to construct a surrogate model with increasing level of accuracy depending on the statistical moment considered in the GSA. The methodological framework we propose can assist the development of analysis techniques targeted to model calibration, design of experiment, uncertainty quantification and risk assessment.
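A rough sketch of moment-based sensitivity in this spirit: estimate conditional moments of the output by binning Monte Carlo samples on each input and compare them with the unconditional moments. The Ishigami benchmark stands in for the environmental models, and the index definitions below are simplified stand-ins for the paper's metrics:

```python
# Moment-based global sensitivity via binned conditional moments.
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
x = rng.uniform(-np.pi, np.pi, size=(n, 3))
y = (np.sin(x[:, 0]) + 7.0 * np.sin(x[:, 1]) ** 2
     + 0.1 * x[:, 2] ** 4 * np.sin(x[:, 0]))  # Ishigami test function

for i in range(3):
    bins = np.quantile(x[:, i], np.linspace(0, 1, 21))
    idx = np.clip(np.digitize(x[:, i], bins) - 1, 0, 19)
    cond_mean = np.array([y[idx == b].mean() for b in range(20)])
    cond_var = np.array([y[idx == b].var() for b in range(20)])
    # normalized mean absolute deviation of the conditional moments
    s_mean = np.mean(np.abs(y.mean() - cond_mean)) / abs(y.mean())
    s_var = np.mean(np.abs(y.var() - cond_var)) / y.var()
    print(f"x{i+1}: mean-based index = {s_mean:.3f}, "
          f"variance-based index = {s_var:.3f}")
```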
PCA as a practical indicator of OPLS-DA model reliability.
Worley, Bradley; Powers, Robert
Principal Component Analysis (PCA) and Orthogonal Projections to Latent Structures Discriminant Analysis (OPLS-DA) are powerful statistical modeling tools that provide insights into separations between experimental groups based on high-dimensional spectral measurements from NMR, MS or other analytical instrumentation. However, when used without validation, these tools may lead investigators to statistically unreliable conclusions. This danger is especially real for Partial Least Squares (PLS) and OPLS, which aggressively force separations between experimental groups. As a result, OPLS-DA is often used as an alternative method when PCA fails to expose group separation, but this practice is highly dangerous. Without rigorous validation, OPLS-DA can easily yield statistically unreliable group separation. A Monte Carlo analysis of PCA group separations and OPLS-DA cross-validation metrics was performed on NMR datasets with statistically significant separations in scores-space. A linearly increasing amount of Gaussian noise was added to each data matrix followed by the construction and validation of PCA and OPLS-DA models. With increasing added noise, the PCA scores-space distance between groups rapidly decreased and the OPLS-DA cross-validation statistics simultaneously deteriorated. A decrease in correlation between the estimated loadings (added noise) and the true (original) loadings was also observed. While the validity of the OPLS-DA model diminished with increasing added noise, the group separation in scores-space remained basically unaffected. Supported by the results of Monte Carlo analyses of PCA group separations and OPLS-DA cross-validation metrics, we provide practical guidelines and cross-validatory recommendations for reliable inference from PCA and OPLS-DA models.
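A hedged re-creation of that Monte Carlo experiment (scikit-learn's PLS regression stands in for OPLS-DA, which the library does not provide): add increasing Gaussian noise to a two-group matrix and track the PCA centroid separation alongside a cross-validated Q²:

```python
# Degrade a two-group dataset with noise; watch PCA separation and PLS Q2.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import KFold, cross_val_score

rng = np.random.default_rng(0)
n, p = 40, 60
signal = np.zeros((n, p))
signal[:20, :10] = 1.5                 # group 1 differs in 10 variables
y = np.repeat([0.0, 1.0], 20)
cv = KFold(n_splits=5, shuffle=True, random_state=1)

for noise_sd in (0.5, 2.0, 6.0):
    X = signal + rng.normal(0.0, noise_sd, size=(n, p))
    scores = PCA(n_components=2).fit_transform(X)
    dist = np.linalg.norm(scores[y == 0].mean(axis=0)
                          - scores[y == 1].mean(axis=0))
    q2 = cross_val_score(PLSRegression(n_components=2), X, y, cv=cv).mean()
    print(f"noise sd {noise_sd}: centroid distance = {dist:.2f}, Q2 = {q2:.2f}")
```

Consistent with the paper's warning, the cross-validated Q² collapses as noise grows even while a PLS model would still draw an apparent separation in scores-space.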
Zhao, W; Busto, R; Truettner, J; Ginsberg, M D
2001-07-30
The analysis of pixel-based relationships between local cerebral blood flow (LCBF) and mRNA expression can reveal important insights into brain function. Traditionally, LCBF and in situ hybridization studies for genes of interest have been analyzed in separate series. To overcome this limitation and to increase the power of statistical analysis, this study focused on developing a double-label method to measure local cerebral blood flow (LCBF) and gene expressions simultaneously by means of a dual-autoradiography procedure. A 14C-iodoantipyrine autoradiographic LCBF study was first performed. Serial brain sections (12 in this study) were obtained at multiple coronal levels and were processed in the conventional manner to yield quantitative LCBF images. Two replicate sections at each bregma level were then used for in situ hybridization. To eliminate the 14C-iodoantipyrine from these sections, a chloroform-washout procedure was first performed. The sections were then processed for in situ hybridization autoradiography for the probes of interest. This method was tested in Wistar rats subjected to 12 min of global forebrain ischemia by two-vessel occlusion plus hypotension, followed by 2 or 6 h of reperfusion (n=4-6 per group). LCBF and in situ hybridization images for heat shock protein 70 (HSP70) were generated for each rat, aligned by disparity analysis, and analyzed on a pixel-by-pixel basis. This method yielded detailed inter-modality correlation between LCBF and HSP70 mRNA expressions. The advantages of this method include reducing the number of experimental animals by one-half; and providing accurate pixel-based correlations between different modalities in the same animals, thus enabling paired statistical analyses. This method can be extended to permit correlation of LCBF with the expression of multiple genes of interest.
Theory of Financial Risk and Derivative Pricing
NASA Astrophysics Data System (ADS)
Bouchaud, Jean-Philippe; Potters, Marc
2009-01-01
Foreword; Preface; 1. Probability theory: basic notions; 2. Maximum and addition of random variables; 3. Continuous time limit, Ito calculus and path integrals; 4. Analysis of empirical data; 5. Financial products and financial markets; 6. Statistics of real prices: basic results; 7. Non-linear correlations and volatility fluctuations; 8. Skewness and price-volatility correlations; 9. Cross-correlations; 10. Risk measures; 11. Extreme correlations and variety; 12. Optimal portfolios; 13. Futures and options: fundamental concepts; 14. Options: hedging and residual risk; 15. Options: the role of drift and correlations; 16. Options: the Black and Scholes model; 17. Options: some more specific problems; 18. Options: minimum variance Monte-Carlo; 19. The yield curve; 20. Simple mechanisms for anomalous price statistics; Index of most important symbols; Index.
Theory of Financial Risk and Derivative Pricing - 2nd Edition
NASA Astrophysics Data System (ADS)
Bouchaud, Jean-Philippe; Potters, Marc
2003-12-01
Foreword; Preface; 1. Probability theory: basic notions; 2. Maximum and addition of random variables; 3. Continuous time limit, Ito calculus and path integrals; 4. Analysis of empirical data; 5. Financial products and financial markets; 6. Statistics of real prices: basic results; 7. Non-linear correlations and volatility fluctuations; 8. Skewness and price-volatility correlations; 9. Cross-correlations; 10. Risk measures; 11. Extreme correlations and variety; 12. Optimal portfolios; 13. Futures and options: fundamental concepts; 14. Options: hedging and residual risk; 15. Options: the role of drift and correlations; 16. Options: the Black and Scholes model; 17. Options: some more specific problems; 18. Options: minimum variance Monte-Carlo; 19. The yield curve; 20. Simple mechanisms for anomalous price statistics; Index of most important symbols; Index.
Hypothesis testing for band size detection of high-dimensional banded precision matrices.
An, Baiguo; Guo, Jianhua; Liu, Yufeng
2014-06-01
Many statistical analysis procedures require a good estimator for a high-dimensional covariance matrix or its inverse, the precision matrix. When the precision matrix is banded, the Cholesky-based method often yields a good estimator of the precision matrix. One important aspect of this method is determination of the band size of the precision matrix. In practice, crossvalidation is commonly used; however, we show that crossvalidation not only is computationally intensive but can be very unstable. In this paper, we propose a new hypothesis testing procedure to determine the band size in high dimensions. Our proposed test statistic is shown to be asymptotically normal under the null hypothesis, and its theoretical power is studied. Numerical examples demonstrate the effectiveness of our testing procedure.
NASA Astrophysics Data System (ADS)
Jayakumar, M.; Rajavel, M.; Surendran, U.
2016-12-01
A study on the variability of coffee yield of both Coffea arabica and Coffea canephora as influenced by climate parameters (rainfall (RF), maximum temperature (Tmax), minimum temperature (Tmin), and mean relative humidity (RH)) was undertaken at the Regional Coffee Research Station, Chundale, Wayanad, Kerala State, India. The coffee yield data of 30 years (1980 to 2009) revealed that coffee yield fluctuates with variations in the climatic parameters. Between the species, productivity was higher for C. canephora than for C. arabica in most years. The maximum yield of C. canephora (2040 kg ha⁻¹) was recorded in 2003-2004, with a declining trend noticed in recent years. Similarly, the maximum yield of C. arabica (1745 kg ha⁻¹) was recorded in 1988-1989, and decreased yields were noticed in the subsequent years until 1997-1998 due to year-to-year variability in climate. The highest correlation coefficient was found between the yield of C. arabica and maximum temperature during January (0.7), and between C. arabica yield and RH during July (0.4). The yield of C. canephora had the highest correlations with maximum temperature, RH and rainfall during February. A statistical regression model between selected climatic parameters and the yields of C. arabica and C. canephora was developed to forecast coffee yield in Wayanad district, Kerala. The model was validated for the years 2010, 2011, and 2012 against the coffee yields obtained in those years, and the predictions were found to be good.
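A schematic form of such a yield-forecast regression (ordinary least squares on placeholder climate values, not the station's records):

```python
# OLS regression of annual yield on selected monthly climate variables.
import numpy as np

# columns: January Tmax (deg C), February rainfall (mm), July RH (%)
X = np.array([
    [29.1, 15.0, 88.0],
    [30.2,  2.0, 84.0],
    [28.7, 25.0, 90.0],
    [31.0,  0.0, 82.0],
    [29.8, 10.0, 86.0],
    [30.5,  5.0, 85.0],
])
yield_kg_ha = np.array([1610.0, 1890.0, 1480.0, 2010.0, 1720.0, 1840.0])

A = np.column_stack([np.ones(len(X)), X])   # add an intercept column
coef, *_ = np.linalg.lstsq(A, yield_kg_ha, rcond=None)
pred = A @ coef
print("coefficients (intercept, Tmax, RF, RH):", np.round(coef, 2))
print("in-sample predictions:", np.round(pred, 0))
```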
Prakash Maran, J; Manikandan, S; Thirugnanasambandham, K; Vigna Nivetha, C; Dinesh, R
2013-01-30
In this study, the effects of ultrasound-assisted extraction (UAE) conditions on the yield of polysaccharide from corn silk were studied using a three-factor, three-level Box-Behnken response surface design. Process parameters which affect the efficiency of UAE, such as extraction temperature (40-60 °C), time (10-30 min) and solid-liquid ratio (1:10-1:30 g/ml), were investigated. The results showed that the extraction conditions have significant effects on the extraction yield of polysaccharide. The obtained experimental data were fitted to a second-order polynomial equation using multiple regression analysis, with a high coefficient of determination (R²) of 0.994. An optimization study using Derringer's desired function methodology was performed, and the optimal conditions based on both individual and combined independent variables (extraction temperature of 56 °C, time of 17 min and solid-liquid ratio of 1:20 g/ml) were determined, with a maximum polysaccharide yield of 6.06%, which was confirmed through validation experiments. Copyright © 2012 Elsevier Ltd. All rights reserved.
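The second-order polynomial surface used in such Box-Behnken analyses can be fitted by ordinary least squares; a minimal sketch on invented coded design points:

```python
# Fit y = b0 + sum(bi*xi) + sum(bii*xi^2) + sum(bij*xi*xj) by least squares.
import itertools
import numpy as np

def quad_features(X):
    cols = [np.ones(len(X))]
    cols += [X[:, i] for i in range(X.shape[1])]       # linear terms
    cols += [X[:, i] ** 2 for i in range(X.shape[1])]  # squared terms
    cols += [X[:, i] * X[:, j]                         # interaction terms
             for i, j in itertools.combinations(range(X.shape[1]), 2)]
    return np.column_stack(cols)

rng = np.random.default_rng(2)
X = rng.uniform(-1, 1, size=(15, 3))  # coded temperature, time, ratio
y = 6.0 - 0.8 * X[:, 0]**2 + 0.5 * X[:, 1] + rng.normal(0, 0.05, 15)  # fake

F = quad_features(X)
coef, *_ = np.linalg.lstsq(F, y, rcond=None)
pred = F @ coef
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
print("R^2 of the fitted surface:", round(r2, 3))
```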
Tan, Kok Tat; Lee, Keat Teong; Mohamed, Abdul Rahman
2010-02-01
In this study, fatty acid methyl esters (FAME) have been successfully produced from transesterification reaction between triglycerides and methyl acetate, instead of alcohol. In this non-catalytic supercritical methyl acetate (SCMA) technology, triacetin which is a valuable biodiesel additive is produced as side product rather than glycerol, which has lower commercial value. Besides, the properties of the biodiesel (FAME and triacetin) were found to be superior compared to those produced from conventional catalytic reactions (FAME only). In this study, the effects of various important parameters on the yield of biodiesel were optimized by utilizing Response Surface Methodology (RSM) analysis. The mathematical model developed was found to be adequate and statistically accurate to predict the optimum yield of biodiesel. The optimum conditions were found to be 399 degrees C for reaction temperature, 30 mol/mol of methyl acetate to oil molar ratio and reaction time of 59 min to achieve 97.6% biodiesel yield.
Parametric study for the optimization of ionic liquid pretreatment of corn stover
DOE Office of Scientific and Technical Information (OSTI.GOV)
Papa, Gabriella; Feldman, Taya; Sale, Kenneth L.
A parametric study of the efficacy of ionic liquid (IL) pretreatment (PT) of corn stover (CS) using 1-ethyl-3-methylimidazolium acetate ([C2C1Im][OAc]) and cholinium lysinate ([Ch][Lys]) was conducted. The impact of 50% and 15% biomass loading for milled and non-milled CS on IL-PT was evaluated, as well as the impact of 20 and 5 mg enzyme/g glucan on saccharification efficiency. The glucose and xylose released were generated from 32 conditions - 2 ionic liquids (ILs), 2 temperatures, 2 particle sizes (S), 2 solid loadings, and 2 enzyme loadings. Statistical analysis indicates that sugar yields were correlated with lignin and xylan removal and depend on these factors, whereas S did not explain variation in sugar yields. Both ILs were effective in pretreating large-particle-size CS without compromising sugar yields. The knowledge from material and energy balances is an essential step in directing optimization of sugar recovery at desirable process conditions.
Apparent Yield Strength of Hot-Pressed SiCs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Daloz, William L; Wereszczak, Andrew A; Jadaan, Osama M.
2008-01-01
Apparent yield strengths (YApp) of four hot-pressed silicon carbides (SiC-B, SiC-N, SiC-HPN, and SiC-SC-1RN) were estimated using diamond spherical or Hertzian indentation. The von Mises and Tresca criteria were considered. The developed test method was robust, simple and quick to execute, and thus enabled the acquisition of confident sampling statistics. The choice of indenter size, test method, and method of analysis are described. The compressive force necessary to initiate apparent yielding was identified postmortem using differential interference contrast (or Nomarski) imaging with an optical microscope. It was found that the YApp of SiC-HPN (14.0 GPa) was approximately 10% higher than the equivalently valued YApp of SiC-B, SiC-N, and SiC-SC-1RN. This discrimination in YApp shows that the use of this test method could be insightful, because there were no differences among the average Knoop hardnesses of the four SiC grades.
Travaini, Rodolfo; Barrado, Enrique; Bolado-Rodríguez, Silvia
2016-08-01
An L9(3^4) orthogonal array (OA) experimental design was applied to study the four parameters considered most important in ozonolysis pretreatment (moisture content, ozone concentration, ozone/oxygen flow and particle size) for ethanol production from sugarcane bagasse (SCB). Statistical analysis highlighted ozone concentration as the most influential parameter on reaction time and sugar release after enzymatic hydrolysis. The increase in reaction time when decreasing the ozone/oxygen flow resulted in small differences in ozone consumption. Design optimization for sugar release provided a parameter combination close to the best experimental run, where glucose and xylose yields of 77.55% and 56.95%, respectively, were obtained. When optimizing the grams of sugar released per gram of ozone, the most influential parameter was moisture content, with a maximum yield of 2.98 g sugars/g O3. In hydrolysate fermentation experiments, Saccharomyces cerevisiae provided ethanol yields around 80%, while Pichia stipitis was completely inhibited. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
King, A. W.; Absar, S. M.; Nair, S.; Preston, B. L.
2012-12-01
The vulnerability of agriculture is among the leading concerns surrounding climate change. Agricultural production is influenced by drought and other extremes in weather and climate. In regions of subsistence farming, worst-case reductions in yield lead to malnutrition and famine. Reduced surplus contributes to poverty in agrarian economies. In more economically diverse and industrialized regions, variations in agricultural yield can influence the regional economy through market mechanisms. The latter grows in importance as agriculture increasingly services the energy market in addition to markets for food and fiber. Agriculture is historically a highly adaptive enterprise and will respond to future changes in climate with a variety of adaptive mechanisms. Nonetheless, the risk, if not expectation, of increases in climate extremes and hazards exceeding historical experience motivates scientifically based anticipatory assessment of the vulnerability of agriculture to climate change. We investigate the sensitivity component of that vulnerability using EPIC, a well established field-scale model of cropping systems that includes the simulation of economic yield. The core of our analysis is the relationship between simulated yield and various indices of climate change, including the CCl/CLIVAR/JCOMM ETCCDI indices, calculated from weather inputs to the model. We complement this core with analysis using the DSSAT cropping system model and exploration of relationships between historical yield statistics and climate indices calculated from weather records. Our analyses are for sites in the Southeast/Gulf Coast region of the United States. We do find "tight" monotonic relationships between annual yield and climate for some indices, especially those associated with available water. More commonly, however, we find an increase in the variability of yield as the index value becomes more extreme. Our findings contribute to understanding the sensitivity of crop yield as part of vulnerability analysis. They also contribute to considerations of adaptation, focusing attention on adapting to increased variability in yield rather than just reductions in yield. For example, in the face of increased variability or reduced reliability, hedging and risk-spreading strategies may be more important than technological innovations such as drought-resistant crops or other optimization strategies. Our findings also have implications for the choice and application of climate extreme indices, demands on models used to project climate change, and the development of next-generation integrated assessment models (IAMs) that incorporate the agricultural sector, and especially adaptation within that sector, in energy and broader more general markets.
Global Agriculture Yields and Conflict under Future Climate
NASA Astrophysics Data System (ADS)
Rising, J.; Cane, M. A.
2013-12-01
Aspects of climate have been shown to correlate significantly with conflict. We investigate a possible pathway for these effects through changes in agriculture yields, as predicted by field crop models (FAO's AquaCrop and DSSAT). Using satellite and station weather data, and surveyed data for soil and management, we simulate major crop yields across all countries between 1961 and 2008, and compare these to FAO and USDA reported yields. Correlations vary by country and by crop, from approximately 0.8 to -0.5. Some of this range in crop model performance is explained by crop varieties, data quality, and other natural, economic, and political features. We also quantify the ability of AquaCrop and DSSAT to simulate yields under past cycles of ENSO as a proxy for their performance under changes in climate. We then describe two statistical models which relate crop yields to conflict events from the UCDP/PRIO Armed Conflict dataset. The first relates several preceding years of predicted yields of the major grain in each country to any conflict involving that country. The second uses the GREG ethnic group maps to identify differences in predicted yields between neighboring regions. By using variation in predicted yields to explain conflict, rather than actual yields, we can identify the exogenous effects of weather on conflict. Finally, we apply precipitation and temperature time-series under IPCC's A1B scenario to the statistical models. This allows us to estimate the scale of the impact of future yields on future conflict. [Figure captions: Centroids of the major growing regions for each country's primary crop, based on USDA FAS consumption. Correlations between simulated yields and reported yields, for AquaCrop and DSSAT, under the assumption that no irrigation, fertilization, or pest control is used. Reported yields are the average of FAO yields and USDA FAS yields, where both are available.]
Packham, B; Barnes, G; Dos Santos, G Sato; Aristovich, K; Gilad, O; Ghosh, A; Oh, T; Holder, D
2016-06-01
Electrical impedance tomography (EIT) allows for the reconstruction of internal conductivity from surface measurements. A change in conductivity occurs as ion channels open during neural activity, making EIT a potential tool for functional brain imaging. EIT images can have >10 000 voxels, which means statistical analysis of such images presents a substantial multiple testing problem. One way to optimally correct for these issues and still maintain the flexibility of complicated experimental designs is to use random field theory. This parametric method estimates the distribution of peaks one would expect by chance in a smooth random field of a given size. Random field theory has been used in several other neuroimaging techniques but never validated for EIT images of fast neural activity; such validation can be achieved using non-parametric techniques. Both parametric and non-parametric techniques were used to analyze a set of 22 images collected from 8 rats. Significant group activations were detected using both techniques (corrected p < 0.05). Both parametric and non-parametric analyses yielded similar results, although the latter was less conservative. These results demonstrate the first statistical analysis of such an image set and indicate that such an analysis is an approach for EIT images of neural activity.
Application of the Statistical ICA Technique in the DANCE Data Analysis
NASA Astrophysics Data System (ADS)
Baramsai, Bayarbadrakh; Jandel, M.; Bredeweg, T. A.; Rusev, G.; Walker, C. L.; Couture, A.; Mosby, S.; Ullmann, J. L.; Dance Collaboration
2015-10-01
The Detector for Advanced Neutron Capture Experiments (DANCE) at the Los Alamos Neutron Science Center is used to improve our understanding of the neutron capture reaction. DANCE is a highly efficient 4π γ-ray detector array consisting of 160 BaF2 crystals, which makes it an ideal tool for neutron capture experiments. The (n, γ) reaction Q-value equals the summed energy of all γ-rays emitted in the de-excitation cascades from the excited capture state to the ground state. The total γ-ray energy is used to identify reactions on different isotopes as well as the background. However, it is challenging to separate contributions to the Esum spectra from different isotopes with similar Q-values. Recently we have tested the applicability of modern statistical methods such as Independent Component Analysis (ICA) to identify and separate the (n, γ) reaction yields on the different isotopes present in the target material. ICA is a recently developed computational tool for separating multidimensional data into statistically independent additive subcomponents. In this conference talk, we present results from the application of ICA algorithms, and a modification thereof, to the analysis of DANCE experimental data. This research is supported by the U.S. Department of Energy, Office of Science, Nuclear Physics under the Early Career Award No. LANL20135009.
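A stripped-down analogue of this separation problem can be run with an off-the-shelf ICA implementation; the two Gaussian "isotope" components with nearby Q-values and the flat background below are invented stand-ins for real DANCE Esum spectra:

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(3)
energy = np.linspace(0, 10, 500)  # Esum axis, arbitrary units

def peak(center, width):
    return np.exp(-0.5 * ((energy - center) / width) ** 2)

# Two hypothetical isotopes with nearby Q-values plus a flat background.
sources = np.vstack([peak(6.0, 0.4), peak(6.5, 0.4), 0.1 * np.ones_like(energy)])
mixing = rng.uniform(0.2, 1.0, size=(10, 3))    # 10 observed spectra
observed = mixing @ sources + rng.normal(0, 0.01, size=(10, energy.size))

# Treat each energy bin as a sample, each measured spectrum as a feature.
ica = FastICA(n_components=3, random_state=0)
recovered = ica.fit_transform(observed.T).T     # statistically independent components
print(recovered.shape)  # (3, 500): candidate isotope/background components
```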
Defect design of insulation systems for photovoltaic modules
NASA Technical Reports Server (NTRS)
Mon, G. R.
1981-01-01
A defect-design approach to sizing electrical insulation systems for terrestrial photovoltaic modules is presented. Voltage-breakdown statistics are gathered for various thicknesses of candidate insulation films; for a designated voltage, module failure probabilities are then calculated for enumerated combinations of film thickness and number of layers. Cost analysis then selects the most economical insulation system. A manufacturing yield problem is solved to exemplify the technique. Results for unaged Mylar suggest using fewer layers of thicker films. Defect design incorporates the effects of flaws into optimal insulation-system selection and obviates choosing a tolerable failure rate, since the optimization process accomplishes that. Exposure to weathering and voltage stress reduces the voltage-withstanding capability of module insulation films. Defect design, applied to aged polyester films, promises to yield reliable, cost-optimal insulation systems.
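In the spirit of defect design, and only as a toy with invented per-layer breakdown probabilities, an assumed independent-defect rule (the module fails only if every layer fails), and notional costs, the enumeration-plus-cost-selection step could look like:

```python
# Hypothetical per-layer breakdown probabilities at the design voltage,
# as would be estimated from voltage-breakdown statistics per thickness.
p_fail = {25: 0.08, 50: 0.03, 75: 0.015}        # thickness (um) -> probability
cost_per_um = 0.002                              # notional cost units

candidates = []
for thickness, p in p_fail.items():
    for n_layers in (1, 2, 3):
        p_module_fail = p ** n_layers            # independent-defect assumption
        cost = cost_per_um * thickness * n_layers
        candidates.append((p_module_fail, cost, thickness, n_layers))

# Cheapest design meeting an illustrative reliability target.
feasible = [c for c in candidates if c[0] < 1e-3]
print(min(feasible, key=lambda c: c[1]))
```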
Assembly and analysis of fragmentation data for liquid propellant vessels
NASA Technical Reports Server (NTRS)
Baker, W. E.; Parr, V. B.; Bessey, R. L.; Cox, P. A.
1974-01-01
Fragmentation data were assembled and analyzed for exploding liquid propellant vessels. The data were retrieved from reports of tests and accidents, including measurements or estimates of blast yield. A significant amount of data was retrieved from a series of tests conducted for measurement of blast and fireball effects of liquid propellant explosions (Project PYRO), a few well-documented accident reports, and a series of tests to determine auto-ignition properties of mixing liquid propellants. The data were reduced and fitted to various statistical functions. Comparisons were made with methods of prediction for blast yield, initial fragment velocities, and fragment range. Reasonably good correlation was achieved. Methods presented in the report allow prediction of fragment patterns, given the type and quantity of propellant, the type of accident, and the time of propellant mixing.
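The reduce-and-fit step, fitting retrieved fragment measurements to a statistical function and checking goodness of fit, can be sketched as follows; the lognormal choice and the synthetic "fragment range" data are purely illustrative:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
# Stand-in fragment ranges (m); real values would come from test/accident reports.
ranges = rng.lognormal(mean=5.0, sigma=0.8, size=120)

shape, loc, scale = stats.lognorm.fit(ranges, floc=0)
ks = stats.kstest(ranges, "lognorm", args=(shape, loc, scale))
print(f"sigma={shape:.2f}, median={scale:.0f} m, KS p-value={ks.pvalue:.2f}")
```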
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-08
... statistical surveys that yield quantitative results that can be generalized to the population of study. This... information will not be used for quantitative information collections that are designed to yield reliably... generic mechanisms that are designed to yield quantitative results. The FHWA received no comments in...
An Integrated Analysis of the Physiological Effects of Space Flight: Executive Summary
NASA Technical Reports Server (NTRS)
Leonard, J. I.
1985-01-01
A large array of models were applied in a unified manner to solve problems in space flight physiology. Mathematical simulation was used as an alternative way of looking at physiological systems and maximizing the yield from previous space flight experiments. A medical data analysis system was created which consisted of an automated data base, a computerized biostatistical and data analysis system, and a set of simulation models of physiological systems. Five basic models were employed: (1) a pulsatile cardiovascular model; (2) a respiratory model; (3) a thermoregulatory model; (4) a circulatory, fluid, and electrolyte balance model; and (5) an erythropoiesis regulatory model. Algorithms were provided to perform routine statistical tests, multivariate analysis, nonlinear regression analysis, and autocorrelation analysis. Special-purpose programs were prepared for rank correlation, factor analysis, and the integration of the metabolic balance data.
Baskar, Gurunathan; Sathya, Shree Rajesh K Lakshmi Jai; Jinnah, Riswana Begum; Sahadevan, Renganathan
2011-01-01
Response surface methodology was employed to optimize the concentrations of four important cultivation medium components (cottonseed oil cake, glucose, NH4Cl, and MgSO4) for maximum medicinal polysaccharide yield by the Lingzhi or Reishi medicinal mushroom, Ganoderma lucidum MTCC 1039, in submerged culture. A second-order polynomial model describing the relationship between the medium components and polysaccharide yield was fitted in coded units of the variables. The high value of the coefficient of determination (R2 = 0.953) indicated an excellent correlation between the medium components and polysaccharide yield, and the model fitted well with high statistical reliability and significance. The predicted optimum concentrations of the medium components were 3.0% cottonseed oil cake, 3.0% glucose, 0.15% NH4Cl, and 0.045% MgSO4, with a maximum predicted polysaccharide yield of 819.76 mg/L. The experimental polysaccharide yield at the predicted optimum medium composition was 854.29 mg/L, which was 4.22% higher than the predicted yield.
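The second-order polynomial fit in coded units can be reproduced in miniature; for brevity this sketch uses two coded factors with invented coefficients rather than the study's four medium components:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from itertools import product

rng = np.random.default_rng(5)
levels = [-1, 0, 1]                       # coded factor levels
rows = []
for x1, x2 in product(levels, levels):
    # Toy response with curvature, standing in for polysaccharide yield (mg/L).
    y = 800 + 30 * x1 + 20 * x2 - 25 * x1**2 - 15 * x2**2 + rng.normal(0, 10)
    rows.append({"x1": x1, "x2": x2, "y": y})
df = pd.DataFrame(rows)

model = smf.ols("y ~ x1 + x2 + I(x1**2) + I(x2**2) + x1:x2", data=df).fit()
print(model.rsquared)                     # analogue of the reported R^2
print(model.params)                       # second-order polynomial coefficients
```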
On the analysis of Canadian Holstein dairy cow lactation curves using standard growth functions.
López, S; France, J; Odongo, N E; McBride, R A; Kebreab, E; AlZahal, O; McBride, B W; Dijkstra, J
2015-04-01
Six classical growth functions (monomolecular, Schumacher, Gompertz, logistic, Richards, and Morgan) were fitted to individual and average (by parity) cumulative milk production curves of Canadian Holstein dairy cows. The data analyzed consisted of approximately 91,000 daily milk yield records corresponding to 122 first, 99 second, and 92 third parity individual lactation curves. The functions were fitted using nonlinear regression procedures, and their performance was assessed using goodness-of-fit statistics (coefficient of determination, residual mean squares, Akaike information criterion, and the correlation and concordance coefficients between observed and adjusted milk yields at several days in milk). Overall, all the growth functions evaluated showed an acceptable fit to the cumulative milk production curves, with the Richards equation ranking first (smallest Akaike information criterion) followed by the Morgan equation. Differences among the functions in their goodness-of-fit were enlarged when fitted to average curves by parity, where the sigmoidal functions with a variable point of inflection (Richards and Morgan) outperformed the other 4 equations. All the functions provided satisfactory predictions of milk yield (calculated from the first derivative of the functions) at different lactation stages, from early to late lactation. The Richards and Morgan equations provided the most accurate estimates of peak yield and total milk production per 305-d lactation, whereas the least accurate estimates were obtained with the logistic equation. In conclusion, classical growth functions (especially sigmoidal functions with a variable point of inflection) proved to be feasible alternatives to fit cumulative milk production curves of dairy cows, resulting in suitable statistical performance and accurate estimates of lactation traits. Copyright © 2015 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
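As an illustration of this fitting procedure (synthetic data, not the Canadian Holstein records), the Gompertz function can be fitted by nonlinear least squares, scored with a least-squares AIC, and differentiated to recover daily yield:

```python
import numpy as np
from scipy.optimize import curve_fit

def gompertz(t, a, b, c):
    """Cumulative milk yield: a * exp(-b * exp(-c * t))."""
    return a * np.exp(-b * np.exp(-c * t))

rng = np.random.default_rng(6)
dim = np.arange(1, 306)                          # days in milk, 305-d lactation
obs = gompertz(dim, 9000, 4.0, 0.02) + rng.normal(0, 80, size=dim.size)

params, _ = curve_fit(gompertz, dim, obs, p0=[8000, 3, 0.01])
resid = obs - gompertz(dim, *params)
n, k = dim.size, len(params)
aic = n * np.log((resid**2).sum() / n) + 2 * k   # least-squares AIC (up to a constant)
peak_daily = np.max(np.gradient(gompertz(dim, *params), dim))  # first derivative
print(params, aic, peak_daily)
```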
MetaboLyzer: A Novel Statistical Workflow for Analyzing Post-Processed LC/MS Metabolomics Data
Mak, Tytus D.; Laiakis, Evagelia C.; Goudarzi, Maryam; Fornace, Albert J.
2014-01-01
Metabolomics, the global study of small molecules in a particular system, has in the last few years risen to become a primary -omics platform for the study of metabolic processes. With the ever-increasing pool of quantitative data yielded from metabolomic research, specialized methods and tools with which to analyze and extract meaningful conclusions from these data are becoming more and more crucial. Furthermore, the depth of knowledge and expertise required to undertake a metabolomics oriented study is a daunting obstacle to investigators new to the field. As such, we have created a new statistical analysis workflow, MetaboLyzer, which aims to both simplify analysis for investigators new to metabolomics, as well as provide experienced investigators the flexibility to conduct sophisticated analysis. MetaboLyzer’s workflow is specifically tailored to the unique characteristics and idiosyncrasies of post-processed liquid chromatography/mass spectrometry (LC/MS) based metabolomic datasets. It utilizes a wide gamut of statistical tests, procedures, and methodologies that belong to classical biostatistics, as well as several novel statistical techniques that we have developed specifically for metabolomics data. Furthermore, MetaboLyzer conducts rapid putative ion identification and putative biologically relevant analysis via incorporation of four major small molecule databases: KEGG, HMDB, Lipid Maps, and BioCyc. MetaboLyzer incorporates these aspects into a comprehensive workflow that outputs easy-to-understand, statistically significant, and potentially biologically relevant information in the form of heatmaps, volcano plots, 3D visualization plots, correlation maps, and metabolic pathway hit histograms. For demonstration purposes, a urine metabolomics data set from a previously reported radiobiology study in which samples were collected from mice exposed to gamma radiation was analyzed. MetaboLyzer was able to identify 243 statistically significant ions out of a total of 1942. Numerous putative metabolites and pathways were found to be biologically significant from the putative ion identification workflow. PMID:24266674
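As a flavor of the univariate screening such a workflow performs (here reduced to a plain two-group t-test with Benjamini-Hochberg FDR control, on invented data sized like the reported 1,942 ions; MetaboLyzer's actual test battery is broader), consider:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n_ions, n_per_group = 1942, 10
control = rng.normal(0, 1, size=(n_ions, n_per_group))
exposed = rng.normal(0, 1, size=(n_ions, n_per_group))
exposed[:243] += 1.2                    # plant some truly responsive ions

t, p = stats.ttest_ind(control, exposed, axis=1)

# Benjamini-Hochberg step-up FDR at q = 0.05: largest k with p_(k) <= (k/m) q.
order = np.argsort(p)
adjusted = p[order] * n_ions / (np.arange(n_ions) + 1)
passed = np.where(adjusted <= 0.05)[0]
n_sig = passed.max() + 1 if passed.size else 0
print(f"{n_sig} ions significant after FDR correction")
```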
DOE Office of Scientific and Technical Information (OSTI.GOV)
Le, K. C.; Tran, T. M.; Langer, J. S.
The statistical-thermodynamic dislocation theory developed in previous papers is used here in an analysis of high-temperature deformation of aluminum and steel. Using physics-based parameters that we expect theoretically to be independent of strain rate and temperature, we are able to fit experimental stress-strain curves for three different strain rates and three different temperatures for each of these two materials. Here, our theoretical curves include yielding transitions at zero strain in agreement with experiment. We find that thermal softening effects are important even at the lowest temperatures and smallest strain rates.
An Analysis of the Impact of Job Search Behaviors on Air Force Company Grade Officer Turnover
2012-03-01
pilot tested on Air Force CGOs. Participants were given the definition of passive job search and active job search used in this research effort, and...identifying these different groups and testing the modified model separately within each could yield more accuracy in predicting turnover. This research ...the model the same way. Use of the pseudo R², and the reported statistics and the table design were done in the same manner as previous research
Transistor-like behavior of single metalloprotein junctions.
Artés, Juan M; Díez-Pérez, Ismael; Gorostiza, Pau
2012-06-13
Single protein junctions consisting of azurin bridged between a gold substrate and the probe of an electrochemical scanning tunneling microscope (ECSTM) have been obtained by two independent methods that allowed statistical analysis over a large number of measured junctions. Conductance measurements yield (7.3 ± 1.5) × 10⁻⁶ G₀, in agreement with reported estimates using other techniques. Redox gating of the protein with an on/off ratio of 20 was demonstrated and constitutes a proof-of-principle of a single redox protein field-effect transistor.
Similar Estimates of Temperature Impacts on Global Wheat Yield by Three Independent Methods
NASA Technical Reports Server (NTRS)
Liu, Bing; Asseng, Senthold; Müller, Christoph; Ewert, Frank; Elliott, Joshua; Lobell, David B.; Martre, Pierre; Ruane, Alex C.; Wallach, Daniel; Jones, James W.;
2016-01-01
The potential impact of global temperature change on global crop yield has recently been assessed with different methods. Here we show that grid-based and point-based simulations and statistical regressions (from historic records), without deliberate adaptation or CO2 fertilization effects, produce similar estimates of temperature impact on wheat yields at global and national scales. With a 1 °C global temperature increase, global wheat yield is projected to decline between 4.1% and 6.4%. Projected relative temperature impacts from different methods were similar for major wheat-producing countries China, India, USA and France, but less so for Russia. Point-based and grid-based simulations, and to some extent the statistical regressions, were consistent in projecting that warmer regions are likely to suffer more yield loss with increasing temperature than cooler regions. By forming a multi-method ensemble, it was possible to quantify 'method uncertainty' in addition to model uncertainty. This significantly improves confidence in estimates of climate impacts on global food security.
Similar estimates of temperature impacts on global wheat yield by three independent methods
NASA Astrophysics Data System (ADS)
Liu, Bing; Asseng, Senthold; Müller, Christoph; Ewert, Frank; Elliott, Joshua; Lobell, David B.; Martre, Pierre; Ruane, Alex C.; Wallach, Daniel; Jones, James W.; Rosenzweig, Cynthia; Aggarwal, Pramod K.; Alderman, Phillip D.; Anothai, Jakarat; Basso, Bruno; Biernath, Christian; Cammarano, Davide; Challinor, Andy; Deryng, Delphine; Sanctis, Giacomo De; Doltra, Jordi; Fereres, Elias; Folberth, Christian; Garcia-Vila, Margarita; Gayler, Sebastian; Hoogenboom, Gerrit; Hunt, Leslie A.; Izaurralde, Roberto C.; Jabloun, Mohamed; Jones, Curtis D.; Kersebaum, Kurt C.; Kimball, Bruce A.; Koehler, Ann-Kristin; Kumar, Soora Naresh; Nendel, Claas; O'Leary, Garry J.; Olesen, Jørgen E.; Ottman, Michael J.; Palosuo, Taru; Prasad, P. V. Vara; Priesack, Eckart; Pugh, Thomas A. M.; Reynolds, Matthew; Rezaei, Ehsan E.; Rötter, Reimund P.; Schmid, Erwin; Semenov, Mikhail A.; Shcherbak, Iurii; Stehfest, Elke; Stöckle, Claudio O.; Stratonovitch, Pierre; Streck, Thilo; Supit, Iwan; Tao, Fulu; Thorburn, Peter; Waha, Katharina; Wall, Gerard W.; Wang, Enli; White, Jeffrey W.; Wolf, Joost; Zhao, Zhigan; Zhu, Yan
2016-12-01
The potential impact of global temperature change on global crop yield has recently been assessed with different methods. Here we show that grid-based and point-based simulations and statistical regressions (from historic records), without deliberate adaptation or CO2 fertilization effects, produce similar estimates of temperature impact on wheat yields at global and national scales. With a 1 °C global temperature increase, global wheat yield is projected to decline between 4.1% and 6.4%. Projected relative temperature impacts from different methods were similar for major wheat-producing countries China, India, USA and France, but less so for Russia. Point-based and grid-based simulations, and to some extent the statistical regressions, were consistent in projecting that warmer regions are likely to suffer more yield loss with increasing temperature than cooler regions. By forming a multi-method ensemble, it was possible to quantify 'method uncertainty' in addition to model uncertainty. This significantly improves confidence in estimates of climate impacts on global food security.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chistov, R.; Aushev, T.; Balagura, V.
We report the first observation of the decay $B^+ \to \bar{\Xi}_c^0 \Lambda_c^+$ with a significance of 8.7σ and evidence for the decay $B^0 \to \bar{\Xi}_c^- \Lambda_c^+$ with a significance of 3.8σ. The product $\mathcal{B}(B^+ \to \bar{\Xi}_c^0 \Lambda_c^+) \times \mathcal{B}(\bar{\Xi}_c^0 \to \bar{\Xi}^+ \pi^-)$ is measured to be $(4.8^{+1.0}_{-0.9} \pm 1.1 \pm 1.2) \times 10^{-5}$, and $\mathcal{B}(B^0 \to \bar{\Xi}_c^- \Lambda_c^+) \times \mathcal{B}(\bar{\Xi}_c^- \to \bar{\Xi}^+ \pi^- \pi^-)$ is measured to be $(9.3^{+3.7}_{-2.8} \pm 1.9 \pm 2.4) \times 10^{-5}$. The errors are statistical, systematic, and the error of the $\Lambda_c^+ \to p K^- \pi^+$ branching fraction, respectively. The decay $B^+ \to \bar{\Xi}_c^0 \Lambda_c^+$ is the first example of a two-body exclusive $B^+$ decay into two charmed baryons. The data used for this analysis were accumulated at the $\Upsilon(4S)$ resonance, using the Belle detector at the $e^+e^-$ asymmetric-energy collider KEKB. The integrated luminosity of the data sample is equal to 357 fb$^{-1}$, corresponding to $386 \times 10^{6}$ $B\bar{B}$ pairs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mayer, B. P.; Valdez, C. A.; DeHope, A. J.
Critical to many modern forensic investigations is the chemical attribution of the origin of an illegal drug. This process relies heavily on the identification of compounds indicative of its clandestine or commercial production. The results of these studies can yield detailed information on the method of manufacture, the sophistication of the synthesis operation, the starting material source, and the final product. In the present work, chemical attribution signatures (CAS) associated with the synthesis of the analgesic 3-methylfentanyl, N-(3-methyl-1-phenethylpiperidin-4-yl)-N-phenylpropanamide, were investigated. Six synthesis methods were studied in an effort to identify and classify route-specific signatures. These methods were chosen to minimize the use of scheduled precursors, complicated laboratory equipment, the number of overall steps, and demanding reaction conditions. Using gas and liquid chromatographies combined with mass spectrometric methods (GC-QTOF and LC-QTOF) in conjunction with inductively coupled plasma mass spectrometry (ICP-MS), over 240 distinct compounds and elements were monitored. As seen in our previous work with CAS of fentanyl synthesis, the complexity of the resultant data matrix necessitated the use of multivariate statistical analysis. Using partial least squares discriminant analysis (PLS-DA), 62 statistically significant, route-specific CAS were identified. Statistical classification models using a variety of machine learning techniques were then developed with the ability to predict the method of 3-methylfentanyl synthesis from three blind crude samples generated by synthetic chemists without prior experience with these methods.
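PLS-DA is, operationally, PLS regression onto one-hot class indicators. Below is a minimal sketch on synthetic data, loosely sized like the study (six routes, about 240 monitored signatures) but in no way the actual CAS matrix:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(8)
n_samples, n_features = 60, 240                 # crude samples x signatures
route = rng.integers(0, 6, size=n_samples)      # six synthesis routes
X = rng.normal(0, 1, size=(n_samples, n_features))
X[np.arange(n_samples), route] += 3.0           # toy route-specific markers

# One-hot encode the class labels; PLS-DA regresses X onto the indicators.
Y = np.eye(6)[route]
pls = PLSRegression(n_components=5).fit(X, Y)
pred = pls.predict(X).argmax(axis=1)
print(f"training accuracy: {(pred == route).mean():.2f}")
```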
NASA Astrophysics Data System (ADS)
Alves, Gelio; Wang, Guanghui; Ogurtsov, Aleksey Y.; Drake, Steven K.; Gucek, Marjan; Suffredini, Anthony F.; Sacks, David B.; Yu, Yi-Kuo
2016-02-01
Correct and rapid identification of microorganisms is the key to the success of many important applications in health and safety, including, but not limited to, infection treatment, food safety, and biodefense. With the advance of mass spectrometry (MS) technology, the speed of identification can be greatly improved. However, the increasing number of microbes sequenced is challenging correct microbial identification because of the large number of choices present. To properly disentangle candidate microbes, one needs to go beyond apparent morphology or simple 'fingerprinting'; to correctly prioritize the candidate microbes, one needs to have accurate statistical significance in microbial identification. We meet these challenges by using peptidome profiles of microbes to better separate them and by designing an analysis method that yields accurate statistical significance. Here, we present an analysis pipeline that uses tandem MS (MS/MS) spectra for microbial identification or classification. We have demonstrated, using MS/MS data of 81 samples, each composed of a single known microorganism, that the proposed pipeline can correctly identify microorganisms at least at the genus and species levels. We have also shown that the proposed pipeline computes accurate statistical significances, i.e., E-values for identified peptides and unified E-values for identified microorganisms. The proposed analysis pipeline has been implemented in MiCId, a freely available software for Microorganism Classification and Identification. MiCId is available for download at http://www.ncbi.nlm.nih.gov/CBBresearch/Yu/downloads.html.
Islam, R S; Tisi, D; Levy, M S; Lye, G J
2007-01-01
A major bottleneck in drug discovery is the production of soluble human recombinant protein in sufficient quantities for analysis. This problem is compounded by the complex relationship between protein yield and the large number of variables which affect it. Here, we describe a generic framework for the rapid identification and optimization of factors affecting soluble protein yield in microwell plate fermentations as a prelude to the predictive and reliable scale-up of optimized culture conditions. Recombinant expression of firefly luciferase in Escherichia coli was used as a model system. Two rounds of statistical design of experiments (DoE) were employed to first screen (D-optimal design) and then optimize (central composite face design) the yield of soluble protein. Biological variables from the initial screening experiments included medium type and growth and induction conditions. To provide insight into the impact of the engineering environment on cell growth and expression, plate geometry, shaking speed, and liquid fill volume were included as factors since these strongly influence oxygen transfer into the wells. Compared to standard reference conditions, both the screening and optimization designs gave up to 3-fold increases in the soluble protein yield, i.e., a 9-fold increase overall. In general the highest protein yields were obtained when cells were induced at a relatively low biomass concentration and then allowed to grow slowly up to a high final biomass concentration, >8 g·L⁻¹. Consideration and analysis of the model results showed 6 of the original 10 variables to be important at the screening stage and 3 after optimization. The latter included the microwell plate shaking speeds pre- and postinduction, indicating the importance of oxygen transfer into the microwells and identifying this as a critical parameter for subsequent scale translation studies. The optimization process, also known as response surface methodology (RSM), predicted there to be a distinct optimum set of conditions for protein expression which could be verified experimentally. This work provides a generic approach to protein expression optimization in which both biological and engineering variables are investigated from the initial screening stage. The application of DoE reduces the total number of experiments needed to be performed, while experimentation at the microwell scale increases experimental throughput and reduces cost.
Application of a GCM Ensemble Seasonal Climate Forecasts to Crop Yield Prediction in East Africa
NASA Astrophysics Data System (ADS)
Ogutu, G.; Franssen, W.; Supit, I.; Hutjes, R. W. A.
2016-12-01
We evaluated the potential use of ECMWF System-4 seasonal climate forecasts (S4) for impacts analysis over East Africa. Using the 15-member, 7-month ensemble forecasts initiated every month for 1981-2010, we tested precipitation (tp), air temperature (tas), and surface shortwave radiation (rsds) forecast skill against the WATCH Forcing Data ERA-Interim (WFDEI) re-analysis and other data. We used these forecasts as input to the WOFOST crop model to predict maize yields. Forecast skill is assessed using the anomaly correlation coefficient (ACC), the Ranked Probability Skill Score (RPSS), and the Relative Operating Characteristic Skill Score (ROCSS) for the MAM, JJA, and OND growing seasons. Predicted maize yields (S4-yields) are verified against historical observed FAO and nationally reported (NAT) yield statistics, and against yields from the same crop model forced by WFDEI (WFDEI-yields). Predictability of the climate forecasts varies with season, location, and lead time. The OND tp forecasts show skill over a larger area up to three months lead time compared to MAM and JJA. Upper- and lower-tercile tp forecasts are 20-80% better than climatology. Good tas forecast skill is apparent with three months lead time. The rsds is less skillful than tp and tas in all seasons when verified against WFDEI, but more skillful against the other data. S4 forecasts capture ENSO-related anomalous years with region-dependent skill. The anomalous ENSO influence is also seen in simulated yields. Focusing on the main sowing dates in the northern (July), equatorial (March-April), and southern (December) regions, WFDEI-yields are lower than FAO and NAT, but the anomalies are comparable. Yield anomalies are predictable 3 months before sowing in most of the regions. Differences in interannual variability in the range of ±40% may be related to the sensitivity of WOFOST to drought stress, while the ACCs are largely positive, ranging from 0.3 to 0.6. Above- and below-normal yields are predictable with 2 months lead time. We demonstrated the potential of using seasonal climate forecasts with a crop simulation model to predict anomalous maize yields over East Africa. The findings open a window to better use of climate forecasts in food security early-warning systems and in pre-season policy and farm management decisions.
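Among the skill scores used here, the anomaly correlation coefficient is the simplest to compute; a sketch on synthetic seasonal series standing in for the S4 ensemble mean and the WFDEI reference:

```python
import numpy as np

rng = np.random.default_rng(9)
years = 30
obs = rng.normal(0, 1, size=years)                  # e.g. reference seasonal rainfall
fcst = 0.6 * obs + rng.normal(0, 0.8, size=years)   # toy ensemble-mean forecast

def anomaly_correlation(forecast, observed):
    """Correlation of anomalies about each series' own climatology."""
    fa = forecast - forecast.mean()
    oa = observed - observed.mean()
    return (fa * oa).sum() / np.sqrt((fa**2).sum() * (oa**2).sum())

print(f"ACC = {anomaly_correlation(fcst, obs):.2f}")
```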
Big-Data RHEED analysis for understanding epitaxial film growth processes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vasudevan, Rama K; Tselev, Alexander; Baddorf, Arthur P
Reflection high-energy electron diffraction (RHEED) has by now become a standard tool for in-situ monitoring of film growth by pulsed laser deposition and molecular beam epitaxy. Yet despite the widespread adoption and the wealth of information in RHEED images, most applications are limited to observing intensity oscillations of the specular spot, and much additional information on growth is discarded. With ease of data acquisition and increased computation speeds, statistical methods to rapidly mine the dataset are now feasible. Here, we develop such an approach to the analysis of the fundamental growth processes through multivariate statistical analysis of RHEED image sequences. This approach is illustrated for growth of LaxCa1-xMnO3 films grown on etched (001) SrTiO3 substrates, but is universal. The multivariate methods, including principal component analysis and k-means clustering, provide insight into the relevant behaviors and the timing and nature of a disordered-to-ordered growth change, and highlight statistically significant patterns. Fourier analysis yields the harmonic components of the signal and allows separation of the relevant components and baselines, isolating the asymmetric nature of the step density function and the transmission spots from the imperfect layer-by-layer (LBL) growth. These studies show the promise of big-data approaches to obtaining more insight into film properties during and after epitaxial film growth. Furthermore, these studies open the pathway to using forward prediction methods to potentially allow significantly more control over the growth process and hence final film quality.
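The PCA-plus-k-means step translates directly into code; the synthetic frame stack below, with an abrupt pattern change halfway through, stands in for a real RHEED sequence:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(10)
n_frames, h, w = 500, 48, 64
frames = rng.normal(0, 0.1, size=(n_frames, h, w))
frames[:250, 10:20, 20:30] += 1.0     # toy "disordered" early-growth pattern
frames[250:, 25:35, 30:40] += 1.0     # toy "ordered" late-growth pattern

X = frames.reshape(n_frames, -1)      # one row per RHEED image
scores = PCA(n_components=5).fit_transform(X)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)
# Cluster membership before/after frame 250 reveals the growth-mode change.
print(np.bincount(labels[:250]), np.bincount(labels[250:]))
```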
NASA Technical Reports Server (NTRS)
Ponomarev, Artem; Cucinotta, F.
2011-01-01
Purpose: To create a generalized mechanistic model of DNA damage in human cells that will generate analytical and image data corresponding to experimentally observed DNA damage foci and will help to improve the experimental foci yields by simulating spatial foci patterns and resolving problems with quantitative image analysis. Material and Methods: The analysis of patterns of RIFs (radiation-induced foci) produced by low- and high-LET (linear energy transfer) radiation was conducted by using a Monte Carlo model that combines the heavy-ion track structure with characteristics of the human genome on the level of chromosomes. The foci patterns were also simulated in the maximum projection plane for flat nuclei. Some data analysis was done with the help of image segmentation software that identifies individual classes of RIFs and colocalized RIFs, which is of importance to some experimental assays that assign DNA damage a dual phosphorescent signal. Results: The model predicts the spatial and genomic distributions of DNA DSBs (double-strand breaks) and associated RIFs in a human cell nucleus for a particular dose of either low- or high-LET radiation. We used the model to do analyses for different irradiation scenarios. In the beam-parallel-to-the-disk-of-a-flattened-nucleus scenario we found that the foci appeared to be merged due to their high density, while, in the perpendicular-beam scenario, the foci appeared as one bright spot per hit. The statistics and spatial distribution of regions of densely arranged foci, termed DNA foci chains, were predicted numerically using this model. Another analysis was done to evaluate the number of ion hits per nucleus, which were visible from streaks of closely located foci. In a further analysis, our image segmentation software determined foci yields directly from images with single-class or colocalized foci. Conclusions: We showed that DSB clustering needs to be taken into account to determine the true DNA damage foci yield, which helps to determine the DSB yield. Using the model analysis, a researcher can refine the DSB yield per nucleus per particle. We showed that purely geometric artifacts, present in the experimental images, can be analytically resolved with the model, and that the quantization of track hits and DSB yields can be provided to the experimentalists who use enumeration of radiation-induced foci in immunofluorescence experiments using proteins that detect DNA damage. Automated image segmentation software can prove useful for faster and more precise object counting in colocalized foci images.
Koltun, G.F.; Kula, Stephanie P.
2013-01-01
This report presents the results of a study to develop methods for estimating selected low-flow statistics and for determining annual flow-duration statistics for Ohio streams. Regression techniques were used to develop equations for estimating 10-year recurrence-interval (10-percent annual-nonexceedance probability) low-flow yields, in cubic feet per second per square mile, with averaging periods of 1, 7, 30, and 90 days, and for estimating the yield corresponding to the long-term 80-percent duration flow. These equations, which estimate low-flow yields as a function of a streamflow-variability index, are based on previously published low-flow statistics for 79 long-term continuous-record streamgages with at least 10 years of data collected through water year 1997. When applied to the calibration dataset, average absolute percent errors for the regression equations ranged from 15.8 to 42.0 percent. The regression results have been incorporated into the U.S. Geological Survey (USGS) StreamStats application for Ohio (http://water.usgs.gov/osw/streamstats/ohio.html) in the form of a yield grid to facilitate estimation of the corresponding streamflow statistics in cubic feet per second. Logistic-regression equations also were developed and incorporated into the USGS StreamStats application for Ohio for selected low-flow statistics to help identify occurrences of zero-valued statistics. Quantiles of daily and 7-day mean streamflows were determined for annual and annual-seasonal (September-November) periods for each complete climatic year of streamflow-gaging station record for 110 selected streamflow-gaging stations with 20 or more years of record. The quantiles determined for each climatic year were the 99-, 98-, 95-, 90-, 80-, 75-, 70-, 60-, 50-, 40-, 30-, 25-, 20-, 10-, 5-, 2-, and 1-percent exceedance streamflows. Selected exceedance percentiles of the annual-exceedance percentiles were subsequently computed and tabulated to help facilitate consideration of the annual risk of exceedance or nonexceedance of annual and annual-seasonal-period flow-duration values. The quantiles are based on streamflow data collected through climatic year 2008.
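The annual flow-duration quantiles described here follow directly from percentiles of the daily-flow record, since the p-percent exceedance flow is the (100 - p)th percentile; a sketch with one synthetic climatic year of flows:

```python
import numpy as np

rng = np.random.default_rng(11)
# Stand-in daily mean streamflow for one climatic year (cubic feet per second).
daily_q = rng.lognormal(mean=3.0, sigma=1.0, size=365)

exceed_pcts = [99, 98, 95, 90, 80, 75, 70, 60, 50, 40, 30, 25, 20, 10, 5, 2, 1]
# p-percent exceedance flow = (100 - p)th percentile of the daily flows.
duration = {p: np.percentile(daily_q, 100 - p) for p in exceed_pcts}
print({p: round(q, 1) for p, q in list(duration.items())[:5]})
```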
Villodre, Celia; Rebasa, Pere; Estrada, José Luís; Zaragoza, Carmen; Zapater, Pedro; Mena, Luís; Lluís, Félix
2016-11-01
In a previous study, we found that Physiological and Operative Severity Score for the enUmeration of Mortality and Morbidity (POSSUM) overpredicts morbidity risk in emergency gastrointestinal surgery. Our aim was to find a POSSUM equation adjustment. A prospective observational study was performed on 2,361 patients presenting with a community-acquired gastrointestinal surgical emergency. The first 1,000 surgeries constituted the development cohort, the second 1,000 events were the first validation intramural cohort, and the remaining 361 cases belonged to a second validation extramural cohort. (1) A modified POSSUM equation was obtained. (2) Logistic regression was used to yield a statistically significant equation that included age, hemoglobin, white cell count, sodium and operative severity. (3) A chi-square automatic interaction detector decision tree analysis yielded a statistically significant equation with 4 variables, namely cardiac failure, sodium, operative severity, and peritoneal soiling. A modified POSSUM equation and a simplified scoring system (aLicante sUrgical Community Emergencies New Tool for the enUmeration of Morbidities [LUCENTUM]) are described. Both tools significantly improve prediction of surgical morbidity in community-acquired gastrointestinal surgical emergencies. Copyright © 2016 Elsevier Inc. All rights reserved.
Chuprom, Julalak; Bovornreungroj, Preeyanuch; Ahmad, Mehraj; Kantachote, Duangporn; Dueramae, Sawitree
2016-06-01
A new potent halophilic protease producer, Halobacterium sp. strain LBU50301, was isolated from salt-fermented fish samples (budu) and identified by phenotypic analysis and 16S rDNA gene sequencing. Thereafter, a sequential statistical strategy was used to optimize halophilic protease production from Halobacterium sp. strain LBU50301 by shake-flask fermentation. The classical one-factor-at-a-time (OFAT) approach determined that gelatin was the best nitrogen source. Based on a Plackett-Burman (PB) experimental design, gelatin, MgSO4·7H2O, NaCl, and pH significantly influenced the halophilic protease production. A central composite design (CCD) determined the optimum levels of the medium components. Subsequently, an 8.78-fold increase in the corresponding halophilic protease yield (156.22 U/mL) was obtained, compared with that produced in the original medium (17.80 U/mL). Validation experiments proved the adequacy and accuracy of the model, and the results showed that the predicted values agreed well with the experimental values. An overall 13-fold increase in halophilic protease yield was achieved using a 3 L laboratory fermenter and the optimized medium (231.33 U/mL).
Alshelleh, Mohammad; Inamdar, Sumant; McKinley, Matthew; Stewart, Molly; Novak, Jeffrey S; Greenberg, Ronald E; Sultan, Keith; Devito, Bethany; Cheung, Mary; Cerulli, Maurice A; Miller, Larry S; Sejpal, Divyesh V; Vegesna, Anil K; Trindade, Arvind J
2018-02-02
Volumetric laser endomicroscopy (VLE) is a new wide-field advanced imaging technology for Barrett's esophagus (BE). No data exist on incremental yield of dysplasia detection. Our aim is to report the incremental yield of dysplasia detection in BE using VLE. This is a retrospective study from a prospectively maintained database from 2011 to 2017 comparing the dysplasia yield of 4 different surveillance strategies in an academic BE tertiary care referral center. The groups were (1) random biopsies (RB), (2) Seattle protocol random biopsies (SP), (3) VLE without laser marking (VLE), and (4) VLE with laser marking (VLEL). A total of 448 consecutive patients (79 RB, 95 SP, 168 VLE, and 106 VLEL) met the inclusion criteria. After adjusting for visible lesions, the total dysplasia yield was 5.7%, 19.6%, 24.8%, and 33.7%, respectively. When compared with just the SP group, the VLEL group had statistically higher rates of overall dysplasia yield (19.6% vs 33.7%, P = .03; odds ratio, 2.1, P = .03). Both the VLEL and VLE groups had statistically significant differences in neoplasia (high-grade dysplasia and intramucosal cancer) detection compared with the SP group (14% vs 1%, P = .001 and 11% vs 1%, P = .003). A surveillance strategy involving VLEL led to a statistically significant higher yield of dysplasia and neoplasia detection compared with a standard random biopsy protocol. These results support the use of VLEL for surveillance in BE in academic centers. Copyright © 2018 American Society for Gastrointestinal Endoscopy. Published by Elsevier Inc. All rights reserved.
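For a group comparison of this kind, the underlying 2x2 analysis can be sketched as follows; the integer counts are rough reconstructions from the reported rates and group sizes, not the study's exact tallies:

```python
from scipy.stats import fisher_exact

# Approximate counts [dysplasia, no dysplasia], reconstructed from the
# reported rates: ~33.7% of 106 (VLEL) vs ~19.6% of 95 (Seattle protocol).
vlel = [36, 70]
sp = [19, 76]

odds_ratio, p = fisher_exact([vlel, sp])
print(f"OR = {odds_ratio:.1f}, p = {p:.3f}")   # roughly matches the reported OR of 2.1
```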
Jabari, Hamidreza; Sami, Ramin; Fakhri, Mohammad; Kiani, Arda
2012-01-01
Forceps biopsy is the standard procedure to obtain specimens from endobronchial lesions. New studies have proposed the flexible cryoprobe as an accepted alternative to this technique. Although the diagnostic use of cryobiopsy has been confirmed in a few studies, there is a paucity of data regarding an optimum protocol for this method, since one of the main considerations in cryobiopsy is the freezing time. Our aims were to evaluate the diagnostic yield and safety of endobronchial biopsies using the flexible cryoprobe, and to assess different freezing times in order to propose an optimized protocol for this diagnostic modality. For each patient with a confirmed intrabronchial lesion, the diagnostic value of forceps biopsy, cryobiopsy with a three-second freezing time, cryobiopsy with a five-second freezing time, and the combined results of cryobiopsy at both timings were recorded. A total of 60 patients (39 males and 21 females; mean age 56.7 ± 13.3) were included. Specimens obtained by cryobiopsy in five seconds were significantly larger than those of forceps biopsy and cryobiopsy in three seconds (p < 0.001). We showed that the diagnostic yields achieved by the three methods were not statistically different (p > 0.05). Simultaneous use of the samples produced by both cryobiopsy timings significantly improved the diagnostic yield (p = 0.02). Statistical analysis showed no significant differences in bleeding frequency among the three sampling methods. This study confirmed the safety and feasibility of cryobiopsy. Additionally, combining sampling at two different cold-induction timings would significantly increase the sensitivity of this emerging technique.
Tammimies, Kristiina; Marshall, Christian R; Walker, Susan; Kaur, Gaganjot; Thiruvahindrapuram, Bhooma; Lionel, Anath C; Yuen, Ryan K C; Uddin, Mohammed; Roberts, Wendy; Weksberg, Rosanna; Woodbury-Smith, Marc; Zwaigenbaum, Lonnie; Anagnostou, Evdokia; Wang, Zhuozhi; Wei, John; Howe, Jennifer L; Gazzellone, Matthew J; Lau, Lynette; Sung, Wilson W L; Whitten, Kathy; Vardy, Cathy; Crosbie, Victoria; Tsang, Brian; D'Abate, Lia; Tong, Winnie W L; Luscombe, Sandra; Doyle, Tyna; Carter, Melissa T; Szatmari, Peter; Stuckless, Susan; Merico, Daniele; Stavropoulos, Dimitri J; Scherer, Stephen W; Fernandez, Bridget A
2015-09-01
The use of genome-wide tests to provide molecular diagnosis for individuals with autism spectrum disorder (ASD) requires more study. To perform chromosomal microarray analysis (CMA) and whole-exome sequencing (WES) in a heterogeneous group of children with ASD to determine the molecular diagnostic yield of these tests in a sample typical of a developmental pediatric clinic. The sample consisted of 258 consecutively ascertained unrelated children with ASD who underwent detailed assessments to define morphology scores based on the presence of major congenital abnormalities and minor physical anomalies. The children were recruited between 2008 and 2013 in Newfoundland and Labrador, Canada. The probands were stratified into 3 groups of increasing morphological severity: essential, equivocal, and complex (scores of 0-3, 4-5, and ≥6). All probands underwent CMA, with WES performed for 95 proband-parent trios. The main outcome was the overall molecular diagnostic yield for CMA and WES in a population-based ASD sample stratified into 3 phenotypic groups. Of 258 probands, 24 (9.3%; 95% CI, 6.1%-13.5%) received a molecular diagnosis from CMA and 8 of 95 (8.4%; 95% CI, 3.7%-15.9%) from WES. The yields were statistically different between the morphological groups. Among the children who underwent both CMA and WES testing, the estimated proportion with an identifiable genetic etiology was 15.8% (95% CI, 9.1%-24.7%; 15/95 children). This included 2 children who received molecular diagnoses from both tests. The combined yield was significantly higher in the complex group when compared with the essential group (pairwise comparison, P = .002). Among a heterogeneous sample of children with ASD, the molecular diagnostic yields of CMA and WES were comparable, and the combined molecular diagnostic yield was higher in children with more complex morphological phenotypes in comparison with the children in the essential category. If replicated in additional populations, these findings may inform appropriate selection of molecular diagnostic testing for children affected by ASD.
Weinfurtner, R Jared; Patel, Bhavika; Laronga, Christine; Lee, Marie C; Falcon, Shannon L; Mooney, Blaise P; Yue, Binglin; Drukteinis, Jennifer S
2015-06-01
Analysis of magnetic resonance imaging-guided breast biopsies yielding high-risk histopathologic features at a single institution found an overall upstage rate to malignancy of 14% at surgical excision. All upstaged lesions were associated with atypical ductal hyperplasia. Flat epithelial atypia and atypical lobular hyperplasia alone or with lobular carcinoma in situ were not associated with an upstage to malignancy. The purpose of the present study was to determine the malignancy upstage rates and imaging features of high-risk histopathologic findings resulting from magnetic resonance imaging (MRI)-guided core needle breast biopsies. These features include atypical ductal hyperplasia (ADH), atypical lobular hyperplasia (ALH), flat epithelial atypia (FEA), and lobular carcinoma in situ (LCIS). A retrospective medical record review was performed on all MRI-guided core needle breast biopsies at a single institution from June 1, 2007 to December 1, 2013 to select biopsies yielding high-risk histopathologic findings. The patient demographics, MRI lesion characteristics, and histopathologic features at biopsy and surgical excision were analyzed. A total of 257 MRI-guided biopsies had been performed, and 50 yielded high-risk histopathologic features (19%). Biopsy site and surgical excision site correlation was confirmed in 29 of 50 cases. Four of 29 lesions (14%) were upstaged: 1 case to invasive ductal carcinoma and 3 cases to ductal carcinoma in situ. ADH alone had an overall upstage rate of 7% (1 of 14), mixed ADH/ALH a rate of 75% (3 of 4), ALH alone or with LCIS a rate of 0% (0 of 7), and FEA a rate of 0% (0 of 4). Only mixed ADH/ALH had a statistically significant upstage rate to malignancy compared with the other high-risk histopathologic subtypes combined. No specific imaging characteristics on MRI were associated with an upstage to malignancy on the statistical analysis. MRI-guided breast biopsies yielding high-risk histopathologic features were associated with an overall upstage to malignancy rate of 14% at surgical excision. All upstaged lesions were associated with ADH. FEA and ALH alone or with LCIS were not associated with an upstage to malignancy. Copyright © 2015 Elsevier Inc. All rights reserved.
Kinetic analysis of single molecule FRET transitions without trajectories
NASA Astrophysics Data System (ADS)
Schrangl, Lukas; Göhring, Janett; Schütz, Gerhard J.
2018-03-01
Single-molecule Förster resonance energy transfer (smFRET) is a popular tool to study biological systems that undergo topological transitions on the nanometer scale. smFRET experiments typically require recording of long smFRET trajectories and subsequent statistical analysis to extract parameters such as the states' lifetimes. Alternatively, analysis of probability distributions exploits the shapes of smFRET distributions at well-chosen exposure times and hence works without the acquisition of time traces. Here, we describe a variant that utilizes statistical tests to compare experimental datasets with Monte Carlo simulations. For a given model, parameters are varied to cover the full realistic parameter space. As output, the method yields p-values which quantify the likelihood for each parameter setting to be consistent with the experimental data. The method provides suitable results even if the actual lifetimes differ by an order of magnitude. We also demonstrate the robustness of the method to inaccurately determined input parameters. As proof of concept, the new method was applied to the determination of transition rate constants for Holliday junctions.
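A compact sketch of this simulate-and-test idea for a two-state system follows; the kinetic scheme, rates, exposure time, and FRET efficiencies are hypothetical, and a two-sample KS test stands in for whatever test statistic the authors chose:

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(12)

def simulate_fret(k12, k21, exposure, n=2000, e1=0.3, e2=0.8):
    """Per-frame apparent FRET for a hypothetical two-state system:
    the fraction of the exposure spent in state 2 mixes e1 and e2."""
    frac = np.empty(n)
    for i in range(n):
        t, acc = 0.0, 0.0
        in2 = rng.random() < k12 / (k12 + k21)   # stationary start
        while t < exposure:
            rate = k21 if in2 else k12
            dwell = min(rng.exponential(1 / rate), exposure - t)
            if in2:
                acc += dwell
            t += dwell
            in2 = not in2
        frac[i] = acc / exposure
    return e1 + (e2 - e1) * frac

observed = simulate_fret(k12=5.0, k21=3.0, exposure=0.1)   # stand-in "data"
p_values = {}
for k12 in (1.0, 5.0, 25.0):                               # scan parameter space
    sim = simulate_fret(k12=k12, k21=3.0, exposure=0.1)
    p_values[k12] = ks_2samp(observed, sim).pvalue
print(p_values)   # high p-value: parameter setting consistent with the data
```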
Topographic ERP analyses: a step-by-step tutorial review.
Murray, Micah M; Brunet, Denis; Michel, Christoph M
2008-06-01
In this tutorial review, we detail both the rationale for as well as the implementation of a set of analyses of surface-recorded event-related potentials (ERPs) that uses the reference-free spatial (i.e. topographic) information available from high-density electrode montages to render statistical information concerning modulations in response strength, latency, and topography both between and within experimental conditions. In these and other ways these topographic analysis methods allow the experimenter to glean additional information and neurophysiologic interpretability beyond what is available from canonical waveform analyses. In this tutorial we present the example of somatosensory evoked potentials (SEPs) in response to stimulation of each hand to illustrate these points. For each step of these analyses, we provide the reader with both a conceptual and mathematical description of how the analysis is carried out, what it yields, and how to interpret its statistical outcome. We show that these topographic analysis methods are intuitive and easy-to-use approaches that can remove much of the guesswork often confronting ERP researchers and also assist in identifying the information contained within high-density ERP datasets.
Sales, D C; Rangel, A H N; Urbano, S A; Freitas, Alfredo R; Tonhati, Humberto; Novaes, L P; Pereira, M I B; Borba, L H F
2017-06-01
Our aim was to identify the relationships between mozzarella cheese yield and buffalo milk composition, processing factors, and recovery of whey constituents. The production of 30 batches of mozzarella cheese at a dairy plant in northeast Brazil (Rio Grande do Norte) was monitored between March and November 2015. Mozzarella yield and 32 other variables were observed for each batch and divided into 3 groups: milk composition variables (12), variables involved in the cheesemaking process (14), and variables for recovery of whey constituents (6). Data were analyzed using descriptive statistics, Pearson correlation, and principal component analysis. Most of the correlations between milk composition variables and between the variables of the manufacturing processes were not significant. Significant correlations were mostly observed between variables for recovery of whey constituents. Yield showed significant correlations only with the time elapsed between curd cuttings and the age of the starter culture, and it showed the greatest association with the age of the starter culture, the time elapsed between curd cuttings and during stretching, and milk pH and density. Thus, processing factors and milk characteristics are closely related to dairy efficiency in mozzarella manufacturing. Copyright © 2017 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Fishman, Jack; Creilson, John K.; Parker, Peter A.; Ainsworth, Elizabeth A.; Vining, G. Geoffrey; Szarka, John; Booker, Fitzgerald L.; Xu, Xiaojing
2010-01-01
Elevated concentrations of ground-level ozone (O3) are frequently measured over farmland regions in many parts of the world. While numerous experimental studies show that O3 can significantly decrease crop productivity, independent verifications of yield losses at current ambient O3 concentrations in rural locations are sparse. In this study, soybean crop yield data during a 5-year period over the Midwest of the United States were combined with ground and satellite O3 measurements to provide evidence that yield losses on the order of 10% could be estimated through the use of a multiple linear regression model. Yield loss trends based on both conventional ground-based instrumentation and satellite-derived tropospheric O3 measurements were statistically significant and were consistent with results obtained from open-top chamber experiments and an open-air experimental facility (SoyFACE, Soybean Free Air Concentration Enrichment) in central Illinois. Our analysis suggests that such losses are a relatively new phenomenon due to the increase in background tropospheric O3 levels over recent decades. Extrapolation of these findings supports previous studies that estimate the global economic loss to the farming community of more than $10 billion annually.
First observation of the decays $\chi_{cJ} \to \pi^0\pi^0\pi^0\pi^0$
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ablikim, M.; An, Z. H.; Bai, J. Z.
We present a study of the P-wave spin-triplet charmonium $\chi_{cJ}$ decays (J = 0, 1, 2) into $\pi^0\pi^0\pi^0\pi^0$. The analysis is based on $106 \times 10^{6}$ $\psi'$ decays recorded with the BESIII detector at the BEPCII electron-positron collider. The decay into the $\pi^0\pi^0\pi^0\pi^0$ hadronic final state is observed for the first time. We measure the branching fractions $\mathcal{B}(\chi_{c0} \to \pi^0\pi^0\pi^0\pi^0) = (3.34 \pm 0.06 \pm 0.44) \times 10^{-3}$, $\mathcal{B}(\chi_{c1} \to \pi^0\pi^0\pi^0\pi^0) = (0.57 \pm 0.03 \pm 0.08) \times 10^{-3}$, and $\mathcal{B}(\chi_{c2} \to \pi^0\pi^0\pi^0\pi^0) = (1.21 \pm 0.05 \pm 0.16) \times 10^{-3}$, where the uncertainties are statistical and systematic, respectively.
Extending local canonical correlation analysis to handle general linear contrasts for FMRI data.
Jin, Mingwu; Nandy, Rajesh; Curran, Tim; Cordes, Dietmar
2012-01-01
Local canonical correlation analysis (CCA) is a multivariate method that has been proposed to more accurately determine activation patterns in fMRI data. In its conventional formulation, CCA has several drawbacks that limit its usefulness in fMRI. A major drawback is that, unlike the general linear model (GLM), a test of general linear contrasts of the temporal regressors has not been incorporated into the CCA formalism. To overcome this drawback, a novel directional test statistic was derived using the equivalence of multivariate multiple regression (MVMR) and CCA. This extension will allow CCA to be used for inference of general linear contrasts in more complicated fMRI designs without reparameterization of the design matrix and without reestimating the CCA solutions for each particular contrast of interest. With the proper constraints on the spatial coefficients of CCA, this test statistic can yield a more powerful test on the inference of evoked brain regional activations from noisy fMRI data than the conventional t-test in the GLM. The quantitative results from simulated and pseudoreal data and activation maps from fMRI data were used to demonstrate the advantage of this novel test statistic.
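For orientation, a conventional local CCA between a voxel neighborhood and temporal regressors (without the paper's directional contrast statistic, which goes beyond this sketch) runs as follows on synthetic data:

```python
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(13)
n_scans = 200
design = rng.normal(0, 1, size=(n_scans, 2))        # temporal regressors
# A 3x3 voxel neighborhood whose center voxel responds to regressor 1.
neighborhood = rng.normal(0, 1, size=(n_scans, 9))
neighborhood[:, 4] += 1.5 * design[:, 0]

cca = CCA(n_components=1).fit(neighborhood, design)
u, v = cca.transform(neighborhood, design)
r = np.corrcoef(u[:, 0], v[:, 0])[0, 1]             # first canonical correlation
print(f"canonical correlation = {r:.2f}")
```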
Physics-based statistical learning approach to mesoscopic model selection.
Taverniers, Søren; Haut, Terry S; Barros, Kipton; Alexander, Francis J; Lookman, Turab
2015-11-01
In materials science and many other research areas, models are frequently inferred without considering their generalization to unseen data. We apply statistical learning using cross-validation to obtain an optimally predictive coarse-grained description of a two-dimensional kinetic nearest-neighbor Ising model with Glauber dynamics (GD) based on the stochastic Ginzburg-Landau equation (sGLE). The latter is learned from GD "training" data using a log-likelihood analysis, and its predictive ability for various complexities of the model is tested on GD "test" data independent of the data used to train the model. Using two different error metrics, we perform a detailed analysis of the error between magnetization time trajectories simulated using the learned sGLE coarse-grained description and those obtained using the GD model. We show that both for equilibrium and out-of-equilibrium GD training trajectories, the standard phenomenological description using a quartic free energy does not always yield the most predictive coarse-grained model. Moreover, increasing the amount of training data can shift the optimal model complexity to higher values. Our results are promising in that they pave the way for the use of statistical learning as a general tool for materials modeling and discovery.
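The cross-validation logic, scoring candidate model complexities on held-out data, can be shown in miniature with a polynomial stand-in for the free-energy complexity rather than the actual sGLE inference:

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(14)
m = rng.uniform(-1, 1, size=(300, 1))               # "training" magnetizations
# Toy coarse-grained force with structure beyond a quartic free energy.
f = -m[:, 0] + 2.0 * m[:, 0]**3 - 1.5 * m[:, 0]**5 + rng.normal(0, 0.05, 300)

for degree in (3, 5, 7):                            # candidate model complexities
    model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
    score = cross_val_score(model, m, f, cv=5, scoring="neg_mean_squared_error")
    print(degree, -score.mean())                    # lowest CV error wins
```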
A Statistical Approach for Testing Cross-Phenotype Effects of Rare Variants
Broadaway, K. Alaine; Cutler, David J.; Duncan, Richard; Moore, Jacob L.; Ware, Erin B.; Jhun, Min A.; Bielak, Lawrence F.; Zhao, Wei; Smith, Jennifer A.; Peyser, Patricia A.; Kardia, Sharon L.R.; Ghosh, Debashis; Epstein, Michael P.
2016-01-01
Increasing empirical evidence suggests that many genetic variants influence multiple distinct phenotypes. When cross-phenotype effects exist, multivariate association methods that consider pleiotropy are often more powerful than univariate methods that model each phenotype separately. Although several statistical approaches exist for testing cross-phenotype effects for common variants, there is a lack of similar tests for gene-based analysis of rare variants. In order to fill this important gap, we introduce a statistical method for cross-phenotype analysis of rare variants using a nonparametric distance-covariance approach that compares similarity in multivariate phenotypes to similarity in rare-variant genotypes across a gene. The approach can accommodate both binary and continuous phenotypes and further can adjust for covariates. Our approach yields a closed-form test whose significance can be evaluated analytically, thereby improving computational efficiency and permitting application on a genome-wide scale. We use simulated data to demonstrate that our method, which we refer to as the Gene Association with Multiple Traits (GAMuT) test, provides increased power over competing approaches. We also illustrate our approach using exome-chip data from the Genetic Epidemiology Network of Arteriopathy. PMID:26942286
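The core of the approach is a comparison of two similarity matrices, one built from the multivariate phenotypes and one from the rare-variant genotypes. The sketch below illustrates that idea with a linear-kernel, doubly centered similarity statistic on synthetic data and a permutation p-value; the actual GAMuT test instead has a closed form whose significance is evaluated analytically, so everything here is a simplified stand-in.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p, g = 200, 3, 10                      # subjects, phenotypes, rare variants
Y = rng.normal(size=(n, p))               # multivariate phenotypes
G = (rng.random(size=(n, g)) < 0.02).astype(float)  # rare-variant genotypes

def centered_similarity(M):
    """Linear-kernel similarity matrix between subjects, doubly centered."""
    K = M @ M.T
    H = np.eye(len(K)) - np.ones_like(K) / len(K)
    return H @ K @ H

def similarity_stat(Y, G):
    """Trace inner product of centered phenotype and genotype similarity."""
    return np.sum(centered_similarity(Y) * centered_similarity(G))

obs = similarity_stat(Y, G)
perms = [similarity_stat(Y[rng.permutation(n)], G) for _ in range(999)]
pval = (1 + sum(s >= obs for s in perms)) / 1000
print(f"statistic={obs:.2f}, permutation p={pval:.3f}")
```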
Can We Spin Straw Into Gold? An Evaluation of Immigrant Legal Status Imputation Approaches
Van Hook, Jennifer; Bachmeier, James D.; Coffman, Donna; Harel, Ofer
2014-01-01
Researchers have developed logical, demographic, and statistical strategies for imputing immigrants’ legal status, but these methods have never been empirically assessed. We used Monte Carlo simulations to test whether, and under what conditions, legal status imputation approaches yield unbiased estimates of the association of unauthorized status with health insurance coverage. We tested five methods under a range of missing data scenarios. Logical and demographic imputation methods yielded biased estimates across all missing data scenarios. Statistical imputation approaches yielded unbiased estimates only when unauthorized status was jointly observed with insurance coverage; when this condition was not met, these methods overestimated insurance coverage for unauthorized relative to legal immigrants. We next showed how bias can be reduced by incorporating prior information about unauthorized immigrants. Finally, we demonstrated the utility of the best-performing statistical method for increasing power. We used it to produce state/regional estimates of insurance coverage among unauthorized immigrants in the Current Population Survey, a data source that contains no direct measures of immigrants’ legal status. We conclude that commonly employed legal status imputation approaches are likely to produce biased estimates, but data and statistical methods exist that could substantially reduce these biases. PMID:25511332
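A compressed sketch of the kind of Monte Carlo check described above: simulate a population in which unauthorized status depresses insurance coverage, impute status from a model that either does or does not jointly observe coverage, and compare the estimated association with the truth. The data-generating values and the sklearn-based imputation are assumptions for illustration only, not the authors' simulation design.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(11)
n = 20_000

# Simulated immigrants: covariate, true legal status, insurance coverage
educ = rng.normal(size=n)
unauth = rng.random(n) < 1 / (1 + np.exp(-(-0.5 - 0.8 * educ)))
insured = rng.random(n) < 1 / (1 + np.exp(-(0.5 + 0.6 * educ - 1.2 * unauth)))

true_log_or = -1.2  # true log odds ratio of unauthorized status on coverage

def estimated_effect(status):
    """Log-OR of (possibly imputed) unauthorized status on insurance coverage."""
    X = np.column_stack([educ, status.astype(float)])
    return LogisticRegression().fit(X, insured).coef_[0, 1]

# Imputation model that jointly observes coverage vs one that does not
donor_joint = np.column_stack([educ, insured.astype(float)])
donor_nojoint = educ.reshape(-1, 1)

for name, donor_X in [("status ~ educ + insured", donor_joint),
                      ("status ~ educ only     ", donor_nojoint)]:
    model = LogisticRegression().fit(donor_X, unauth)
    imputed = rng.random(n) < model.predict_proba(donor_X)[:, 1]
    print(f"{name}: est. log-OR {estimated_effect(imputed):+.2f} (truth {true_log_or:+.2f})")
print(f"fully observed status: {estimated_effect(unauth):+.2f}")
```

When coverage is excluded from the imputation model, the imputed status is independent of coverage given the covariate, so the estimated association collapses toward zero, the bias pattern the abstract describes.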
Ebshish, Ali; Yaakob, Zahira; Taufiq-Yap, Yun Hin; Bshish, Ahmed
2014-03-19
In this work, a response surface methodology (RSM) was implemented to investigate the process variables in a hydrogen production system. The effects of five independent variables, namely the temperature (X₁), the flow rate (X₂), the catalyst weight (X₃), the catalyst loading (X₄) and the glycerol-water molar ratio (X₅), on the H₂ yield (Y₁) and the conversion of glycerol to gaseous products (Y₂) were explored. Using multiple regression analysis, the experimental results of the H₂ yield and the glycerol conversion to gases were fit to quadratic polynomial models. The proposed mathematical models correlated the dependent factors well within the limits examined. The best values of the process variables were a temperature of approximately 600 °C, a feed flow rate of 0.05 mL/min, a catalyst weight of 0.2 g, a catalyst loading of 20% and a glycerol-water molar ratio of approximately 12, where the H₂ yield was predicted to be 57.6% and the conversion of glycerol was predicted to be 75%. To validate the proposed models, statistical analysis using a two-sample t-test was performed, and the results showed that the models could predict the responses satisfactorily within the limits of the variables that were studied.
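As a worked illustration of the RSM machinery the abstract describes, the sketch below fits a full second-order (quadratic) polynomial by least squares and solves for the stationary point of the fitted surface. It uses two synthetic factors in coded units rather than the study's five variables; the data and coefficients are assumptions, not the paper's measurements.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(2)

# Synthetic two-factor example (the study used five factors); coded levels -1..1
X = rng.uniform(-1, 1, size=(30, 2))
y = 55 + 4*X[:, 0] - 3*X[:, 1] - 5*X[:, 0]**2 - 2*X[:, 1]**2 \
    + 1.5*X[:, 0]*X[:, 1] + rng.normal(0, 0.5, 30)   # mock "H2 yield" response

def quadratic_design(X):
    """Columns: 1, x_i, x_i^2, x_i*x_j -- the full second-order RSM model."""
    cols = [np.ones(len(X))]
    cols += [X[:, i] for i in range(X.shape[1])]
    cols += [X[:, i]**2 for i in range(X.shape[1])]
    cols += [X[:, i]*X[:, j] for i, j in combinations(range(X.shape[1]), 2)]
    return np.column_stack(cols)

D = quadratic_design(X)
beta, *_ = np.linalg.lstsq(D, y, rcond=None)

# Stationary point: solve grad(y) = 0 for the fitted quadratic surface
b = beta[1:3]                                   # linear terms
B = np.array([[2*beta[3], beta[5]],
              [beta[5],  2*beta[4]]])           # Hessian of the fitted surface
x_opt = np.linalg.solve(B, -b)
print("stationary point (coded units):", x_opt.round(3))
```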
Universal Recurrence Time Statistics of Characteristic Earthquakes
NASA Astrophysics Data System (ADS)
Goltz, C.; Turcotte, D. L.; Abaimov, S.; Nadeau, R. M.
2006-12-01
Characteristic earthquakes are defined to occur quasi-periodically on major faults. Do recurrence time statistics of such earthquakes follow a particular statistical distribution? If so, which one? The answer is fundamental and has important implications for hazard assessment. The problem cannot be solved by comparing the goodness of statistical fits, as the available sequences are too short. The Parkfield sequence of M ≍ 6 earthquakes, one of the most extensive reliable data sets available, has grown to merely seven events with the last earthquake in 2004, for example. Recently, however, advances in seismological monitoring and improved processing methods have unveiled so-called micro-repeaters, micro-earthquakes which recur at exactly the same location on a fault. It seems plausible to regard these earthquakes as a miniature version of the classic characteristic earthquakes. Micro-repeaters are much more frequent than major earthquakes, leading to longer sequences for analysis. Due to their recent discovery, however, available sequences contain fewer than 20 events at present. In this paper we present results for the analysis of recurrence times for several micro-repeater sequences from Parkfield and adjacent regions. To improve the statistical significance of our findings, we combine several sequences into one by rescaling the individual sets by their respective mean recurrence intervals and Weibull exponents. This novel approach of rescaled combination yields the most extensive data set possible. We find that the resulting statistics can be fitted well by an exponential distribution, confirming the universal applicability of the Weibull distribution to characteristic earthquakes. A similar result is obtained, however, when the rescaled combination is performed with regard to the lognormal distribution.
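A hedged sketch of the rescaled-combination idea on synthetic data: each short recurrence-time sequence is divided by its own mean interval, the dimensionless times are pooled, and a Weibull distribution is fitted to the combined set. The authors additionally rescale by the sequences' Weibull exponents; that refinement is omitted here for brevity, and all numbers are synthetic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Several short recurrence-time sequences (synthetic stand-ins for micro-repeaters)
sequences = [stats.weibull_min.rvs(1.8, scale=s, size=15, random_state=rng)
             for s in (0.5, 2.0, 7.5)]

# Rescale each sequence by its own mean recurrence interval, then pool
pooled = np.concatenate([seq / seq.mean() for seq in sequences])

# Fit a Weibull to the combined, dimensionless recurrence times
shape, loc, scale = stats.weibull_min.fit(pooled, floc=0)
ks = stats.kstest(pooled, 'weibull_min', args=(shape, 0, scale))
print(f"Weibull shape={shape:.2f}, scale={scale:.2f}, KS p={ks.pvalue:.2f}")
```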
Monte Carlo simulation: Its status and future
DOE Office of Scientific and Technical Information (OSTI.GOV)
Murtha, J.A.
1997-04-01
Monte Carlo simulation is a statistics-based analysis tool that yields probability-vs.-value relationships for key parameters, including oil and gas reserves, capital exposure, and various economic yardsticks, such as net present value (NPV) and return on investment (ROI). Monte Carlo simulation is a part of risk analysis and is sometimes performed in conjunction with or as an alternative to decision tree analysis. The objectives are (1) to define Monte Carlo simulation in a more general context of risk and decision analysis; (2) to provide some specific applications, which can be interrelated; (3) to respond to some of the criticisms; (4) to offer some cautions about abuses of the method and recommend how to avoid the pitfalls; and (5) to predict what the future has in store.
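A minimal sketch of the workflow the author describes, for a toy single-period project: sample uncertain inputs, propagate them through an economic model, and read off probability-vs.-value results such as P10/P50/P90 NPV. The distributions and the one-line NPV model are illustrative assumptions, not recommended parameters.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100_000

# Uncertain inputs (illustrative distributions, not field-calibrated)
reserves  = rng.lognormal(mean=np.log(2.0), sigma=0.4, size=n)   # MMbbl
price     = rng.normal(60, 10, size=n)                           # $/bbl
capex     = rng.triangular(80, 100, 140, size=n)                 # $MM
disc_rate = 0.10

# Toy single-period NPV model: revenue next year minus capital spent today
npv = reserves * price / (1 + disc_rate) - capex

p10, p50, p90 = np.percentile(npv, [10, 50, 90])
print(f"P10={p10:.0f}  P50={p50:.0f}  P90={p90:.0f}  ($MM)")
print(f"P(NPV < 0) = {(npv < 0).mean():.2%}")
```

The output is the probability-vs.-value relationship the abstract refers to: a full NPV distribution rather than a single deterministic estimate.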
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aubert, Bernard; Bona, M.; Karyotakis, Y.
2008-08-01
The authors present preliminary results of improved measurements of the CP-violating asymmetries and branching fractions in the decays B⁰ → π⁺π⁻, B⁰ → K⁺π⁻, B⁰ → π⁰π⁰, and B⁰ → K⁰π⁰. This update includes all data taken at the Υ(4S) resonance by the BABAR experiment at the asymmetric PEP-II B-meson factory at SLAC, corresponding to 467 ± 5 million BB̄ pairs. They find S_ππ = −0.68 ± 0.10 ± 0.03, C_ππ = −0.25 ± 0.08 ± 0.02, A_Kπ = −0.107 ± 0.016 (+0.006/−0.004), C_π⁰π⁰ = −0.43 ± 0.26 ± 0.05, B(B⁰ → π⁰π⁰) = (1.83 ± 0.21 ± 0.13) × 10⁻⁶, and B(B⁰ → K⁰π⁰) = (10.1 ± 0.6 ± 0.4) × 10⁻⁶, where the first error is statistical and the second is systematic. They observe CP violation with a significance of 6.7σ in B⁰ → π⁺π⁻ and 6.1σ in B⁰ → K⁺π⁻. Constraints on the Unitarity Triangle angle α are determined from the isospin relation between all B → ππ rates and asymmetries.
NASA Astrophysics Data System (ADS)
Glennie, Erin; Anyamba, Assaf
2018-06-01
A time series of Advanced Very High Resolution Radiometer (AVHRR) derived normalized difference vegetation index (NDVI) data were compared to National Agricultural Statistics Service (NASS) corn yield data in the United States Corn Belt from 1982 to 2014. The main objectives of the comparison were to assess 1) the consistency of regional Corn Belt responses to El Niño/Southern Oscillation (ENSO) teleconnection signals, and 2) the reliability of using NDVI as an indicator of crop yield. Regional NDVI values were used to model a seasonal curve and to define the growing season - May to October. Seasonal conditions in each county were represented by NDVI and land surface temperature (LST) composites, and corn yield was represented by average annual bushels produced per acre. Correlation analysis between the NDVI, LST, corn yield, and equatorial Pacific sea surface temperature anomalies revealed patterns in land surface dynamics and corn yield, as well as typical impacts of ENSO episodes. It was observed from the study that growing seasons coincident with La Niña events were consistently warmer, but El Niño events did not consistently impact NDVI, temperature, or corn yield data. Moreover, the El Niño and La Niña composite images suggest that impacts vary spatially across the Corn Belt. While corn is the dominant crop in the region, some inconsistencies between corn yield and NDVI may be attributed to soy crops and other background interference. The overall correlation between the total growing season NDVI anomaly and detrended corn yield was 0.61 (p = 0.00013), though the strength of the relationship varies across the Corn Belt.
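The key statistical step, detrending yield before correlating it with the NDVI anomaly, can be sketched in a few lines. The synthetic numbers below are placeholders; only the procedure (linear detrending to remove the technology trend, then Pearson correlation) mirrors the analysis described.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
years = np.arange(1982, 2015)

# Synthetic stand-ins: trending corn yield plus NDVI-linked variability
ndvi_anom = rng.normal(0, 1, len(years))                 # seasonal NDVI anomaly
yield_bu = 90 + 1.8 * (years - years[0]) + 8 * ndvi_anom + rng.normal(0, 6, len(years))

# Remove the long-term trend with a linear fit, then correlate the residuals
slope, intercept, *_ = stats.linregress(years, yield_bu)
yield_detrended = yield_bu - (slope * years + intercept)

r, p = stats.pearsonr(ndvi_anom, yield_detrended)
print(f"r = {r:.2f}, p = {p:.5f}")
```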
Statistical rice yield modeling using blended MODIS-Landsat based crop phenology metrics in Taiwan
NASA Astrophysics Data System (ADS)
Chen, C. R.; Chen, C. F.; Nguyen, S. T.; Lau, K. V.
2015-12-01
Taiwan is a populated island with a majority of residents settled in the western plains, where soils are suitable for rice cultivation. Rice is not only the most important commodity, but also plays a critical role in agricultural and food marketing. Information on rice production is thus important for policymakers to devise timely plans for ensuring sustainable socioeconomic development. Because rice fields in Taiwan are generally small, and because crop monitoring requires information on crop phenology matched to the spatiotemporal resolution of satellite data, this study used Landsat-MODIS fusion data for rice yield modeling in Taiwan. We processed the data for the first crop (Feb-Mar to Jun-Jul) and the second crop (Aug-Sep to Nov-Dec) in 2014 through five main steps: (1) data pre-processing to account for geometric and radiometric errors of Landsat data, (2) Landsat-MODIS data fusion using the spatial-temporal adaptive reflectance fusion model, (3) construction of the smooth time-series enhanced vegetation index 2 (EVI2), (4) rice yield modeling using EVI2-based crop phenology metrics, and (5) error verification. A comparison between EVI2 derived from the fusion images and EVI2 from the reference Landsat images indicated close agreement between the two datasets (R² > 0.8). We analysed the smooth EVI2 curves to extract phenology metrics, or phenological variables, for the establishment of rice yield models. The results indicated that the established yield models significantly explained more than 70% of the variability in the data (p-value < 0.001). The comparison between the estimated yields and the government's yield statistics for the first and second crops indicated a close significant relationship between the two datasets (R² > 0.8) in both cases. The root mean square error (RMSE) and mean absolute error (MAE) used to measure model accuracy revealed the consistency between the estimated yields and the government's yield statistics. This study demonstrates the advantages of using EVI2-based phenology metrics derived from Landsat-MODIS fusion data for rice yield estimation in Taiwan prior to the harvest period.
Phase Space Dissimilarity Measures for Structural Health Monitoring
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bubacz, Jacob A; Chmielewski, Hana T; Pape, Alexander E
A novel method for structural health monitoring (SHM), known as the Phase Space Dissimilarity Measures (PSDM) approach, is proposed and developed. The patented PSDM approach has already been developed and demonstrated for a variety of equipment and biomedical applications. Here, we investigate SHM of bridges via analysis of time-serial accelerometer measurements. This work has four aspects. The first is algorithm scalability, which was found to scale linearly from one processing core to four cores. Second, the same data are analyzed to determine how the use of the PSDM approach affects sensor placement. We found that a relatively low-density placement sufficiently captures the dynamics of the structure. Third, the same data are analyzed by unique combinations of accelerometer axes (vertical, longitudinal, and lateral with respect to the bridge) to determine how the choice of axes affects the analysis. The vertical axis is found to provide satisfactory SHM data. Fourth, statistical methods were investigated to validate the PSDM approach for this application, yielding statistically significant results.
Calibrating genomic and allelic coverage bias in single-cell sequencing.
Zhang, Cheng-Zhong; Adalsteinsson, Viktor A; Francis, Joshua; Cornils, Hauke; Jung, Joonil; Maire, Cecile; Ligon, Keith L; Meyerson, Matthew; Love, J Christopher
2015-04-16
Artifacts introduced in whole-genome amplification (WGA) make it difficult to derive accurate genomic information from single-cell genomes and require different analytical strategies from bulk genome analysis. Here, we describe statistical methods to quantitatively assess the amplification bias resulting from whole-genome amplification of single-cell genomic DNA. Analysis of single-cell DNA libraries generated by different technologies revealed universal features of the genome coverage bias predominantly generated at the amplicon level (1-10 kb). The magnitude of coverage bias can be accurately calibrated from low-pass sequencing (∼0.1 × ) to predict the depth-of-coverage yield of single-cell DNA libraries sequenced at arbitrary depths. We further provide a benchmark comparison of single-cell libraries generated by multi-strand displacement amplification (MDA) and multiple annealing and looping-based amplification cycles (MALBAC). Finally, we develop statistical models to calibrate allelic bias in single-cell whole-genome amplification and demonstrate a census-based strategy for efficient and accurate variant detection from low-input biopsy samples.
Prigge, R.; Micke, H.; Krüger, J.
1963-01-01
As part of a collaborative assay of the proposed Fifth International Standard for Gas-Gangrene Antitoxin (Perfringens), five ampoules of the proposed replacement material were assayed in the authors' laboratory against the then current Fourth International Standard. Both in vitro and in vivo methods were used. This paper presents the results and their statistical analysis. The two methods yielded different results which were not likely to have been due to chance, but exact statistical comparison is not possible. It is thought, however, that the differences may be due, at least in part, to differences in the relative proportions of zeta-antitoxin and alpha-antitoxin in the Fourth and Fifth International Standards and the consequent different reactions with the test toxin that was used for titration. PMID:14107746
Li, Yan; Shi, Zhou; Wu, Hao-Xiang; Li, Feng; Li, Hong-Yi
2013-10-01
The loss of cultivated land has increasingly become an issue of regional and national concern in China. Definition of management zones is an important measure to protect limited cultivated land resources. In this study, combined spatial data were applied to define management zones in Fuyang city, China. The yield of cultivated land was first calculated and evaluated and its spatial distribution pattern mapped; the limiting factors affecting the yield were then explored, and maps of their spatial variability were produced using geostatistical analysis. Data were jointly analyzed for management zone definition using a combination of principal component analysis and a fuzzy clustering method; two cluster validity functions were used to determine the optimal number of clusters. Finally, one-way analysis of variance was performed on 3,620 soil sampling points to assess how well the defined management zones reflected soil properties and productivity levels. It was shown that there existed great potential for increasing grain production, and that the amount of cultivated land played a key role in maintaining security of grain production. Organic matter, total nitrogen, available phosphorus, elevation, thickness of the plow layer, and probability of irrigation guarantee were the main limiting factors affecting the yield. The optimal number of management zones was three, and there were statistically significant differences in crop yield and field parameters between the defined management zones. Management zone I presented the highest potential crop yield, fertility level, and best agricultural production conditions, whereas management zone III presented the lowest. The study showed that the procedures used may be effective in automatically defining management zones; by developing different management zones, different strategies of cultivated land management and practice could be determined for each zone, which is of great importance for enhancing cultivated land conservation, stabilizing agricultural production, promoting sustainable use of cultivated land and guaranteeing food security.
Panayi, Efstathios; Peters, Gareth W; Kyriakides, George
2017-01-01
Quantifying the effects of environmental factors over the duration of the growing process on Agaricus Bisporus (button mushroom) yields has been difficult, as common functional data analysis approaches require fixed length functional data. The data available from commercial growers, however, is of variable duration, due to commercial considerations. We employ a recently proposed regression technique termed Variable-Domain Functional Regression in order to be able to accommodate these irregular-length datasets. In this way, we are able to quantify the contribution of covariates such as temperature, humidity and water spraying volumes across the growing process, and for different lengths of growing processes. Our results indicate that optimal oxygen and temperature levels vary across the growing cycle and we propose environmental schedules for these covariates to optimise overall yields.
Cehreli, S Burcak; Polat-Ozsoy, Omur; Sar, Cagla; Cubukcu, H Evren; Cehreli, Zafer C
2012-04-01
The amount of residual adhesive after bracket debonding is frequently assessed in a qualitative manner, utilizing the adhesive remnant index (ARI). This study aimed to investigate whether quantitative assessment of the adhesive remnant yields more precise results compared to qualitative methods utilizing the 4- and 5-point ARI scales. Twenty debonded brackets were selected. Evaluation and scoring of the adhesive remnant on bracket bases were made consecutively using: 1. qualitative assessment (visual scoring) and 2. quantitative measurement (image analysis) on digital photographs. Image analysis was made on scanning electron micrographs (SEM) and high-precision elemental maps of the adhesive remnant as determined by energy-dispersive X-ray spectrometry. Evaluations were made in accordance with the original 4-point and the modified 5-point ARI scales. Intra-class correlation coefficients (ICCs) were calculated, and the data were evaluated using the Friedman test followed by the Wilcoxon signed ranks test with Bonferroni correction. ICC statistics indicated high levels of agreement for qualitative visual scoring among examiners. The 4-point ARI scale was compliant with the SEM assessments but indicated significantly less adhesive remnant compared to the results of quantitative elemental mapping. When the 5-point scale was used, both quantitative techniques yielded similar results to those obtained qualitatively. These results indicate that qualitative visual scoring using the ARI is capable of generating similar results to those assessed by quantitative image analysis techniques. In particular, visual scoring with the 5-point ARI scale can yield similar results to both the SEM analysis and elemental mapping.
Iyer, Sneha R; Gogate, Parag R
2017-01-01
The current work investigates, for the first time, the application of low-intensity ultrasonic irradiation for improving the cooling crystallization of mefenamic acid. The crystal shape and size have been analyzed with the help of an optical microscope and image analysis software, respectively. The effect of ultrasonic irradiation on crystal size, particle size distribution (PSD) and yield has been investigated, also establishing a comparison with the conventional approach. It was observed that the application of ultrasound not only enhances the yield but also reduces the induction time for crystallization as compared to the conventional cooling crystallization technique. In the presence of ultrasound, the maximum yield was obtained at optimum conditions of 30 W power dissipation and 10 min of ultrasonic irradiation. The yield was further improved by applying ultrasound in cycles, where the formed crystals are allowed to grow in the absence of ultrasonic irradiation. The desired crystal morphology was also obtained for the ultrasound-assisted crystallization: the conventionally obtained needle-shaped crystals transformed into plate-shaped crystals. The particle size distribution was analyzed statistically on the basis of skewness and kurtosis values; the skewness and excess kurtosis values for ultrasound-assisted crystallization were significantly lower than for the conventional approach. XRD analysis also revealed better crystal properties for mefenamic acid processed using the ultrasound-assisted approach. The overall process intensification benefits of ultrasound-assisted mefenamic acid crystallization were reduced particle size, increased yield and uniform PSD coupled with the desired morphology.
A statistical approach to instrument calibration
Robert R. Ziemer; David Strauss
1978-01-01
Summary - It has been found that two instruments will yield different numerical values when used to measure identical points. A statistical approach is presented that can be used to approximate the error associated with the calibration of instruments. Included are standard statistical tests that can be used to determine if a number of successive calibrations of the...
Student and Professor Gender Effects in Introductory Business Statistics
ERIC Educational Resources Information Center
Haley, M. Ryan; Johnson, Marianne F.; Kuennen, Eric W.
2007-01-01
Studies have yielded highly mixed results as to differences in male and female student performance in statistics courses; the role that professors play in these differences is even less clear. In this paper, we consider the impact of professor and student gender on student performance in an introductory business statistics course taught by…
NASA Astrophysics Data System (ADS)
Licquia, Timothy C.; Newman, Jeffrey A.
2016-11-01
The exponential scale length (L_d) of the Milky Way's (MW's) disk is a critical parameter for describing the global physical size of our Galaxy, important both for interpreting other Galactic measurements and for helping us understand how our Galaxy fits into extragalactic contexts. Unfortunately, current estimates span a wide range of values and are often statistically incompatible with one another. Here, we perform a Bayesian meta-analysis to determine an improved, aggregate estimate for L_d, utilizing a mixture-model approach to account for the possibility that any one measurement has not properly accounted for all statistical or systematic errors. Within this machinery, we explore a variety of ways of modeling the nature of problematic measurements, and then employ a Bayesian model averaging technique to derive net posterior distributions that incorporate any model-selection uncertainty. Our meta-analysis combines 29 different (15 visible and 14 infrared) photometric measurements of L_d available in the literature; these involve a broad assortment of observational data sets, MW models and assumptions, and methodologies, all tabulated herein. Analyzing the visible and infrared measurements separately yields estimates for L_d of 2.71 (+0.22/−0.20) kpc and 2.51 (+0.15/−0.13) kpc, respectively, whereas considering them all combined yields 2.64 ± 0.13 kpc. The ratio between the visible and infrared scale lengths determined here is very similar to that measured in external spiral galaxies. We use these results to update the model of the Galactic disk from our previous work, constraining its stellar mass to be 4.8 (+1.5/−1.1) × 10¹⁰ M⊙, and the MW's total stellar mass to be 5.7 (+1.5/−1.1) × 10¹⁰ M⊙.
NASA Technical Reports Server (NTRS)
George, Kerry; Wu, Honglu; Willingham, Veronica; Cucinotta, Francis A.
2002-01-01
High-LET radiation is more efficient in producing complex-type chromosome exchanges than sparsely ionizing radiation, and this can potentially be used as a biomarker of radiation quality. To investigate if complex chromosome exchanges are induced by the high-LET component of space radiation exposure, damage was assessed in astronauts' blood lymphocytes before and after long duration missions of 3-4 months. The frequency of simple translocations increased significantly for most of the crewmembers studied. However, there were few complex exchanges detected and only one crewmember had a significant increase after flight. It has been suggested that the yield of complex chromosome damage could be underestimated when analyzing metaphase cells collected at one time point after irradiation, and analysis of chemically-induced PCC may be more accurate since problems with complicated cell-cycle delays are avoided. However, in this case the yields of chromosome damage were similar for metaphase and PCC analysis of astronauts' lymphocytes. It appears that the use of complex-type exchanges as biomarker of radiation quality in vivo after low-dose chronic exposure in mixed radiation fields is hampered by statistical uncertainties.
Ngugi, Henry K; Esker, Paul D; Scherm, Harald
2011-01-01
The continuing exponential increase in scientific knowledge, the growing availability of large databases containing raw or partially annotated information, and the increased need to document impacts of large-scale research and funding programs provide a great incentive for integrating and adding value to previously published (or unpublished) research through quantitative synthesis. Meta-analysis has become the standard for quantitative evidence synthesis in many disciplines, offering a broadly accepted and statistically powerful framework for estimating the magnitude, consistency, and homogeneity of the effect of interest across studies. Here, we review previous and current uses of meta-analysis in plant pathology with a focus on applications in epidemiology and disease management. About a dozen formal meta-analyses have been published in the plant pathological literature in the past decade, and several more are currently in progress. Three broad research questions have been addressed, the most common being the comparative efficacy of chemical treatments for managing disease and reducing yield loss across environments. The second most common application has been the quantification of relationships between disease intensity and yield, or between different measures of disease, across studies. Lastly, meta-analysis has been applied to assess factors affecting pathogen-biocontrol agent interactions or the effectiveness of biological control of plant disease or weeds. In recent years, fixed-effects meta-analysis has been largely replaced by random- (or mixed-) effects analysis owing to the statistical benefits associated with the latter and the wider availability of computer software to conduct these analyses. Another recent trend has been the more common use of multivariate meta-analysis or meta-regression to analyze the impacts of study-level independent variables (moderator variables) on the response of interest. The application of meta-analysis to practical problems in epidemiology and disease management is illustrated with case studies from our work on Phakopsora pachyrhizi on soybean and Erwinia amylovora on apple. We show that although meta-analyses are often used to corroborate and validate general conclusions drawn from more traditional, qualitative reviews, they can also reveal new patterns and interpretations not obvious from individual studies.
Theoretical Studies of Kinetic Mechanisms of Negative Ion Formation in Plasmas.
1987-06-01
[Figure-list residue from the report's front matter; recoverable titles: 1. Long-Range Behavior of Excited States of Li₂; 2. Long-Range Behavior of Excited States of Li₂.] ... yields a statistically better fit, with χ² = 0.002 as compared to χ² = 0.01 for the Ceperley and Partridge potential (Ref. 24). A significantly ... including those reported by Jordan and Amdur (Ref. 37), yield significantly poorer statistical fits. We have not analyzed the new potential of Nitz, et
A new statistical methodology predicting chip failure probability considering electromigration
NASA Astrophysics Data System (ADS)
Sun, Ted
In this research thesis, we present a new approach to analyzing chip reliability subject to electromigration (EM); the fundamental causes of EM, and the EM phenomena occurring in different materials, are presented in this thesis. This new approach exploits the statistical nature of EM failure in order to assess overall EM risk. It includes within-die temperature variations, taken from the chip's temperature map extracted by an Electronic Design Automation (EDA) tool, to estimate the failure probability of a design. Both the power estimation and the thermal analysis are performed in the EDA flow. We first used the traditional EM approach to analyze the design, which involves 6 metal and 5 via layers, with a single temperature across the entire chip. Next, we used the same traditional approach but with a realistic temperature map. The traditional EM analysis approach, the same approach coupled with a temperature map, and the comparison between the results with and without the temperature map are presented in this research. The comparison confirms that using a temperature map yields a less pessimistic estimate of the chip's EM risk. Finally, we employed the statistical methodology we developed, considering a temperature map and different use-condition voltages and frequencies, to estimate the overall failure probability of the chip. The statistical model accounts for scaling through the traditional Black equation and four major use conditions, and the resulting comparisons are within our expectations. The results of this statistical analysis confirm that the chip-level failure probability is higher (i) at higher use-condition frequencies for all use-condition voltages, and (ii) when a single temperature, instead of a temperature map across the chip, is considered. In this thesis, I start with an overall review of current design types, common flows, and the necessary verification and reliability-checking steps used in the IC design industry. Furthermore, the important concepts of "scripting automation", used to integrate the diversified EDA tools in this research, are described in detail with several examples, and my completed code is included in the appendix for reference. Hopefully, this structure will give readers a thorough understanding of my research, from the automation of EDA tools to statistical data generation, from the nature of EM to the construction of the statistical model, and the comparisons between the traditional and statistical EM analysis approaches.
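To make the statistical idea concrete, here is a hedged sketch combining Black's equation, MTTF = A·J⁻ⁿ·exp(Ea/kT), with a per-segment temperature map and a lognormal time-to-failure assumption; the chip is treated as a series system that fails if any segment fails. The prefactor, exponents, activation energy and distributions are illustrative assumptions, not foundry-calibrated values from the thesis.

```python
import numpy as np
from scipy import stats

K_B = 8.617e-5   # Boltzmann constant in eV/K

def black_mttf(j, temp_k, a=1e-6, n=2.0, ea=0.85):
    """Median time to failure from Black's equation, MTTF = A * J**-n * exp(Ea/kT).
    Prefactor, current-density exponent and activation energy are placeholders."""
    return a * j**(-n) * np.exp(ea / (K_B * temp_k))

def chip_fail_prob(t, j_seg, temp_seg, sigma=0.4):
    """Series-system view: the chip fails if any interconnect segment fails.
    Each segment's time to failure is lognormal around its Black-equation median."""
    p_seg = stats.lognorm.cdf(t, s=sigma, scale=black_mttf(j_seg, temp_seg))
    return 1.0 - np.prod(1.0 - p_seg)

rng = np.random.default_rng(6)
j = rng.uniform(1.0, 3.0, size=1000)           # segment current densities (arb. units)
t_map = rng.uniform(330.0, 390.0, size=1000)   # per-segment temperature map (K)

# A single worst-case temperature is more pessimistic than the realistic map
print("failure prob. with temperature map:", chip_fail_prob(5e3, j, t_map))
print("failure prob. at uniform 390 K:    ", chip_fail_prob(5e3, j, np.full(1000, 390.0)))
```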
Steenbergen, K G; Gaston, N
2014-02-14
Inspired by methods of remote sensing image analysis, we analyze structural variation in cluster molecular dynamics (MD) simulations through a unique application of the principal component analysis (PCA) and Pearson Correlation Coefficient (PCC). The PCA analysis characterizes the geometric shape of the cluster structure at each time step, yielding a detailed and quantitative measure of structural stability and variation at finite temperature. Our PCC analysis captures bond structure variation in MD, which can be used to both supplement the PCA analysis as well as compare bond patterns between different cluster sizes. Relying only on atomic position data, without requirement for a priori structural input, PCA and PCC can be used to analyze both classical and ab initio MD simulations for any cluster composition or electronic configuration. Taken together, these statistical tools represent powerful new techniques for quantitative structural characterization and isomer identification in cluster MD.
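A PCA of the atomic positions within a frame is equivalent to diagonalizing the gyration tensor, whose ordered eigenvalues summarize the cluster's geometric shape. The sketch below computes these principal moments and a simple asphericity measure per frame for a synthetic trajectory; this descriptor choice is an assumption standing in for the authors' exact PCA pipeline.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic MD trajectory: 50 frames of a 13-atom cluster (positions in Angstrom)
frames = rng.normal(scale=1.5, size=(50, 13, 3))

def shape_descriptor(coords):
    """Principal moments of the gyration tensor, i.e. a PCA of atomic positions.
    The ordered eigenvalues quantify the cluster's geometric shape."""
    centered = coords - coords.mean(axis=0)
    gyration = centered.T @ centered / len(coords)
    return np.sort(np.linalg.eigvalsh(gyration))[::-1]

moments = np.array([shape_descriptor(f) for f in frames])

# Asphericity b = l1 - (l2 + l3)/2: near zero for compact, sphere-like frames
asphericity = moments[:, 0] - 0.5 * (moments[:, 1] + moments[:, 2])
print("mean principal moments:", moments.mean(axis=0).round(3))
print("asphericity variation (std over frames):", asphericity.std().round(3))
```

Tracking these per-frame moments through a heating run gives the kind of quantitative measure of structural stability and variation at finite temperature that the abstract describes.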
Race, Socioeconomic Status, and Implicit Bias: Implications for Closing the Achievement Gap
NASA Astrophysics Data System (ADS)
Schlosser, Elizabeth Auretta Cox
This study assessed the relationship between race, socioeconomic status, age and the race implicit bias held by middle and high school science teachers in the Mobile and Baldwin County Public School Systems. Seventy-nine participants were administered the race Implicit Association Test (race IAT), created by Greenwald, A. G., Nosek, B. A., & Banaji, M. R. (2003), and a demographic survey. Quantitative analysis using analysis of variance (ANOVA) and t-tests was used in this study. An ANOVA was performed comparing the race IAT scores of African American science teachers and their Caucasian counterparts; a statistically significant difference was found (F = 4.56, p = .01). An ANOVA was also performed on the race IAT scores comparing the age of the participants; the analysis yielded no statistical difference based on age. A t-test was performed comparing the race IAT scores of African American teachers who taught at either Title I or non-Title I schools; no statistical difference was found between groups (t = -17.985, p < .001). A t-test was also performed comparing the race IAT scores of Caucasian teachers who taught at either Title I or non-Title I schools; a statistically significant difference was found between groups (t = 2.44, p > .001). This research examines the implications of the achievement gap between African American and Caucasian students in science.
Network meta-analysis: a technique to gather evidence from direct and indirect comparisons
2017-01-01
Systematic reviews and pairwise meta-analyses of randomized controlled trials, at the intersection of clinical medicine, epidemiology and statistics, are positioned at the top of the evidence-based practice hierarchy. These are important tools for supporting drug approval, for formulating clinical protocols and guidelines, and for decision-making. However, this traditional technique only partially yields the information that clinicians, patients and policy-makers need to make informed decisions, since it usually compares only two interventions at a time. In the market, regardless of the clinical condition under evaluation, many interventions are usually available and few of them have been studied in head-to-head trials. This scenario precludes conclusions from being drawn from comparisons of all intervention profiles (e.g. efficacy and safety). The recent development and introduction of a new technique – usually referred to as network meta-analysis, indirect meta-analysis, or multiple or mixed treatment comparisons – has allowed the estimation of metrics for all possible comparisons in the same model, simultaneously gathering direct and indirect evidence. Over the last years this statistical tool has matured as a technique, with models available for all types of raw data, producing different pooled effect measures, using both Frequentist and Bayesian frameworks, with different software packages. However, the conduct, reporting and interpretation of network meta-analysis still pose multiple challenges that should be carefully considered, especially because this technique inherits all assumptions from pairwise meta-analysis but with increased complexity. Thus, we aim to provide a basic explanation of how to conduct a network meta-analysis, highlighting its risks and benefits for evidence-based practice, including information on the evolution of the statistical methods, their assumptions and the steps for performing the analysis. PMID:28503228
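The simplest building block of such indirect evidence is the anchored (Bucher) comparison: if treatments A and C have each been compared with a common comparator B, an indirect A-vs-C estimate follows by subtracting the pooled effects, with the variances adding under independence. A minimal sketch, with purely illustrative numbers:

```python
import math

def bucher_indirect(d_ab, se_ab, d_cb, se_cb):
    """Anchored indirect comparison of A vs C through common comparator B:
    d_AC = d_AB - d_CB, with variances adding under independence."""
    d_ac = d_ab - d_cb
    se_ac = math.sqrt(se_ab**2 + se_cb**2)
    ci = (d_ac - 1.96 * se_ac, d_ac + 1.96 * se_ac)
    return d_ac, se_ac, ci

# Illustrative log odds ratios vs placebo (B) from two pairwise meta-analyses
d_ac, se_ac, ci = bucher_indirect(d_ab=-0.50, se_ab=0.15, d_cb=-0.20, se_cb=0.20)
print(f"indirect log-OR, A vs C: {d_ac:.2f} (SE {se_ac:.2f}), "
      f"95% CI {ci[0]:.2f} to {ci[1]:.2f}")
```

Full network meta-analysis generalizes this idea to whole networks of trials, fitted jointly in Frequentist or Bayesian frameworks, but the variance penalty visible here is why indirect evidence is weaker than direct head-to-head data.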
El Niño-Southern Oscillation Impacts on Winter Vegetable Production in Florida*.
NASA Astrophysics Data System (ADS)
Hansen, James W.; Jones, James W.; Kiker, Clyde F.; Hodges, Alan W.
1999-01-01
Florida's mild winters allow the state to play a vital role in supplying fresh vegetables for U.S. consumers. Producers also benefit from premium prices when low temperatures prevent production in most of the country. This study characterizes the influence of the El Niño-Southern Oscillation (ENSO) on the Florida vegetable industry using statistical analysis of the response of historical crop (yield, prices, production, and value) and weather variables (freeze hazard, temperatures, rainfall, and solar radiation) to ENSO phase and its interaction with location and time of year. Annual mean yields showed little evidence of response to ENSO phase and its interaction with location. ENSO phase and season interacted to influence quarterly yields, prices, production, and value. Yields (tomato, bell pepper, sweet corn, and snap bean) were lower and prices (bell pepper and snap bean) were higher in El Niño than in neutral or La Niña winters. Production and value of tomatoes were higher in La Niña winters. The yield response can be explained by increased rainfall, reduced daily maximum temperatures, and reduced solar radiation in El Niño winters. Yield and production of winter vegetables appeared to be less responsive to ENSO phase after 1980; for tomato and bell pepper, this may be due to improvements in production technology that mitigate problems associated with excess rainfall. Winter yield and price responses to El Niño events have important implications for both producers and consumers of winter vegetables, and suggest opportunities for further research.
Kelder, Johannes C; Cowie, Martin R; McDonagh, Theresa A; Hardman, Suzanna M C; Grobbee, Diederick E; Cost, Bernard; Hoes, Arno W
2011-06-01
Diagnosing early stages of heart failure with mild symptoms is difficult. B-type natriuretic peptide (BNP) has promising biochemical test characteristics, but its diagnostic yield on top of readily available diagnostic knowledge has not been sufficiently quantified in early stages of heart failure. To quantify the added diagnostic value of BNP for the diagnosis of heart failure in a population relevant to GPs and validate the findings in an independent primary care patient population. Individual patient data meta-analysis followed by external validation. The additional diagnostic yield of BNP above standard clinical information was compared with ECG and chest x-ray results. Derivation was performed on two existing datasets from Hillingdon (n=127) and Rotterdam (n=149) while the UK Natriuretic Peptide Study (n=306) served as validation dataset. Included were patients with suspected heart failure referred to a rapid-access diagnostic outpatient clinic. Case definition was according to the ESC guideline. Logistic regression was used to assess discrimination (with the c-statistic) and calibration. Of the 276 patients in the derivation set, 30.8% had heart failure. The clinical model (encompassing age, gender, known coronary artery disease, diabetes, orthopnoea, elevated jugular venous pressure, crackles, pitting oedema and S3 gallop) had a c-statistic of 0.79. Adding, respectively, chest x-ray results, ECG results or BNP to the clinical model increased the c-statistic to 0.84, 0.85 and 0.92. Neither ECG nor chest x-ray added significantly to the 'clinical plus BNP' model. All models had adequate calibration. The 'clinical plus BNP' diagnostic model performed well in an independent cohort with comparable inclusion criteria (c-statistic=0.91 and adequate calibration). Using separate cut-off values for 'ruling in' (typically implying referral for echocardiography) and for 'ruling out' heart failure--creating a grey zone--resulted in insufficient proportions of patients with a correct diagnosis. BNP has considerable diagnostic value in addition to signs and symptoms in patients suspected of heart failure in primary care. However, using BNP alone with the currently recommended cut-off levels is not sufficient to make a reliable diagnosis of heart failure.
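A hedged sketch of the model-comparison step on synthetic data: fit logistic models with and without log-BNP and compare their c-statistics (ROC AUC). Unlike the study, which validated in an independent cohort, this toy computes in-sample AUC, and all coefficients and names are assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(8)
n = 300

# Synthetic cohort: a clinical score plus log-BNP, both predictive of heart failure
clinical = rng.normal(size=n)
log_bnp = rng.normal(size=n)
logit = -1.0 + 1.2 * clinical + 1.8 * log_bnp
hf = rng.random(n) < 1 / (1 + np.exp(-logit))

X_clin = clinical.reshape(-1, 1)
X_full = np.column_stack([clinical, log_bnp])

auc_clin = roc_auc_score(hf, LogisticRegression().fit(X_clin, hf).predict_proba(X_clin)[:, 1])
auc_full = roc_auc_score(hf, LogisticRegression().fit(X_full, hf).predict_proba(X_full)[:, 1])
print(f"c-statistic, clinical model:  {auc_clin:.2f}")
print(f"c-statistic, clinical + BNP:  {auc_full:.2f}")
```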
NASA Astrophysics Data System (ADS)
Sheehan, J. J.
2016-12-01
We report here a first-of-its-kind analysis of the potential for intensification of global grazing systems. Intensification is calculated using the statistical yield gap methodology developed previously by others (Mueller et al 2012 and Licker et al 2010) for global crop systems. Yield gaps are estimated by binning global pasture land area into 100 equal area sized bins of similar climate (defined by ranges of rainfall and growing degree days). Within each bin, grid cells of pastureland are ranked from lowest to highest productivity. The global intensification potential is defined as the sum of global production across all bins at a given percentile ranking (e.g. performance at the 90th percentile) divided by the total current global production. The previous yield gap studies focused on crop systems because productivity data on these systems is readily available. Nevertheless, global crop land represents only one-third of total global agricultural land, while pasture systems account for the remaining two-thirds. Thus, it is critical to conduct the same kind of analysis on what is the largest human use of land on the planet—pasture systems. In 2013, Herrero et al announced the completion of a geospatial data set that augmented the animal census data with data and modeling about production systems and overall food productivity (Herrero et al, PNAS 2013). With this data set, it is now possible to apply yield gap analysis to global pasture systems. We used the Herrero et al data set to evaluate yield gaps for meat and milk production from pasture based systems for cattle, sheep and goats. The figure included with this abstract shows the intensification potential for kcal per hectare per year of meat and milk from global cattle, sheep and goats as a function of increasing levels of performance. Performance is measured as the productivity achieved at a given ranked percentile within each bin. We find that if all pasture land were raised to their 90th percentile of performance, global output of meat and milk could increase 2.8 fold. This is much higher than that reported previously for major grain crops like corn and wheat. Our results suggest that efforts to address poor performance of pasture systems around the world could substantially improve the outlook for meeting future food demand.
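A compressed sketch of the binning methodology on synthetic grid cells: partition pasture cells into 100 equal-count climate bins by rainfall and growing degree days, raise every cell to its bin's chosen productivity percentile, and report attainable production relative to current production. The field names and distributions are placeholders, not the Herrero et al. data.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(9)
n = 50_000  # pasture grid cells

cells = pd.DataFrame({
    "rain": rng.uniform(200, 1800, n),      # mm/yr
    "gdd": rng.uniform(1000, 6000, n),      # growing degree days
    "prod": rng.lognormal(6, 1, n),         # kcal/ha/yr of meat + milk
})

# 10 x 10 = 100 equal-count climate bins (quantiles of rainfall and GDD)
cells["bin"] = (pd.qcut(cells.rain, 10, labels=False) * 10
                + pd.qcut(cells.gdd, 10, labels=False))

def intensification_potential(cells, pct):
    """Total production if every cell reached its climate bin's pct-th
    percentile of productivity, divided by current total production."""
    target = cells.groupby("bin")["prod"].transform(lambda s: np.percentile(s, pct))
    attainable = np.maximum(cells["prod"], target)  # top performers keep their yield
    return attainable.sum() / cells["prod"].sum()

print("90th-percentile intensification potential:",
      round(intensification_potential(cells, 90), 2))
```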
Supratentorial lesions contribute to trigeminal neuralgia in multiple sclerosis.
Fröhlich, Kilian; Winder, Klemens; Linker, Ralf A; Engelhorn, Tobias; Dörfler, Arnd; Lee, De-Hyung; Hilz, Max J; Schwab, Stefan; Seifert, Frank
2018-06-01
Background: It has been proposed that multiple sclerosis lesions afflicting the pontine trigeminal afferents contribute to trigeminal neuralgia in multiple sclerosis. So far, there are no imaging studies that have evaluated interactions between supratentorial lesions and trigeminal neuralgia in multiple sclerosis patients. Methods: We conducted a retrospective study and sought multiple sclerosis patients with trigeminal neuralgia and controls in a local database. Multiple sclerosis lesions were manually outlined and transformed into stereotaxic space. We determined the lesion overlap and performed a voxel-wise subtraction analysis. Secondly, we conducted a voxel-wise non-parametric analysis using the Liebermeister test. Results: From 12,210 multiple sclerosis patient records screened, we identified 41 patients with trigeminal neuralgia. The voxel-wise subtraction analysis yielded associations between trigeminal neuralgia and multiple sclerosis lesions in the pontine trigeminal afferents, as well as larger supratentorial lesion clusters in the contralateral insula and hippocampus. The non-parametric statistical analysis using the Liebermeister test yielded similar areas to be associated with multiple sclerosis-related trigeminal neuralgia. Conclusions: Our study confirms previous data on associations between multiple sclerosis-related trigeminal neuralgia and pontine lesions, and showed for the first time an association with lesions in the insular region, a region involved in pain processing and endogenous pain modulation.
NASA Astrophysics Data System (ADS)
Alves, Gelio; Wang, Guanghui; Ogurtsov, Aleksey Y.; Drake, Steven K.; Gucek, Marjan; Sacks, David B.; Yu, Yi-Kuo
2018-06-01
Rapid and accurate identification and classification of microorganisms is of paramount importance to public health and safety. With the advance of mass spectrometry (MS) technology, the speed of identification can be greatly improved. However, the increasing number of microbes sequenced is complicating correct microbial identification even in a simple sample due to the large number of candidates present. To properly untwine candidate microbes in samples containing one or more microbes, one needs to go beyond apparent morphology or simple "fingerprinting"; to correctly prioritize the candidate microbes, one needs to have accurate statistical significance in microbial identification. We meet these challenges by using peptide-centric representations of microbes to better separate them and by augmenting our earlier analysis method that yields accurate statistical significance. Here, we present an updated analysis workflow that uses tandem MS (MS/MS) spectra for microbial identification or classification. We have demonstrated, using 226 MS/MS publicly available data files (each containing from 2500 to nearly 100,000 MS/MS spectra) and 4000 additional MS/MS data files, that the updated workflow can correctly identify multiple microbes at the genus and often the species level for samples containing more than one microbe. We have also shown that the proposed workflow computes accurate statistical significances, i.e., E values for identified peptides and unified E values for identified microbes. Our updated analysis workflow MiCId, a freely available software for Microorganism Classification and Identification, is available for download at https://www.ncbi.nlm.nih.gov/CBBresearch/Yu/downloads.html.
Liu, Zechang; Wang, Liping; Liu, Yumei
2018-01-18
Hops impart flavor to beer, with the volatile components characterizing the various hop varieties and qualities. Fingerprinting, especially flavor fingerprinting, is often used to identify 'flavor products' because inconsistencies in the description of flavor may lead to an incorrect definition of beer quality. Compared to flavor fingerprinting, volatile fingerprinting is simpler and easier. We performed volatile fingerprinting using head space-solid phase micro-extraction gas chromatography-mass spectrometry combined with similarity analysis and principal component analysis (PCA) for evaluating and distinguishing between three major Chinese hops. Eighty-four volatiles were identified, which were classified into seven categories. Volatile fingerprinting based on similarity analysis did not yield any obvious result. By contrast, hop varieties and qualities were identified using volatile fingerprinting based on PCA. The potential variables explained the variance in the three hop varieties. In addition, the dendrogram and principal component score plot described the differences and classifications of hops. Volatile fingerprinting plus multivariate statistical analysis can rapidly differentiate between the different varieties and qualities of the three major Chinese hops. Furthermore, this method can be used as a reference in other fields.
Multivariate analysis of fears in dental phobic patients according to a reduced FSS-II scale.
Hakeberg, M; Gustafsson, J E; Berggren, U; Carlsson, S G
1995-10-01
This study analyzed and assessed dimensions of a questionnaire developed to measure general fears and phobias. A previous factor analysis among 109 dental phobics had revealed a five-factor structure with 22 items and an explained total variance of 54%. The present study analyzed the same material using a multivariate statistical procedure (LISREL) to reveal structural latent variables. The LISREL analysis, based on the correlation matrix, yielded a chi-square of 216.6 with 195 degrees of freedom (P = 0.138) and showed a model with seven latent variables. One was a general fear factor correlated to all 22 items. The other six factors concerned "Illness & Death" (5 items), "Failures & Embarrassment" (5 items), "Social situations" (5 items), "Physical injuries" (4 items), "Animals & Natural phenomena" (4 items). One item (opposite sex) was included in both "Failures & Embarrassment" and "Social situations". The last factor, "Social interaction", combined all the items in "Failures & Embarrassment" and "Social situations" (9 items). In conclusion, this multivariate statistical analysis (LISREL) revealed and confirmed a factor structure similar to our previous study, but added two important dimensions not shown with a traditional factor analysis. This reduced FSS-II version measures general fears and phobias and may be used on a routine clinical basis as well as in dental phobia research.
McCormick, Frank; Gupta, Anil; Bruce, Ben; Harris, Josh; Abrams, Geoff; Wilson, Hillary; Hussey, Kristen; Cole, Brian J.
2014-01-01
Purpose: The purpose of this study was to measure and compare the subjective, objective, and radiographic healing outcomes of single-row (SR), double-row (DR), and transosseous equivalent (TOE) suture techniques for arthroscopic rotator cuff repair. Materials and Methods: A retrospective comparative analysis of arthroscopic rotator cuff repairs by one surgeon from 2004 to 2010, at a minimum 2-year follow-up, was performed. Cohorts were matched for age, sex, and tear size. Subjective outcome variables included ASES, Constant, SST, UCLA, and SF-12 scores. Objective outcome variables included strength and active range of motion (ROM). Radiographic healing was assessed by magnetic resonance imaging (MRI). Statistical analysis was performed using analysis of variance (ANOVA), Mann-Whitney and Kruskal-Wallis tests, and the Fisher exact probability test, with significance set at P < 0.05. Results: Sixty-three patients completed the study requirements (20 SR, 21 DR, 22 TOE). There was a clinically and statistically significant improvement in outcomes with all repair techniques (ASES mean improvement, P < 0.0001). The mean final ASES scores were: SR 83 (SD 21.4); DR 87 (SD 18.2); TOE 87 (SD 13.2) (P = 0.73). There was a statistically significant improvement in strength for each repair technique (P < 0.001). There was no significant difference between techniques across all secondary outcome assessments: ASES improvement, Constant, SST, UCLA, SF-12, ROM, strength, and MRI re-tear rates. There was a decrease in re-tear rates from single-row (22%) to double-row (18%) to transosseous equivalent (11%); however, this difference was not statistically significant (P = 0.6). Conclusions: Compared to preoperatively, arthroscopic rotator cuff repair using SR, DR, or TOE techniques yielded a clinically and statistically significant improvement in subjective and objective outcomes at a minimum 2-year follow-up. Level of Evidence: Therapeutic level 3. PMID:24926159
Kitchenham, B A; Rowlands, G J; Shorbagi, H
1975-05-01
Regression analyses were performed on data from 48 Compton metabolic profile tests relating the concentrations of certain constituents in the blood of dairy cows to their milk yield, age and stage of lactation. The common partial regression coefficients for milk yield, age and stage of lactation were estimated for each blood constituent. The relationships of greatest statistical significance were those between the concentrations of inorganic phosphate and globulin and age, and between the concentration of albumin and milk yield.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Emanuel, A.E.
1991-03-01
This article presents a preliminary analysis of the effect of randomly varying harmonic voltages on the temperature rise of squirrel-cage motors. The stochastic process of random variations of harmonic voltages is defined by means of simple statistics (mean, standard deviation, type of distribution). Computational models based on a first-order approximation of the motor losses and on the Monte Carlo method yield results which show that equipment with a large thermal time constant is capable of withstanding, for short periods of time, distortions larger than THD = 5%.
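A minimal Monte Carlo sketch of this approach, with assumed harmonic voltage distributions and illustrative loss coefficients rather than the paper's motor model:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
# Assumed per-harmonic voltage distortions (per unit), truncated at zero.
v5 = np.clip(rng.normal(0.03, 0.01, n), 0.0, None)   # 5th harmonic
v7 = np.clip(rng.normal(0.02, 0.01, n), 0.0, None)   # 7th harmonic
thd = np.sqrt(v5**2 + v7**2)

# First-order loss model: extra temperature rise proportional to the squared
# harmonic voltages; k5 and k7 are illustrative coefficients, not motor data.
k5, k7 = 40.0, 55.0
delta_T = k5 * v5**2 + k7 * v7**2
print(f"P(THD > 5%) = {(thd > 0.05).mean():.3f}, "
      f"mean extra temperature rise = {delta_T.mean():.3f} (arb. units)")
```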
Davidson, William; Beck, Hall P
2018-01-01
This study empirically confirmed the relationships between the degree to which students satisfied three basic needs (competence, relatedness, and autonomy) and the strength of their commitments to the university they attended and to obtaining a baccalaureate degree. A questionnaire was administered online to 1257 students at two 4-year universities. Regression analysis yielded statistically significant associations between the three needs and Institutional Commitment and Degree Commitment, explaining more than 20% of the variance in the latter two variables.
Metrics and methods for characterizing dairy farm intensification using farm survey data.
Gonzalez-Mejia, Alejandra; Styles, David; Wilson, Paul; Gibbons, James
2018-01-01
Evaluation of agricultural intensification requires comprehensive analysis of trends in farm performance across physical and socio-economic aspects, which may diverge across farm types. Typical reporting of economic indicators at the sectorial or "average farm" level does not represent farm diversity and provides limited insight into the sustainability of specific intensification pathways. Using farm business data from a total of 7281 farm survey observations of English and Welsh dairy farms over a 14-year period, we calculate a time series of 16 key performance indicators (KPIs) pertinent to farm structure and the environmental and socio-economic aspects of sustainability. We then apply principal component analysis and model-based clustering analysis to identify statistically the number of distinct dairy farm typologies for each year of study, and link these clusters through time using multidimensional scaling. Between 2001 and 2014, dairy farms have largely consolidated and specialized into two distinct clusters: more extensive farms relying predominantly on grass, with lower milk yields but higher labour intensity, and more intensive farms producing more milk per cow with more concentrate and more maize, but lower labour intensity. There is some indication that these clusters are converging, as the extensive cluster is intensifying slightly faster than the intensive cluster in terms of milk yield per cow and use of concentrate feed. In 2014, annual milk yields were 6,835 and 7,500 l/cow for extensive and intensive farm types, respectively, whilst annual concentrate feed use was 1.3 and 1.5 tonnes per cow. For several KPIs, such as milk yield, the mean trend across all farms differed substantially from the means of the extensive and intensive typologies. The indicators and analysis methodology developed allow identification of distinct farm types and industry trends using readily available survey data. The identified groups allow accurate evaluation of the consequences of the reduction in dairy farm numbers and of intensification at national and international scales.
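A sketch of the PCA-plus-model-based-clustering pipeline described above, under the assumption that the clustering step can be approximated with a Gaussian mixture selected by BIC; the KPI matrix is a random placeholder for one survey year:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
kpi = rng.normal(size=(500, 16))       # placeholder: farms x 16 KPIs, one year

scores = PCA(n_components=4).fit_transform(StandardScaler().fit_transform(kpi))
bics = {k: GaussianMixture(n_components=k, random_state=0).fit(scores).bic(scores)
        for k in range(1, 7)}
best_k = min(bics, key=bics.get)       # lowest BIC = preferred number of typologies
print("number of farm typologies selected:", best_k)
```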
NASA Astrophysics Data System (ADS)
Takabayashi, Sadao; Klein, William P.; Onodera, Craig; Rapp, Blake; Flores-Estrada, Juan; Lindau, Elias; Snowball, Lejmarc; Sam, Joseph T.; Padilla, Jennifer E.; Lee, Jeunghoon; Knowlton, William B.; Graugnard, Elton; Yurke, Bernard; Kuang, Wan; Hughes, William L.
2014-10-01
High precision, high yield, and high density self-assembly of nanoparticles into arrays is essential for nanophotonics. Spatial deviations as small as a few nanometers can alter the properties of near-field coupled optical nanostructures. Several studies have reported assemblies of few nanoparticle structures with controlled spacing using DNA nanostructures with variable yield. Here, we report multi-tether design strategies and attachment yields for homo- and hetero-nanoparticle arrays templated by DNA origami nanotubes. Nanoparticle attachment yield via DNA hybridization is comparable with streptavidin-biotin binding. Independent of the number of binding sites, >97% site-occupation was achieved with four tethers and 99.2% site-occupation is theoretically possible with five tethers. The interparticle distance was within 2 nm of all design specifications and the nanoparticle spatial deviations decreased with interparticle spacing. Modified geometric, binomial, and trinomial distributions indicate that site-bridging, steric hindrance, and electrostatic repulsion were not dominant barriers to self-assembly and both tethers and binding sites were statistically independent at high particle densities. Electronic supplementary information (ESI) available. See DOI: 10.1039/c4nr03069a
Stability of agronomic and yield related traits of Jatropha curcas accessions raised from cuttings
NASA Astrophysics Data System (ADS)
Mat, Nurul Hidayah Che; Yaakob, Zahira; Ratnam, Wickneswari
2016-11-01
Monitoring the stability of agronomic and yield related traits is important for prediction of crop yields. This study follows up an earlier evaluation of 295 J. curcas individuals representing 21 accessions from eight countries at the Biodiesel Research Station of Universiti Kebangsaan Malaysia, Kuala Pilah, planted in December 2012. In this study, 183 J. curcas individuals were selected randomly from the population and their growth performance was evaluated from December 2013 to December 2014. All the individual plants were raised from cuttings. The yield related data were recorded periodically and the performance of each accession was analyzed using Statistical Analysis System (SAS) 9.4. Five traits, namely number of fruits per plant (NFPP), number of fruits per inflorescence (NFPI), hundred seed weight (g) (HSW), number of seeds per plant (NSPP) and yield per plant (g) (YPP), showed significant differences among the accessions after two years of planting. Maximum values for each trait were 208 cm for plant height (PH), 31 for number of branches per plant (BPP), 115 for number of inflorescences per plant (NIPP), 582 for NFPP, 7 for NFPI, 307 for number of flowers per inflorescence (NFI), 17 for number of female flowers per inflorescence (NFFPI), 91.6 g for HSW, 1647.1 for NSPP and 927.6 g for YPP. Most of the plants which had performed well in the first year were among the best performers in the second year.
Vasudevan, Rama K; Tselev, Alexander; Baddorf, Arthur P; Kalinin, Sergei V
2014-10-28
Reflection high energy electron diffraction (RHEED) has by now become a standard tool for in situ monitoring of film growth by pulsed laser deposition and molecular beam epitaxy. Yet despite the widespread adoption and wealth of information in RHEED images, most applications are limited to observing intensity oscillations of the specular spot, and much additional information on growth is discarded. With ease of data acquisition and increased computation speeds, statistical methods to rapidly mine the data set are now feasible. Here, we develop such an approach to the analysis of the fundamental growth processes through multivariate statistical analysis of a RHEED image sequence. This approach is illustrated for growth of La(x)Ca(1-x)MnO(3) films grown on etched (001) SrTiO(3) substrates, but is universal. The multivariate methods including principal component analysis and k-means clustering provide insight into the relevant behaviors, the timing and nature of a disordered to ordered growth change, and highlight statistically significant patterns. Fourier analysis yields the harmonic components of the signal and allows separation of the relevant components and baselines, isolating the asymmetric nature of the step density function and the transmission spots from the imperfect layer-by-layer (LBL) growth. These studies show the promise of big data approaches to obtaining more insight into film properties during and after epitaxial film growth. Furthermore, these studies open the pathway to use forward prediction methods to potentially allow significantly more control over growth process and hence final film quality.
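A hedged sketch of the multivariate mining described above, with a random array standing in for the RHEED image sequence; the PCA and k-means steps mirror the methods named in the abstract, not the authors' actual code:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
frames = rng.random((2000, 64, 64))    # placeholder for a RHEED image sequence
X = frames.reshape(len(frames), -1)    # one flattened vector per frame

scores = PCA(n_components=5).fit_transform(X)          # dominant temporal modes
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)
# On real data, a persistent switch in `labels` over time could mark a
# disordered-to-ordered growth change.
print(np.bincount(labels))
```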
Scanning capacitance microscopy of ErAs nanoparticles embedded in GaAs pn junctions
NASA Astrophysics Data System (ADS)
Park, K. W.; Nair, H. P.; Crook, A. M.; Bank, S. R.; Yu, E. T.
2011-09-01
Scanning capacitance microscopy is used to characterize the electronic properties of ErAs nanoparticles embedded in GaAs pn junctions grown by molecular beam epitaxy. Voltage-dependent capacitance images reveal localized variations in subsurface electronic structure near buried ErAs nanoparticles at lateral length scales of 20-30 nm. Numerical modeling indicates that these variations arise from inhomogeneities in charge modulation due to Fermi level pinning behavior associated with the embedded ErAs nanoparticles. Statistical analysis of image data yields an average particle radius of 6-8 nm—well below the direct resolution limit in scanning capacitance microscopy but discernible via analysis of patterns in nanoscale capacitance images.
Refraction of coastal ocean waves
NASA Technical Reports Server (NTRS)
Shuchman, R. A.; Kasischke, E. S.
1981-01-01
Refraction of gravity waves in the coastal area off Cape Hatteras, NC as documented by synthetic aperture radar (SAR) imagery from Seasat orbit 974 (collected on September 3, 1978) is discussed. An analysis of optical Fourier transforms (OFTs) from more than 70 geographical positions yields estimates of wavelength and wave direction for each position. In addition, independent estimates of the same two quantities are calculated using two simple theoretical wave-refraction models. The OFT results are then compared with the theoretical results. A statistical analysis shows a significant degree of linear correlation between the data sets. This is considered to indicate that the Seasat SAR produces imagery whose clarity is sufficient to show the refraction of gravity waves in shallow water.
Wide-Field Imaging of Single-Nanoparticle Extinction with Sub-nm2 Sensitivity
NASA Astrophysics Data System (ADS)
Payne, Lukas M.; Langbein, Wolfgang; Borri, Paola
2018-03-01
We report on a highly sensitive wide-field imaging technique for quantitative measurement of the optical extinction cross section σext of single nanoparticles. The technique is simple and high speed, and it enables the simultaneous acquisition of hundreds of nanoparticles for statistical analysis. Using rapid referencing, fast acquisition, and a deconvolution analysis, a shot-noise-limited sensitivity down to 0.4 nm2 is achieved. Measurements on a set of individual gold nanoparticles of 5 nm diameter using this method yield σext = (10.0 ± 3.1) nm2, which is consistent with theoretical expectations and well above the background fluctuations of 0.9 nm2.
Ngo, Long H; Inouye, Sharon K; Jones, Richard N; Travison, Thomas G; Libermann, Towia A; Dillon, Simon T; Kuchel, George A; Vasunilashorn, Sarinnapha M; Alsop, David C; Marcantonio, Edward R
2017-06-06
The nested case-control study (NCC) design within a prospective cohort study is used when outcome data are available for all subjects, but the exposure of interest has not been collected, and is difficult or prohibitively expensive to obtain for all subjects. An NCC analysis with good matching procedures yields estimates that are as efficient and unbiased as estimates from the full cohort study. We present methodological considerations in a matched NCC design and analysis, which include the choice of match algorithms, analysis methods to evaluate the association of exposures of interest with outcomes, and consideration of overmatching. Matched NCC design within a longitudinal observational prospective cohort study in the setting of two academic hospitals. Study participants are patients aged over 70 years who underwent scheduled major non-cardiac surgery. The primary outcome was postoperative delirium from in-hospital interviews and medical record review. The main exposure was IL-6 concentration (pg/ml) from blood sampled at three time points before delirium occurred. We used the nonparametric signed-rank test to test for the median of the paired differences. We used conditional logistic regression to model the risk of IL-6 on delirium incidence. Simulation was used to generate a sample of cohort data on which unconditional multivariable logistic regression was used, and the results were compared to those of the conditional logistic regression. Partial R-square was used to assess the level of overmatching. We found that the optimal match algorithm yielded more matched pairs than the greedy algorithm. The choice of analytic strategy--whether to consider measured cytokine levels as the predictor or the outcome--yielded inferences that have different clinical interpretations but similar levels of statistical significance. Estimation results from the NCC design using conditional logistic regression, and from the simulated cohort design using unconditional logistic regression, were similar. We found minimal evidence for overmatching. Using a matched NCC approach introduces methodological challenges into the study design and data analysis. Nonetheless, with careful selection of the match algorithm, match factors, and analysis methods, this design is cost effective and, for our study, yields estimates that are similar to those from a prospective cohort study design.
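A minimal sketch of the matched-pair analysis, assuming statsmodels' ConditionalLogit and simulated placeholder data for IL-6 and delirium status:

```python
import numpy as np
from statsmodels.discrete.conditional_models import ConditionalLogit

rng = np.random.default_rng(2)
n_pairs = 100
groups = np.repeat(np.arange(n_pairs), 2)        # one case + one control per pair
y = np.tile([1, 0], n_pairs)                     # 1 = postoperative delirium
il6 = rng.lognormal(mean=np.where(y == 1, 1.2, 1.0), sigma=0.5)  # pg/ml, simulated

result = ConditionalLogit(y, np.log(il6)[:, None], groups=groups).fit()
print("log-odds per unit increase in log IL-6:", result.params[0])
```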
A knowledge-based T2-statistic to perform pathway analysis for quantitative proteomic data.
Lai, En-Yu; Chen, Yi-Hau; Wu, Kun-Pin
2017-06-01
Approaches to identify significant pathways from high-throughput quantitative data have been developed in recent years. Still, the analysis of proteomic data remains difficult because of limited sample size. This limitation also leads to the practice of using a competitive null as a common approach, which fundamentally treats genes or proteins as independent units. The independence assumption ignores the associations among biomolecules with similar functions or cellular localization, as well as the interactions among them manifested as changes in expression ratios. Consequently, these methods often underestimate the associations among biomolecules and cause false positives in practice. Some studies incorporate the sample covariance matrix into the calculation to address this issue. However, sample covariance may not be a precise estimate if the sample size is very limited, which is usually the case for data produced by mass spectrometry. In this study, we introduce a multivariate test under a self-contained null to perform pathway analysis for quantitative proteomic data. The covariance matrix used in the test statistic is constructed from the confidence scores retrieved from the STRING database or the HitPredict database. We also design an integrating procedure to retain pathways of sufficient evidence as a pathway group. The performance of the proposed T2-statistic is demonstrated using five published experimental datasets: the T-cell activation, the cAMP/PKA signaling, the myoblast differentiation, and the effect of dasatinib on the BCR-ABL pathway are proteomic datasets produced by mass spectrometry; and the protective effect of myocilin via the MAPK signaling pathway is a gene expression dataset of limited sample size. Compared with other popular statistics, the proposed T2-statistic yields more accurate descriptions in agreement with the discussion of the original publication. We implemented the T2-statistic into an R package T2GA, which is available at https://github.com/roqe/T2GA.
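A rough sketch of a Hotelling-type T2 that uses an externally supplied, knowledge-based covariance in place of the sample covariance. This is our reading of the idea, not the T2GA implementation, and all numbers are illustrative:

```python
import numpy as np
from scipy.stats import chi2

log_ratios = np.array([0.8, 1.1, -0.2, 0.9])   # mean log expression ratios (made up)
conf = np.array([[1.0, 0.7, 0.1, 0.3],         # prior confidence scores used as a
                 [0.7, 1.0, 0.2, 0.4],         # correlation surrogate (STRING-like,
                 [0.1, 0.2, 1.0, 0.1],         # illustrative values only)
                 [0.3, 0.4, 0.1, 1.0]])
n = 6                                          # number of replicates (assumed)

t2 = n * log_ratios @ np.linalg.solve(conf, log_ratios)
p = chi2.sf(t2, df=len(log_ratios))            # asymptotic chi-square reference
print(f"T2 = {t2:.2f}, p = {p:.4f}")
```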
Development of low cost medium for ethanol production from syngas by Clostridium ragsdalei.
Gao, Jie; Atiyeh, Hasan K; Phillips, John R; Wilkins, Mark R; Huhnke, Raymond L
2013-11-01
The development of a low cost medium for ethanol production is critical for process feasibility. Ten media were formulated for Clostridium ragsdalei by reduction, elimination and replacement of expensive nutrients. Cost analysis and effects of medium components on growth and product formation were investigated. Fermentations were performed in 250 mL bottles using syngas (20% CO, 15% CO2, 5% H2 and 60% N2). The standard medium M1 costs $9.83/L, of which 93% is attributed to morpholinoethane sulfonic acid (MES) buffer. Statistical analysis of the results showed that MES removal did not affect cell growth or ethanol production (P>0.05). Based on the cells' elemental composition, a minimal mineral concentration medium M7 was formulated, which provided 29% higher ethanol yield from CO at 3% of the cost of medium M1. Ethanol yield from CO in the completely defined medium M9 was 36% higher than in medium M1, at 5% of its cost. Copyright © 2013 Elsevier Ltd. All rights reserved.
Are Shunt Revisions Associated with IQ in Congenital Hydrocephalus? A Meta-Analysis.
Arrington, C Nikki; Ware, Ashley L; Ahmed, Yusra; Kulesz, Paulina A; Dennis, Maureen; Fletcher, Jack M
2016-12-01
Although it is generally acknowledged that shunt revisions are associated with reductions in cognitive functions in individuals with congenital hydrocephalus, the literature yields mixed results and is inconclusive. The current study used meta-analytic methods to empirically synthesize studies addressing the association of shunt revisions and IQ in individuals with congenital hydrocephalus. Six studies and three in-house datasets yielded 11 independent samples for meta-analysis. Groups representing lower and higher numbers of shunt revisions were coded to generate effect sizes for differences in IQ scores. The mean effect size across studies was statistically significant, but small (Hedges' g = 0.25, p < 0.001, 95% CI [0.08, 0.43]), with more shunt revisions associated with lower IQ scores. Results show an association between more shunt revisions and lower IQ of about 3 IQ points, a small effect, but within the error of measurement associated with IQ tests. Although the clinical significance of this effect is not clear, the results suggest that repeated shunt revisions because of shunt failure are associated with a reduction in cognitive functions.
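A sketch of inverse-variance (fixed-effect) pooling of effect sizes such as Hedges' g; the per-sample effects and variances below are placeholders, not the meta-analysis data:

```python
import numpy as np

# Hypothetical effect sizes (Hedges' g) and variances for 11 samples.
g = np.array([0.10, 0.35, 0.22, 0.40, 0.15, 0.28, 0.31, 0.18, 0.26, 0.20, 0.33])
v = np.array([0.04, 0.06, 0.05, 0.08, 0.03, 0.05, 0.07, 0.04, 0.06, 0.05, 0.06])

w = 1.0 / v                                   # inverse-variance weights
g_bar = np.sum(w * g) / np.sum(w)             # pooled effect size
se = np.sqrt(1.0 / np.sum(w))                 # standard error of the pooled effect
print(f"pooled g = {g_bar:.2f}, "
      f"95% CI = [{g_bar - 1.96*se:.2f}, {g_bar + 1.96*se:.2f}]")
```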
Chatterji, Madhabi
2002-01-01
This study examines validity of data generated by the School Readiness for Reforms: Leader Questionnaire (SRR-LQ) using an iterative procedure that combines classical and Rasch rating scale analysis. Following content-validation and pilot-testing, principal axis factor extraction and promax rotation of factors yielded a five factor structure consistent with the content-validated subscales of the original instrument. Factors were identified based on inspection of pattern and structure coefficients. The rotated factor pattern, inter-factor correlations, convergent validity coefficients, and Cronbach's alpha reliability estimates supported the hypothesized construct properties. To further examine unidimensionality and efficacy of the rating scale structures, item-level data from each factor-defined subscale were subjected to analysis with the Rasch rating scale model. Data-to-model fit statistics and separation reliability for items and persons met acceptable criteria. Rating scale results suggested consistency of expected and observed step difficulties in rating categories, and correspondence of step calibrations with increases in the underlying variables. The combined approach yielded more comprehensive diagnostic information on the quality of the five SRR-LQ subscales; further research is continuing.
NASA Astrophysics Data System (ADS)
Codrington, Martin John Michael
The Quark Gluon Plasma (QGP) is a form of matter in which quarks and gluons are deconfined, and was suggested to be formed in high-energy heavy-ion collisions. Since the discovery of high-pT hadron suppression in central Au+Au collisions at the Relativistic Heavy Ion Collider (RHIC), and the related discovery of the quenching of the away-side jet in these collisions, the role of jets as key probes of the QGP was reaffirmed. The Solenoidal Tracker At RHIC (STAR) detector system, which is suited for jet studies because of its large solid-angle coverage, has produced a number of interesting jet measurements in recent years, including γ-jet measurements, attempts at full heavy-ion jet reconstruction, and two-dimensional correlations. A long-range correlation in pseudorapidity (the "Ridge") was studied (with statistical significance) out to pT^trig ≲ 7 GeV/c and was assumed to have an integrated yield independent of pT^trig. Further studies out to higher pT were limited by the minimum-bias statistics taken in Run 4 (2004) with STAR. This work presents results of a ridge analysis with (non-reconstructed) π0s and direct-γ-rich triggers out to ~13.5 GeV/c in pT^trig using triggered data from Run 7 (2007) and Run 10 (2010) Au+Au collisions detected with STAR. Preliminary results seem to indicate that the ridge yield decreases with pT^trig, and that the ridge yield for direct-γ-rich triggers is consistent with zero.
Evaluating the capabilities of watershed-scale models in estimating sediment yield at field-scale.
Sommerlot, Andrew R; Nejadhashemi, A Pouyan; Woznicki, Sean A; Giri, Subhasis; Prohaska, Michael D
2013-09-30
Many watershed model interfaces have been developed in recent years for predicting field-scale sediment loads. They share the goal of providing data for decisions aimed at improving watershed health and the effectiveness of water quality conservation efforts. The objectives of this study were to: 1) compare three watershed-scale models (Soil and Water Assessment Tool (SWAT), Field_SWAT, and the High Impact Targeting (HIT) model) against a calibrated field-scale model (RUSLE2) in estimating sediment yield from 41 randomly selected agricultural fields within the River Raisin watershed; 2) evaluate the statistical significance of differences among models; 3) assess the watershed models' capabilities in identifying areas of concern at the field level; 4) evaluate the reliability of the watershed-scale models for field-scale analysis. The SWAT model produced the estimates most similar to RUSLE2, providing the closest median and the lowest absolute error in sediment yield predictions, while the HIT model estimates were the worst. Concerning statistically significant differences between models, SWAT was the only model found to be not significantly different from the calibrated RUSLE2 at α = 0.05. Meanwhile, all models were incapable of identifying priority areas similar to the RUSLE2 model. Overall, SWAT provided the most correct estimates (51%) within the uncertainty bounds of RUSLE2 and is the most reliable among the studied models, while HIT is the least reliable. The results of this study suggest caution should be exercised when using watershed-scale models for field-level decision-making, while field-specific data remain of paramount importance. Copyright © 2013 Elsevier Ltd. All rights reserved.
Reese, Sarah E; Archer, Kellie J; Therneau, Terry M; Atkinson, Elizabeth J; Vachon, Celine M; de Andrade, Mariza; Kocher, Jean-Pierre A; Eckel-Passow, Jeanette E
2013-11-15
Batch effects are due to probe-specific systematic variation between groups of samples (batches) resulting from experimental features that are not of biological interest. Principal component analysis (PCA) is commonly used as a visual tool to determine whether batch effects exist after applying a global normalization method. However, PCA yields linear combinations of the variables that contribute maximum variance and thus will not necessarily detect batch effects if they are not the largest source of variability in the data. We present an extension of PCA to quantify the existence of batch effects, called guided PCA (gPCA). We describe a test statistic that uses gPCA to test whether a batch effect exists. We apply our proposed test statistic derived using gPCA to simulated data and to two copy number variation case studies: the first study consisted of 614 samples from a breast cancer family study using Illumina Human 660 bead-chip arrays, whereas the second case study consisted of 703 samples from a family blood pressure study that used Affymetrix SNP Array 6.0. We demonstrate that our statistic has good statistical properties and is able to identify significant batch effects in two copy number variation case studies. We developed a new statistic that uses gPCA to identify whether batch effects exist in high-throughput genomic data. Although our examples pertain to copy number data, gPCA is general and can be used on other data types as well. The gPCA R package (available via CRAN) provides functionality and data to perform the methods in this article. Contact: reesese@vcu.edu
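A rough sketch of the guided-PCA idea as we read it: compare the variance along the batch-guided direction with that along the first ordinary principal component. The published gPCA R package is the authoritative implementation; everything below is an illustrative approximation:

```python
import numpy as np

rng = np.random.default_rng(3)
n, p = 60, 200
batch = np.repeat([0, 1, 2], n // 3)                 # three batches of 20 samples
X = rng.normal(size=(n, p)) + 0.8 * batch[:, None]   # injected batch shift
X = X - X.mean(axis=0)

Y = np.eye(3)[batch]                                 # batch indicator matrix
_, _, Vg = np.linalg.svd(Y.T @ X, full_matrices=False)  # batch-guided direction
_, _, V = np.linalg.svd(X, full_matrices=False)         # ordinary PCA directions

delta = np.var(X @ Vg[0]) / np.var(X @ V[0])
print(f"delta = {delta:.3f}")   # values near 1 suggest batch drives the top PC
```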
NASA Astrophysics Data System (ADS)
da Silva, Roberto; Vainstein, Mendeli H.; Gonçalves, Sebastián; Paula, Felipe S. F.
2013-08-01
Statistics of soccer tournament scores based on the double round robin system of several countries are studied. Exploring the dynamics of team scoring during tournament seasons from recent years we find evidence of superdiffusion. A mean-field analysis results in a drift velocity equal to that of real data but in a different diffusion coefficient. Along with the analysis of real data we present the results of simulations of soccer tournaments obtained by an agent-based model which successfully describes the final scoring distribution [da Silva et al., Comput. Phys. Commun. 184, 661 (2013), doi:10.1016/j.cpc.2012.10.030]. Such a model yields random walks of scores over time with the same anomalous diffusion as observed in real data.
Yang, Shuman; Luo, Yunhua; Yang, Lang; Dall'Ara, Enrico; Eastell, Richard; Goertzen, Andrew L; McCloskey, Eugene V; Leslie, William D; Lix, Lisa M
2018-05-01
Dual-energy X-ray absorptiometry (DXA)-based finite element analysis (FEA) has been studied for assessment of hip fracture risk. Femoral strength (FS) is the maximum force that the femur can sustain before its weakest region reaches the yielding limit. Fracture risk index (FRI), which also considers subject-specific impact force, is defined as the ratio of von Mises stress induced by a sideways fall to the bone yield stress over the proximal femur. We compared risk stratification for prior hip fracture using FS and FRI derived from DXA-based FEA. The study cohort included women aged ≥65 years undergoing baseline hip DXA, with femoral neck T-scores < -1 and no osteoporosis treatment; 324 cases had prior hip fracture and 655 controls had no prior fracture. Using anonymized DXA hip scans, we measured FS and FRI. Separate multivariable logistic regression models were used to estimate odds ratios (ORs), c-statistics and their 95% confidence intervals (95% CIs) for the association of hip fracture with FS and FRI. Increased hip fracture risk was associated with lower FS (OR per SD 1.36, 95% CI: 1.15, 1.62) and higher FRI (OR per SD 1.99, 95% CI: 1.63, 2.43) after adjusting for Fracture Risk Assessment Tool (FRAX) hip fracture probability computed with bone mineral density (BMD). The c-statistic for the model containing FS (0.69; 95% CI: 0.65, 0.72) was lower than the c-statistic for the model with FRI (0.77; 95% CI: 0.74, 0.80) or femoral neck BMD (0.74; 95% CI: 0.71, 0.77; all P < 0.05). FS and FRI were independently associated with hip fracture, but there were differences in performance characteristics. Copyright © 2018 Elsevier Inc. All rights reserved.
Böcker, Ulrich; Dinter, Dietmar; Litterer, Caroline; Hummel, Frank; Knebel, Phillip; Franke, Andreas; Weiss, Christel; Singer, Manfred V; Löhr, J-Matthias
2010-04-01
New technology has considerably advanced the diagnosis of small-bowel pathology. However, its significance in clinical algorithms has not yet been fully assessed. The aim of the present analysis was to compare the diagnostic utility and yield of video-capsule enteroscopy (VCE) to that of magnetic resonance imaging (MRI) in patients with suspected or established Crohn's disease (Group I), obscure gastrointestinal blood loss (Group II), or suspected tumors (Group III). Forty-six out of 182 patients who underwent both modalities were included: 21 in Group I, 20 in Group II, and five in Group III. Pathology was assessed in three predetermined sections of the small bowel (upper, middle, and lower). The McNemar and Wilcoxon tests were used for statistical analysis. In Group I, lesions were found by VCE in nine of the 21 patients and by MRI in six. In five patients, both modalities showed pathology. In Group II, pathological changes were detected in 11 of the 20 patients by VCE and in eight patients by MRI. In five cases, pathology was found with both modalities. In Group III, neither modality showed small-bowel pathology. For the patient groups combined, diagnostic yield was 43% with VCE and 30% with MRI. The diagnostic yield of VCE was superior to that of MRI in the upper small bowel in both Groups I and II. VCE is superior to MRI for the detection of lesions related to Crohn's disease or obscure gastrointestinal bleeding in the upper small bowel.
Alternatives to Crop Insurance for Mitigating Hydrologic Risk in the Upper Mississippi River Basin
NASA Astrophysics Data System (ADS)
Baker, J. M.; Griffis, T. J.; Gorski, G.; Wood, J. D.
2015-12-01
Corn and soybean production in the Upper Mississippi River Basin can be limited by either excess or shortage of water, often in the same year within the same watershed. Most producers indemnify themselves against these hazards through the Federal crop insurance program, which is heavily subsidized, thus discouraging expenditures on other forms of risk mitigation. The cost is not trivial, amounting to more than 60 billion USD over the past 15 years. Examination of long-term precipitation and streamflow records at the 8-digit scale suggests that inter-annual hydrologic variability in the region is increasing, particularly in an area stretching from NW IL through much of IA and southern MN. Analysis of crop insurance statistics shows that these same watersheds exhibit the highest frequency of coincident claims for yield losses to both excess water and drought within the same year. An emphasis on development of water management strategies to increase landscape storage and subsequent reuse through supplemental irrigation in this region could reduce the cost of the crop insurance program and stabilize yield. However, we also note that analysis of yield data from USDA-NASS shows that interannual yield variability at the watershed scale is much more muted than the indemnity data suggest, indicating that adverse selection is probably a factor in the crop insurance marketplace. Consequently, we propose that hydrologic mitigation practices may be most cost-effective if they are carefully targeted, using topographic, soil, and meteorological data, in combination with more site-specificity in crop insurance data.
Transfer Student Success: Educationally Purposeful Activities Predictive of Undergraduate GPA
ERIC Educational Resources Information Center
Fauria, Renee M.; Fuller, Matthew B.
2015-01-01
Researchers evaluated the effects of Educationally Purposeful Activities (EPAs) on transfer and nontransfer students' cumulative GPAs. Hierarchical, linear, and multiple regression models yielded seven statistically significant educationally purposeful items that influenced undergraduate student GPAs. Statistically significant positive EPAs for…
A statistical comparison of two carbon fiber/epoxy fabrication techniques
NASA Technical Reports Server (NTRS)
Hodge, A. J.
1991-01-01
A statistical comparison of the compression strengths of specimens fabricated by either a platen press or an autoclave was performed on IM6/3501-6 carbon/epoxy composites with a 16-ply (0/+45/90/-45)_S2 lay-up configuration. The samples were cured with the same parameters and processing materials. It was found that the autoclaved panels were thicker than the platen-press-cured samples. Two hundred samples of each type of cure process were compression tested. The autoclaved samples had an average strength of 450 MPa (65.5 ksi), while the press-cured samples had an average strength of 370 MPa (54.0 ksi). A Weibull analysis of the data showed that there is only a 30 percent probability that the two types of cure systems yield specimens that can be considered from the same family.
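A sketch of a two-parameter Weibull fit of strength data like that described above; the samples are simulated stand-ins for the 200 measurements per cure process, not the NASA data:

```python
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(4)
autoclave = rng.normal(450, 35, 200)   # placeholder strengths, MPa
press = rng.normal(370, 30, 200)

for name, data in [("autoclave", autoclave), ("press", press)]:
    shape, loc, scale = weibull_min.fit(data, floc=0)   # two-parameter fit
    print(f"{name}: Weibull modulus = {shape:.1f}, scale = {scale:.0f} MPa")
```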
VISSR Atmospheric Sounder (VAS) simulation experiment for a severe storm environment
NASA Technical Reports Server (NTRS)
Chesters, D.; Uccellini, L. W.; Mostek, A.
1981-01-01
Radiance fields were simulated for prethunderstorm environments in Oklahoma to demonstrate three points: (1) significant moisture gradients can be seen directly in images of the VISSR Atmospheric Sounder (VAS) channels; (2) temperature and moisture profiles can be retrieved from VAS radiances with sufficient accuracy to be useful for mesoscale analysis of a severe storm environment; and (3) the quality of VAS mesoscale soundings improves with conditioning by local weather statistics. The results represent the optimum retrievability of mesoscale information from VAS radiances without the use of ancillary data. The simulations suggest that VAS data will yield the best soundings when a human being classifies the scene, picks relatively clear areas for retrieval, and applies a "local" statistical data base to resolve the ambiguities of satellite observations in favor of the most probable atmospheric structure.
Statistical analysis of experimental multifragmentation events in 64Zn+112Sn at 40 MeV/nucleon
NASA Astrophysics Data System (ADS)
Lin, W.; Zheng, H.; Ren, P.; Liu, X.; Huang, M.; Wada, R.; Chen, Z.; Wang, J.; Xiao, G. Q.; Qu, G.
2018-04-01
A statistical multifragmentation model (SMM) is applied to the experimentally observed multifragmentation events in an intermediate-energy heavy-ion reaction. Using the temperature and symmetry energy extracted with the isobaric yield ratio (IYR) method based on the modified Fisher model (MFM), SMM is applied to the reaction 64Zn+112Sn at 40 MeV/nucleon. The experimental isotope and mass distributions of the primary reconstructed fragments are compared with SMM results without an afterburner, and they are well reproduced. The extracted temperature T and symmetry energy coefficient asym from SMM-simulated events, using the IYR method, are also consistent with those from the experiment. These results strongly suggest that in the multifragmentation process there is a freezeout volume, in which thermal and chemical equilibrium is established before or at the time of intermediate-mass fragment emission.
Badhan, Ajay; Wang, Yu-Xi; Gruninger, Robert; Patton, Donald; Powlowski, Justin; Tsang, Adrian; McAllister, Tim A
2015-01-01
Identification of recalcitrant factors that limit digestion of forages and the development of enzymatic approaches that improve hydrolysis could play a key role in improving the efficiency of meat and milk production in ruminants. Enzyme fingerprinting of barley silage fed to heifers and of total tract indigestible fibre residue (TIFR) collected from feces was used to identify cell wall components resistant to total tract digestion. Enzyme fingerprinting results identified acetyl xylan esterases as key to enhanced ruminal digestion. FTIR analysis also suggested cross-linked cell wall polymers were principal components of the undigested fiber residues in feces. Based on structural information from enzymatic fingerprinting and FTIR, an enzyme pretreatment to enhance glucose yield from barley straw and alfalfa hay upon exposure to mixed rumen enzymes was developed. Prehydrolysis effects of recombinant fungal fibrolytic hydrolases were analyzed using a microassay in combination with statistical experimental design. Recombinant hemicellulases and auxiliary enzymes initiated degradation of plant structural polysaccharides upon application and improved the in vitro saccharification of alfalfa and barley straw by mixed rumen enzymes. The validation results showed that microassay in combination with statistical experimental design can be successfully used to predict effective enzyme pretreatments that enhance plant cell wall digestion by mixed rumen enzymes.
Estimators of The Magnitude-Squared Spectrum and Methods for Incorporating SNR Uncertainty
Lu, Yang; Loizou, Philipos C.
2011-01-01
Statistical estimators of the magnitude-squared spectrum are derived based on the assumption that the magnitude-squared spectrum of the noisy speech signal can be computed as the sum of the (clean) signal and noise magnitude-squared spectra. Maximum a posteriori (MAP) and minimum mean square error (MMSE) estimators are derived based on a Gaussian statistical model. The gain function of the MAP estimator was found to be identical to the gain function used in the ideal binary mask (IdBM) that is widely used in computational auditory scene analysis (CASA). As such, it was binary and assumed the value of 1 if the local SNR exceeded 0 dB, and assumed the value of 0 otherwise. By modeling the local instantaneous SNR as an F-distributed random variable, soft masking methods were derived incorporating SNR uncertainty. The soft masking method, in particular, which weighted the noisy magnitude-squared spectrum by the a priori probability that the local SNR exceeds 0 dB, was shown to be identical to the Wiener gain function. Results indicated that the proposed estimators yielded significantly better speech quality than the conventional MMSE spectral power estimators, in terms of yielding lower residual noise and lower speech distortion. PMID:21886543
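A short sketch contrasting the two gain functions discussed above, the ideal binary mask and the Wiener gain, as functions of the a priori SNR (the paper's full estimator derivations are not reproduced here):

```python
import numpy as np

xi = np.logspace(-2, 2, 9)                 # a priori SNR on a linear scale
binary_mask = (xi > 1.0).astype(float)     # IdBM-style gain: 1 iff SNR > 0 dB
wiener_gain = xi / (1.0 + xi)              # soft, SNR-weighted counterpart

for x, b, w in zip(xi, binary_mask, wiener_gain):
    print(f"SNR = {10*np.log10(x):6.1f} dB   binary = {b:.0f}   wiener = {w:.2f}")
```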
NASA Astrophysics Data System (ADS)
Böttger, Simon; Hermann, Sascha; Schulz, Stefan E.; Gessner, Thomas
2016-10-01
For an industrial realization of devices based on single-walled carbon nanotubes (SWCNTs), such as field-effect transistors (FETs), it becomes increasingly important to consider technological aspects such as intrinsic device structure, integration process controllability, and yield. From the perspective of a wafer-level integration technology, the influence of SWCNT length on the performance of short-channel CNT-FETs is demonstrated by means of a statistical and comparative study. Therefore, a methodological development of a length separation process based on size-exclusion chromatography was conducted in order to extract well-separated SWCNT dispersions with narrowed length distribution. It could be shown that short SWCNTs adversely affect integrability and reproducibility, underlined by a 25% decline of the integration yield with respect to long SWCNTs. Furthermore, it turns out that the significant changes in electrical performance are directly linked to SWCNT chain formation in the transistor channel. In particular, CNT-FETs with long SWCNTs outperform those with reference and short SWCNTs with respect to hole mobility and subthreshold controllability by up to 300% and up to 140%, respectively. As a whole, this study provides a statistical and comparative analysis towards chain-less CNT-FETs fabricated with a wafer-level technology.
Simultaneous Analysis and Quality Assurance for Diffusion Tensor Imaging
Lauzon, Carolyn B.; Asman, Andrew J.; Esparza, Michael L.; Burns, Scott S.; Fan, Qiuyun; Gao, Yurui; Anderson, Adam W.; Davis, Nicole; Cutting, Laurie E.; Landman, Bennett A.
2013-01-01
Diffusion tensor imaging (DTI) enables non-invasive, cyto-architectural mapping of in vivo tissue microarchitecture through voxel-wise mathematical modeling of multiple magnetic resonance imaging (MRI) acquisitions, each differently sensitized to water diffusion. DTI computations are fundamentally estimation processes and are sensitive to noise and artifacts. Despite widespread adoption in the neuroimaging community, maintaining consistent DTI data quality remains challenging given the propensity for patient motion, artifacts associated with fast imaging techniques, and the possibility of hardware changes/failures. Furthermore, the quantity of data acquired per voxel, the non-linear estimation process, and numerous potential use cases complicate traditional visual data inspection approaches. Currently, quality inspection of DTI data has relied on visual inspection and individual processing in DTI analysis software programs (e.g. DTIPrep, DTI-studio). However, recent advances in applied statistical methods have yielded several different metrics to assess noise level, artifact propensity, quality of tensor fit, variance of estimated measures, and bias in estimated measures. To date, these metrics have been largely studied in isolation. Herein, we select complementary metrics for integration into an automatic DTI analysis and quality assurance pipeline. The pipeline completes in 24 hours, stores statistical outputs, and produces a graphical summary quality analysis (QA) report. We assess the utility of this streamlined approach for empirical quality assessment on 608 DTI datasets from pediatric neuroimaging studies. The efficiency and accuracy of quality analysis using the proposed pipeline is compared with quality analysis based on visual inspection. The unified pipeline is found to save a statistically significant amount of time (over 70%) while improving the consistency of QA between a DTI expert and a pool of research associates. Projection of QA metrics to a low-dimensional manifold reveals qualitative, but clear, QA-study associations and suggests that automated outlier/anomaly detection would be feasible. PMID:23637895
NASA Astrophysics Data System (ADS)
Prasanna, V.
2018-01-01
This study makes use of temperature and precipitation from CMIP5 climate model output for climate change application studies over the Indian region during the summer monsoon season (JJAS). Bias correction of temperature and precipitation from CMIP5 GCM simulation results with respect to observations is discussed in detail. Non-linear statistical bias correction is a suitable method for climate change data because it is simple and does not add artificial uncertainties to the impact assessment of climate change scenarios for climate change application studies (agricultural production changes) in the future. The simple statistical bias correction uses observational constraints on the GCM baseline, and the projected results are scaled with respect to the changing magnitude in future scenarios, varying from one model to the other. Two types of bias correction techniques are shown here: (1) a simple bias correction using a percentile-based quantile-mapping algorithm and (2) a simple but improved bias correction method, a cumulative distribution function (CDF; Weibull distribution function)-based quantile-mapping algorithm. This study shows that the percentile-based quantile mapping method gives results similar to the CDF (Weibull)-based quantile mapping method, and the two methods are comparable. The bias correction is applied to temperature and precipitation variables for present climate and future projected data, which are then used in a simple statistical model to understand future changes in crop production over the Indian region during the summer monsoon season. In total, 12 CMIP5 models are used for Historical (1901-2005), RCP4.5 (2005-2100), and RCP8.5 (2005-2100) scenarios. The climate index from each CMIP5 model and the observed agricultural yield index over the Indian region are used in a regression model to project changes in agricultural yield under the RCP4.5 and RCP8.5 scenarios. The results revealed a better convergence of model projections in the bias-corrected data compared to the uncorrected data. The study can be extended to localized regional domains aimed at understanding future changes in agricultural productivity with an agro-economic or a simple statistical model. The statistical model indicated that total food grain yield will increase over the Indian region in the future: by approximately 50 kg/ha under the RCP4.5 scenario from 2001 until the end of 2100, and by approximately 90 kg/ha under the RCP8.5 scenario over the same period. Many studies have used bias correction techniques, but this study applies bias correction to future climate scenario data from CMIP5 models and couples it with crop statistics to find future crop yield changes over the Indian region.
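A minimal sketch of percentile-based quantile mapping, the first of the two bias correction techniques described above; all arrays are synthetic placeholders, not CMIP5 output:

```python
import numpy as np

def quantile_map(model_future, model_baseline, observed):
    """Map model values onto the observed distribution, percentile by percentile."""
    q = np.searchsorted(np.sort(model_baseline), model_future) / len(model_baseline)
    q = np.clip(q, 0.001, 0.999)
    return np.quantile(observed, q)

rng = np.random.default_rng(5)
obs = rng.gamma(4.0, 2.0, 1000)        # stand-in for observed JJAS values
hist = rng.gamma(3.0, 2.5, 1000)       # biased model baseline
future = rng.gamma(3.2, 2.6, 1000)     # projection carrying the same bias

corrected = quantile_map(future, hist, obs)
print(f"raw mean = {future.mean():.2f}, corrected mean = {corrected.mean():.2f}")
```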
Commercially sterilized mussel meats (Mytilus chilensis): a study on process yield.
Almonacid, S; Bustamante, J; Simpson, R; Urtubia, A; Pinto, M; Teixeira, A
2012-06-01
The processing steps most responsible for yield loss in the manufacture of canned mussel meats are the thermal treatments of precooking to remove meats from shells, and thermal processing (retorting) to render the final canned product commercially sterile for long-term shelf stability. The objective of this study was to investigate and evaluate the impact of different combinations of process variables on the ultimate drained weight of the final mussel product (Mytilus chilensis), while verifying that any differences found were statistically and economically significant. The process variables selected for this study were precooking time, brine salt concentration, and retort temperature. Results indicated two combinations of process variables producing the widest difference in final drained weight, designated the best combination and the worst combination, with 35% and 29% yield, respectively. The significance of this difference was determined by employing a Bootstrap methodology, which assumes an empirical distribution of statistical error. A difference of nearly 6 percentage points in total yield was found, representing a 20% increase in annual sales from the same quantity of raw material. In addition to the increase in yield, the conditions for the best process included a retort process time 65% shorter than that for the worst process. These differences could have a significant economic impact, important to the mussel canning industry. © 2012 Institute of Food Technologists®
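A sketch of the bootstrap comparison described above, with simulated per-batch yields centered on the reported 35% and 29% means (the study's actual error distribution is not reproduced):

```python
import numpy as np

rng = np.random.default_rng(6)
best = rng.normal(0.35, 0.03, 30)      # hypothetical per-batch yields
worst = rng.normal(0.29, 0.03, 30)

diffs = [rng.choice(best, best.size).mean() - rng.choice(worst, worst.size).mean()
         for _ in range(10_000)]       # resample with replacement
lo, hi = np.percentile(diffs, [2.5, 97.5])
print(f"95% bootstrap CI for the yield difference: [{lo:.3f}, {hi:.3f}]")
```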
NASA Astrophysics Data System (ADS)
Zhang, Yi; Zhao, Yanxia; Wang, Chunyi; Chen, Sining
2017-11-01
Assessment of the impact of climate change on crop production with consideration of uncertainties is essential for properly identifying and deciding on agricultural practices that are sustainable. In this study, we employed 24 climate projections consisting of the combinations of eight GCMs and three emission scenarios, representing climate projection uncertainty, and two crop statistical models with 100 sets of parameters in each model, representing parameter uncertainty within the crop models. The goal of this study was to evaluate the impact of climate change on maize (Zea mays L.) yield at three locations (Benxi, Changling, and Hailun) across Northeast China (NEC) in the periods 2010-2039 and 2040-2069, taking 1976-2005 as the baseline period. The multi-model ensemble method is an effective way to deal with the uncertainties. The results of ensemble simulations showed that maize yield reductions were less than 5% in both future periods relative to the baseline. To further understand the contributions of individual sources of uncertainty, such as climate projections and crop model parameters, to ensemble yield simulations, variance decomposition was performed. The results indicated that the uncertainty from climate projections was much larger than that contributed by crop model parameters. Increased ensemble yield variance revealed increasing uncertainty in the yield simulations in the future periods.
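A sketch of the variance decomposition described above, splitting ensemble yield changes into climate-projection and crop-model-parameter components; all values are random placeholders, not simulation output:

```python
import numpy as np

rng = np.random.default_rng(7)
n_clim, n_par = 24, 100
yield_change = (rng.normal(0, 4, (n_clim, 1))        # climate-projection effect
                + rng.normal(0, 1, (1, n_par))       # crop-parameter effect
                + rng.normal(0, 1, (n_clim, n_par))) # interaction/noise

grand = yield_change.mean()
ss_clim = n_par * np.sum((yield_change.mean(axis=1) - grand) ** 2)
ss_par = n_clim * np.sum((yield_change.mean(axis=0) - grand) ** 2)
ss_tot = np.sum((yield_change - grand) ** 2)
print(f"climate share = {ss_clim/ss_tot:.2f}, parameter share = {ss_par/ss_tot:.2f}")
```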
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gaspero, Mario
2010-08-05
A narrow peak in the π+π− mass distribution was seen by the Rome-Syracuse Collaboration in p̄n → 2π+3π− annihilation at rest in 1970. It was ignored for 40 years. The reanalysis of this peak finds that it has a mass of 757.4 ± 2.8(stat) ± 1.2(sys) MeV/c² and a width consistent with the experimental resolution. The evidence for the peak is 5.2 standard deviations. The peak is generated in (1.03 ± 0.21(stat) ± 0.21(sys))% of the p̄n annihilations at rest. No spin analysis is possible with the statistics of the experiment, but there are arguments suggesting that it has J^P = 0^+.
Barekati-Goudarzi, Mohamad; Boldor, Dorin; Nde, Divine B
2016-02-01
In-situ transesterification (simultaneous extraction and transesterification) of Chinese tallow tree seeds into methyl esters using a batch microwave system was investigated in this study. A high degree of oil extraction and efficient conversion of oil to biodiesel were found in the proposed range. The process was further optimized in terms of product yields and conversion rates using the Doehlert optimization methodology. Based on the experimental results and statistical analysis, the optimal production yield conditions for this process were determined as: catalyst concentration of 1.74 wt.%, solvent ratio of about 3 (v/w), reaction time of 20 min and temperature of 58.1 °C. ¹H NMR was used to calculate reaction conversion. All methyl esters produced using this method met ASTM biodiesel quality specifications. Copyright © 2015 Elsevier Ltd. All rights reserved.
Brown, Christopher U; Jacob, Gregor; Stoudt, Mark; Moylan, Shawn; Slotwinski, John; Donmez, Alkan
2016-08-01
Six different organizations participated in this interlaboratory study to quantify the variability in the tensile properties of Inconel 625 specimens manufactured using laser-powder-bed-fusion additive manufacturing machines. The tensile specimens were heat treated and tensile tests conducted until failure. The properties measured were yield strength, ultimate tensile strength, elastic modulus, and elongation. Statistical analysis revealed that between-participant variability for yield strength, ultimate tensile strength, and elastic modulus values were significantly higher (up to 4 times) than typical within-participant variations. Only between-participant and within-participant variability were both similar for elongation. A scanning electron microscope was used to examine one tensile specimen for fractography. The fracture surface does not have many secondary cracks or other features that would reduce the mechanical properties. In fact, the features largely consist of microvoid coalescence and are entirely consistent with ductile failure.
Karpe, Avinash V; Beale, David J; Godhani, Nainesh B; Morrison, Paul D; Harding, Ian H; Palombo, Enzo A
2015-12-16
Winery-derived biomass waste was degraded by Penicillium chrysogenum under solid state fermentation over 8 days in a ²H₂O-supplemented medium. Multivariate statistical analysis of the gas chromatography-mass spectrometry (GC-MS) data resulted in the identification of 94 significant metabolites within 28 different metabolic pathways. The majority of biomass sugars were utilized by day 4 to yield products such as sugars, fatty acids, isoprenoids, and amino acids. The fungus was observed to metabolize xylose to xylitol, an intermediate of ethanol production. However, enzyme inhibition and autolysis were observed from day 6, indicating 5 days as the optimal time for fermentation. P. chrysogenum displayed metabolism of pentoses (to alcohols) and degraded tannins and lignins, properties that are lacking in other biomass-degrading ascomycetes. Rapid fermentation (3-5 days) may not only increase the pentose metabolizing efficiency but also increase the yield of medicinally important metabolites, such as syringate.
NASA Astrophysics Data System (ADS)
Abgrall, N.; Aduszkiewicz, A.; Ajaz, M.; Ali, Y.; Andronov, E.; Antićić, T.; Antoniou, N.; Baatar, B.; Bay, F.; Blondel, A.; Blümer, J.; Bogomilov, M.; Brandin, A.; Bravar, A.; Brzychczyk, J.; Bunyatov, S. A.; Busygina, O.; Christakoglou, P.; Ćirković, M.; Czopowicz, T.; Davis, N.; Debieux, S.; Dembinski, H.; Deveaux, M.; Diakonos, F.; Di Luise, S.; Dominik, W.; Dumarchez, J.; Dynowski, K.; Engel, R.; Ereditato, A.; Feofilov, G. A.; Fodor, Z.; Garibov, A.; Gaździcki, M.; Golubeva, M.; Grebieszkow, K.; Grzeszczuk, A.; Guber, F.; Haesler, A.; Hasegawa, T.; Hervé, A. E.; Hierholzer, M.; Igolkin, S.; Ivashkin, A.; Johnson, S. R.; Kadija, K.; Kapoyannis, A.; Kaptur, E.; Kisiel, J.; Kobayashi, T.; Kolesnikov, V. I.; Kolev, D.; Kondratiev, V. P.; Korzenev, A.; Kowalik, K.; Kowalski, S.; Koziel, M.; Krasnoperov, A.; Kuich, M.; Kurepin, A.; Larsen, D.; László, A.; Lewicki, M.; Lyubushkin, V. V.; Maćkowiak-Pawłowska, M.; Maksiak, B.; Malakhov, A. I.; Manić, D.; Marcinek, A.; Marino, A. D.; Marton, K.; Mathes, H.-J.; Matulewicz, T.; Matveev, V.; Melkumov, G. L.; Messerly, B.; Mills, G. B.; Morozov, S.; Mrówczyński, S.; Nagai, Y.; Nakadaira, T.; Naskręt, M.; Nirkko, M.; Nishikawa, K.; Panagiotou, A. D.; Paolone, V.; Pavin, M.; Petukhov, O.; Pistillo, C.; Płaneta, R.; Popov, B. A.; Posiadała-Zezula, M.; Puławski, S.; Puzović, J.; Rauch, W.; Ravonel, M.; Redij, A.; Renfordt, R.; Richter-Wąs, E.; Robert, A.; Röhrich, D.; Rondio, E.; Roth, M.; Rubbia, A.; Rumberger, B. T.; Rustamov, A.; Rybczynski, M.; Sadovsky, A.; Sakashita, K.; Sarnecki, R.; Schmidt, K.; Sekiguchi, T.; Selyuzhenkov, I.; Seryakov, A.; Seyboth, P.; Sgalaberna, D.; Shibata, M.; Słodkowski, M.; Staszel, P.; Stefanek, G.; Stepaniak, J.; Ströbele, H.; Šuša, T.; Szuba, M.; Tada, M.; Taranenko, A.; Tefelska, A.; Tefelski, D.; Tereshchenko, V.; Tsenov, R.; Turko, L.; Ulrich, R.; Unger, M.; Vassiliou, M.; Veberič, D.; Vechernin, V. V.; Vesztergombi, G.; Vinogradov, L.; Wilczek, A.; Włodarczyk, Z.; Wojtaszek-Szwarc, A.; Wyszyński, O.; Yarritu, K.; Zambelli, L.; Zimmerman, E. D.; Friend, M.; Galymov, V.; Hartz, M.; Hiraki, T.; Ichikawa, A.; Kubo, H.; Matsuoka, K.; Murakami, A.; Nakaya, T.; Suzuki, K.; Tzanov, M.; Yu, M.
2016-11-01
Measurements of particle emission from a replica of the T2K 90 cm-long carbon target were performed in the NA61/SHINE experiment at the CERN SPS, using data collected during a high-statistics run in 2009. An efficient use of the long-target measurements for neutrino flux predictions in T2K requires dedicated reconstruction and analysis techniques. Fully corrected differential yields of π±-mesons from the surface of the T2K replica target for incoming 31 GeV/c protons are presented. A possible strategy to implement these results into the T2K neutrino beam predictions is discussed, and the propagation of the uncertainties of these results to the final neutrino flux is performed.
Statistical analysis and yield management in LED design through TCAD device simulation
NASA Astrophysics Data System (ADS)
Létay, Gergö; Ng, Wei-Choon; Schneider, Lutz; Bregy, Adrian; Pfeiffer, Michael
2007-02-01
This paper illustrates how technology computer-aided design (TCAD), which nowadays is an essential part of CMOS technology, can be applied to LED development and manufacturing. In the first part, the essential electrical and optical models inherent to LED modeling are reviewed. The second part of the work describes a methodology to improve the efficiency of the simulation procedure by using the concept of process compact models (PCMs). The last part demonstrates the capabilities of PCMs using the example of a blue InGaN LED. In particular, a parameter screening is performed to find the most important parameters, an optimization task incorporating the robustness of the design is carried out, and finally the impact of manufacturing tolerances on yield is investigated. It is indicated how the concept of PCMs can contribute to efficient design-for-manufacturing (DFM)-aware development.
Study of B to pi l nu and B to rho l nu Decays and Determination of |V_ub|
DOE Office of Scientific and Technical Information (OSTI.GOV)
del Amo Sanchez, P.; Lees, J.P.; Poireau, V.
2011-12-09
We present an analysis of exclusive charmless semileptonic B-meson decays based on 377 million BB̄ pairs recorded with the BABAR detector at the Υ(4S) resonance. We select four event samples corresponding to the decay modes B⁰ → π⁻ℓ⁺ν, B⁺ → π⁰ℓ⁺ν, B⁰ → ρ⁻ℓ⁺ν, and B⁺ → ρ⁰ℓ⁺ν, and find the measured branching fractions to be consistent with isospin symmetry. Assuming isospin symmetry, we combine the two B → πℓν samples, and similarly the two B → ρℓν samples, and measure the branching fractions B(B⁰ → π⁻ℓ⁺ν) = (1.41 ± 0.05 ± 0.07) × 10⁻⁴ and B(B⁰ → ρ⁻ℓ⁺ν) = (1.75 ± 0.15 ± 0.27) × 10⁻⁴, where the errors are statistical and systematic. We compare the measured distribution in q², the momentum transfer squared, with predictions for the form factors from QCD calculations and determine the CKM matrix element |V_ub|. Based on the measured partial branching fraction for B → πℓν in the range q² < 12 GeV² and the most recent LCSR calculations we obtain |V_ub| = (3.78 ± 0.13 +0.55/−0.40) × 10⁻³, where the errors refer to the experimental and theoretical uncertainties. From a simultaneous fit to the data over the full q² range and the FNAL/MILC lattice QCD results, we obtain |V_ub| = (2.95 ± 0.31) × 10⁻³ from B → πℓν, where the error is the combined experimental and theoretical uncertainty.
Leng, Guoyong
2017-12-15
Temperature is known to be correlated with crop yields, causing reductions in crop yield with climate warming in the absence of adaptations or CO2 fertilization effects. The historical temperature-crop yield relation has often been used to inform future changes. This relationship, however, may change over time following alterations in other environmental factors. Results show that the strength of the relationship between the interannual variability of growing season temperature and corn yield (R_GST_CY) declined in the United States between 1980 and 2010, with a loss of statistical significance. The regression slope, which represents the anomaly in corn yield that occurs in association with a 1-degree temperature anomaly, decreased significantly from -6.9%/K in the first half of the period to -2.4%/K to -3.5%/K in the second half. This implies that projected corn yield reduction will be overestimated by a factor of 2 in a given warming scenario if the corn-temperature relation is derived from the earlier historical period. Changes in R_GST_CY are mainly observed in the Midwest Corn Belt and central High Plains, but are only partly reproduced by 11 process-based crop models. In Midwest rain-fed systems, the decrease of negative temperature effects coincides with an increase in water availability from precipitation. In irrigated areas where water stress is minimized, the decline of beneficial temperature effects is significantly related to the increase in extreme hot days. The results indicate that an extrapolation of the historical yield response to temperature may bias assessments of agricultural vulnerability to climate change. Efforts to reduce climate impacts on agriculture should pay attention not only to climate change, but also to changes in climate-crop yield relations. Some caveats should be acknowledged, as the analysis is restricted to changes in the linear relation between growing season mean temperature and corn yield for the specific study period. Copyright © 2017 Elsevier B.V. All rights reserved.
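A weakening temperature-yield relation of this kind can be illustrated by re-estimating the regression slope in moving windows. A sketch with synthetic anomaly series (the county-level yield and temperature data are not reproduced here):

```python
# Sketch of estimating the temperature-yield slope in moving windows to detect
# a weakening relationship; series are synthetic anomalies, not the US data.
import numpy as np
from scipy.stats import linregress

rng = np.random.default_rng(2)
years = np.arange(1980, 2011)
gst = rng.normal(0, 1, years.size)                # growing-season temperature anomaly (K)
slope_true = np.linspace(-6.9, -3.0, years.size)  # sensitivity weakening by construction (%/K)
yield_anom = slope_true * gst + rng.normal(0, 2, years.size)

window = 15
for start in range(0, years.size - window + 1, 5):
    sl = slice(start, start + window)
    fit = linregress(gst[sl], yield_anom[sl])
    print(years[sl][0], years[sl][-1], round(fit.slope, 2), round(fit.pvalue, 3))
```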
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-20
... and opinions, but are not statistical surveys that yield quantitative results that can be generalized... generic clearance for qualitative information will not be used for quantitative information collections... for submission for other generic mechanisms that are designed to yield quantitative results. The...
Statistical modeling of yield and variance instability in conventional and organic cropping systems
USDA-ARS?s Scientific Manuscript database
Cropping systems research was undertaken to address declining crop diversity and verify competitiveness of alternatives to the predominant conventional cropping system in the northern Corn Belt. To understand and capitalize on temporal yield variability within corn and soybean fields, we quantified ...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-13
... insights on perceptions and opinions, but are not statistical surveys that yield quantitative results that.... This type of generic clearance for qualitative information will not be used for quantitative... for submission for other generic mechanisms that are designed to yield quantitative results. The...
Ebshish, Ali; Yaakob, Zahira; Taufiq-Yap, Yun Hin; Bshish, Ahmed
2014-01-01
In this work, a response surface methodology (RSM) was implemented to investigate the process variables in a hydrogen production system. The effects of five independent variables, namely the temperature (X1), the flow rate (X2), the catalyst weight (X3), the catalyst loading (X4) and the glycerol-water molar ratio (X5), on the H2 yield (Y1) and the conversion of glycerol to gaseous products (Y2) were explored. Using multiple regression analysis, the experimental results for the H2 yield and the glycerol conversion to gases were fit to quadratic polynomial models. The proposed mathematical models correlated the dependent factors well within the limits examined. The best values of the process variables were a temperature of approximately 600 °C, a feed flow rate of 0.05 mL/min, a catalyst weight of 0.2 g, a catalyst loading of 20% and a glycerol-water molar ratio of approximately 12, where the H2 yield was predicted to be 57.6% and the conversion of glycerol was predicted to be 75%. To validate the proposed models, statistical analysis using a two-sample t-test was performed, and the results showed that the models could predict the responses satisfactorily within the limits of the variables studied. PMID:28788567
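The core of the RSM step is an ordinary least-squares fit of a quadratic polynomial in the coded factors. A sketch for two of the five factors, with invented data standing in for the experimental design:

```python
# Sketch of fitting a quadratic response-surface model by least squares for two
# hypothetical coded factors (temperature X1, molar ratio X5); data are made up.
import numpy as np

rng = np.random.default_rng(3)
x1 = rng.uniform(-1, 1, 30)   # coded temperature
x5 = rng.uniform(-1, 1, 30)   # coded glycerol-water molar ratio
y = 55 + 4*x1 + 6*x5 - 3*x1**2 - 5*x5**2 + 2*x1*x5 + rng.normal(0, 1, 30)

# Design matrix: intercept, linear, quadratic, and interaction terms.
X = np.column_stack([np.ones_like(x1), x1, x5, x1**2, x5**2, x1*x5])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta)   # coefficients of the fitted quadratic polynomial model
```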
Intermediate quantum maps for quantum computation
NASA Astrophysics Data System (ADS)
Giraud, O.; Georgeot, B.
2005-10-01
We study quantum maps displaying spectral statistics intermediate between Poisson and Wigner-Dyson. It is shown that they can be simulated on a quantum computer with a small number of gates, and efficiently yield information about fidelity decay or spectral statistics. We study their matrix elements and entanglement production and show that they converge with time to distributions which differ from random matrix predictions. A randomized version of these maps can be implemented even more economically and yields pseudorandom operators with original properties, enabling, for example, one to produce fractal random vectors. These algorithms are within reach of present-day quantum computers.
Bowden, Peter; Beavis, Ron; Marshall, John
2009-11-02
A goodness of fit test may be used to assign tandem mass spectra of peptides to amino acid sequences and to directly calculate the expected probability of mis-identification. The product of the peptide expectation values directly yields the probability that the parent protein has been mis-identified. A relational database could capture the mass spectral data and the best-fit results, and permit subsequent calculations by a general statistical analysis system. The many files of the HUPO blood protein data correlated by X!TANDEM against the proteins of ENSEMBL were collected into a relational database. A redundant set of 247,077 proteins and peptides was correlated by X!TANDEM, and that was collapsed to a set of 34,956 peptides from 13,379 distinct proteins. About 6875 distinct proteins were represented by only a single distinct peptide, 2866 proteins showed 2 distinct peptides, and 3454 proteins showed at least three distinct peptides by X!TANDEM. More than 99% of the peptides were associated with proteins that had cumulative expectation values, i.e. probability of false positive identification, of one in one hundred or less. The distribution of peptides per protein from X!TANDEM was significantly different from that expected from random assignment of peptides.
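The protein-level error estimate described here is simply a product of peptide expectation values; a tiny sketch with hypothetical values:

```python
# Sketch of the protein-level calculation: the product of the peptide
# expectation values gives the probability that the parent protein has been
# mis-identified. The values below are hypothetical, not X!TANDEM output.
import numpy as np

peptide_expectation = np.array([1e-3, 5e-4, 2e-2])  # hypothetical per-peptide e-values
protein_expectation = np.prod(peptide_expectation)   # ~1e-8: very unlikely false positive
print(protein_expectation)
```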
Mieth, Bettina; Kloft, Marius; Rodríguez, Juan Antonio; Sonnenburg, Sören; Vobruba, Robin; Morcillo-Suárez, Carlos; Farré, Xavier; Marigorta, Urko M.; Fehr, Ernst; Dickhaus, Thorsten; Blanchard, Gilles; Schunk, Daniel; Navarro, Arcadi; Müller, Klaus-Robert
2016-01-01
The standard approach to the analysis of genome-wide association studies (GWAS) is based on testing each position in the genome individually for statistical significance of its association with the phenotype under investigation. To improve the analysis of GWAS, we propose a combination of machine learning and statistical testing that takes correlation structures within the set of SNPs under investigation in a mathematically well-controlled manner into account. The novel two-step algorithm, COMBI, first trains a support vector machine to determine a subset of candidate SNPs and then performs hypothesis tests for these SNPs together with an adequate threshold correction. Applying COMBI to data from a WTCCC study (2007) and measuring performance as replication by independent GWAS published within the 2008–2015 period, we show that our method outperforms ordinary raw p-value thresholding as well as other state-of-the-art methods. COMBI presents higher power and precision than the examined alternatives while yielding fewer false (i.e. non-replicated) and more true (i.e. replicated) discoveries when its results are validated on later GWAS studies. More than 80% of the discoveries made by COMBI upon WTCCC data have been validated by independent studies. Implementations of the COMBI method are available as a part of the GWASpi toolbox 2.0. PMID:27892471
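The two-step structure of COMBI can be sketched as follows; this is a toy version with simulated genotypes, a linear SVM for screening, and chi-square association tests with Bonferroni correction over the screened subset (the real method's screening and threshold-calibration details differ):

```python
# Toy sketch of the two-step COMBI idea: screen SNPs with a linear SVM, then
# test only the screened subset; data, sizes and thresholds are illustrative.
import numpy as np
from scipy.stats import chi2_contingency
from sklearn.svm import LinearSVC

rng = np.random.default_rng(4)
n, p, k = 400, 1000, 20                        # samples, SNPs, SNPs kept after screening
X = rng.integers(0, 3, (n, p)).astype(float)   # genotypes coded 0/1/2
y = (X[:, 0] + X[:, 1] + rng.normal(0, 2, n) > 2).astype(int)  # SNPs 0,1 are causal

# Step 1: rank SNPs by the magnitude of the SVM weight vector.
svm = LinearSVC(C=0.1, max_iter=5000).fit(X, y)
candidates = np.argsort(-np.abs(svm.coef_[0]))[:k]

# Step 2: association tests on candidates only, Bonferroni over the k tests.
for j in candidates:
    table = np.array([[np.sum((X[:, j] == g) & (y == c)) for g in range(3)]
                      for c in range(2)])
    pval = chi2_contingency(table)[1]
    if pval < 0.05 / k:
        print(f"SNP {j}: p = {pval:.2e} (significant after correction)")
```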
Drew, L.J.; Schuenemeyer, J.H.; Armstrong, T.R.; Sutphin, D.M.
2001-01-01
A model is proposed to explain the statistical relations between the mean initial water well yields from eight time increments from 1984 to 1998 for wells drilled into the crystalline bedrock aquifer system in the Pinardville area of southern New Hampshire and the type of bedrock, mean well depth, and mean well elevation. Statistical analyses show that the mean total yield of drilling increments is positively correlated with mean total well depth and mean well elevation. In addition, the mean total well yield varies with rock type from a minimum of 46.9 L/min (12.4 gpm) in the Damon Pond granite to a maximum of 74.5 L/min (19.7 gpm) in the Permian pegmatite and granite unit. Across the eight drilling increments that comprise 211 wells each, the percentages of very low-yield wells (1.9 L/min [0.5 gpm] or less) and high-yield wells (151.4 L/min [40 gpm] or more) increased, and those of intermediate-yield wells decreased. As housing development progressed during the 1984 to 1998 interval, the mean depth of the wells and their elevations increased, and the mix of percentages of the bedrock types drilled changed markedly. The proposed model uses a feed-forward mechanism to explain the interaction between the increasing mean elevation, mean well depth, and percentages of very low-yielding wells and the mean well yield. The increasing percentages of very low-yielding wells through time and the economics of the housing market may control the system that forces the mean well depths, percentages of high-yield wells, and mean well yields to increase. The reason for the increasing percentages of very low-yield wells is uncertain, but the explanation is believed to involve the complex structural geology and tectonic history of the Pinardville quadrangle.
Estimating short-run and long-run interaction mechanisms in interictal state.
Ozkaya, Ata; Korürek, Mehmet
2010-04-01
We address the issue of analyzing electroencephalogram (EEG) recordings from seizure patients in order to test, model and determine the statistical properties that distinguish between EEG states (interictal, pre-ictal, ictal) by introducing a new class of time series analysis methods. In the present study, we first employ statistical methods to determine the non-stationary behavior of focal interictal epileptiform series within very short time intervals; second, for such intervals that are deemed non-stationary, we suggest the concept of Autoregressive Integrated Moving Average (ARIMA) process modelling, well known in time series analysis. We finally address the question of causal relationships between epileptic states and between brain areas during epileptiform activity. We estimate the interaction between different EEG series (channels) in short time intervals by performing Granger-causality analysis, and estimate such interaction in long time intervals by employing cointegration analysis; both methods are well known in econometrics. Here we find, first, that the causal relationship between neuronal assemblies can be identified according to the duration and the direction of their possible mutual influences; second, that although the estimated bidirectional causality in short time intervals indicates that the neuronal ensembles positively affect each other, in long time intervals neither of them is affected (increasing amplitudes) by this relationship. Moreover, cointegration analysis of the EEG series enables us to identify whether there is a causal link from the interictal state to the ictal state.
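Both econometric tools mentioned here are available in statsmodels. A sketch on synthetic stand-ins for two channels, testing short-run influence on the stationary differences and a long-run equilibrium via the Engle-Granger cointegration test:

```python
# Sketch of short-run (Granger causality) and long-run (cointegration) tests;
# the two series are synthetic stand-ins for EEG channels, not patient data.
import numpy as np
from statsmodels.tsa.stattools import coint, grangercausalitytests

rng = np.random.default_rng(5)
n = 500
x = np.cumsum(rng.normal(size=n))   # nonstationary "driver" channel
y = 0.8 * x + rng.normal(size=n)    # channel sharing a long-run equilibrium with x

# Short-run interaction: Granger causality on the (stationary) first differences.
dx, dy = np.diff(x), np.diff(y)
res = grangercausalitytests(np.column_stack([dy, dx]), maxlag=4, verbose=False)
p_granger = res[4][0]["ssr_ftest"][1]   # p-value of the F-test at lag 4

# Long-run interaction: Engle-Granger cointegration test on the levels.
t_stat, p_coint, _ = coint(y, x)
print(f"Granger p (lag 4): {p_granger:.4f}; cointegration p: {p_coint:.4g}")
```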
Singh, Yogendra; Srivastav, S K
2013-04-01
Over the past few decades, L-asparaginase has emerged as an excellent anti-neoplastic agent. In the present study, a new strain, ITBHU02, isolated from soil near degrading hospital waste, was investigated for the production of extracellular L-asparaginase. It was subsequently named Bacillus aryabhattai ITBHU02 based on its phenotypical features, biochemical characteristics, fatty acid methyl ester (FAME) profile and phylogenetic similarity of 16S rDNA sequences. The strain was found to be protease-deficient, and its optimal growth occurred at 37 °C and pH 7.5. The strain was capable of producing the enzyme L-asparaginase with a maximum specific activity of 3.02 ± 0.3 U/mg protein when grown with un-optimized medium composition and physical parameters. In order to improve the production of L-asparaginase by the isolate, response surface methodology (RSM) and genetic algorithm (GA) based techniques were implemented. The data obtained through the statistical design matrix were used for regression analysis and analysis of variance studies. Furthermore, the GA was implemented using the polynomial regression equation as a fitness function. A maximum average L-asparaginase productivity of 6.35 U/mg was found at GA-optimized concentrations of 4.07, 0.82, 4.91, and 5.2 g/L for KH2PO4, MgSO4·7H2O, L-asparagine, and glucose, respectively. The GA-optimized yield of the enzyme was 7.8% higher than the yield obtained through RSM-based optimization.
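The RSM-plus-GA step can be sketched with a toy genetic algorithm maximizing an assumed quadratic fitness surface standing in for the fitted regression equation (the study's actual polynomial coefficients are not reproduced here):

```python
# Toy GA sketch over medium components; the fitness function is an assumed
# stand-in for the paper's fitted RSM polynomial, peaking at the reported optimum.
import numpy as np

rng = np.random.default_rng(6)
lo = np.array([1.0, 0.1, 1.0, 1.0])   # assumed bounds: KH2PO4, MgSO4.7H2O, asparagine, glucose (g/L)
hi = np.array([6.0, 1.5, 6.0, 8.0])

def fitness(x):                        # hypothetical quadratic response surface
    opt = np.array([4.07, 0.82, 4.91, 5.2])
    return 6.35 - np.sum(((x - opt) / (hi - lo)) ** 2, axis=1)

pop = rng.uniform(lo, hi, (50, 4))
for _ in range(100):
    f = fitness(pop)
    parents = pop[np.argsort(-f)[:25]]                    # truncation selection
    mates = parents[rng.integers(0, 25, 25)]
    children = 0.5 * (parents + mates)                    # arithmetic crossover
    children += rng.normal(0, 0.05, children.shape) * (hi - lo)  # Gaussian mutation
    pop = np.clip(np.vstack([parents, children]), lo, hi)
print(pop[np.argmax(fitness(pop))])    # best medium composition found
```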
Pavlidis, Paul; Qin, Jie; Arango, Victoria; Mann, John J; Sibille, Etienne
2004-06-01
One of the challenges in the analysis of gene expression data is placing the results in the context of other data available about genes and their relationships to each other. Here, we approach this problem in the study of gene expression changes associated with age in two areas of the human prefrontal cortex, comparing two computational methods. The first method, "overrepresentation analysis" (ORA), is based on statistically evaluating the fraction of genes in a particular gene ontology class found among the set of genes showing age-related changes in expression. The second method, "functional class scoring" (FCS), examines the statistical distribution of individual gene scores among all genes in the gene ontology class and does not involve an initial gene selection step. We find that FCS yields more consistent results than ORA, and the results of ORA depended strongly on the gene selection threshold. Our findings highlight the utility of functional class scoring for the analysis of complex expression data sets and emphasize the advantage of considering all available genomic information rather than sets of genes that pass a predetermined "threshold of significance."
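The contrast between the two methods can be sketched directly: ORA reduces to a hypergeometric test on a thresholded gene list, while FCS tests the score distribution of the whole class with no selection step. A toy example with synthetic gene scores:

```python
# Sketch contrasting ORA (hypergeometric test on a thresholded list) with FCS
# (testing the score distribution of the whole class); scores are synthetic.
import numpy as np
from scipy.stats import hypergeom, mannwhitneyu

rng = np.random.default_rng(7)
n_genes, class_size = 10_000, 100
scores = rng.normal(0, 1, n_genes)
scores[:class_size] += 0.4            # the GO class is modestly age-associated

# ORA: pick "changed" genes by threshold, count overlap with the class.
changed = np.argsort(-scores)[:500]
k = np.sum(changed < class_size)
p_ora = hypergeom.sf(k - 1, n_genes, class_size, 500)

# FCS: compare in-class scores against all other genes, no threshold needed.
p_fcs = mannwhitneyu(scores[:class_size], scores[class_size:],
                     alternative="greater").pvalue
print(f"ORA p = {p_ora:.3g}, FCS p = {p_fcs:.3g}")
```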
78 FR 2370 - New England Fishery Management Council (NEFMC); Public Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-11
... p.m. to address employment matters. Tuesday, January 29, 2013 Following introductions and any... catch based on Scientific and Statistical Committee advice, management uncertainty, optimum yield and a...'s Scientific and Statistical Committee will report on its acceptable biological catch...
The Influence of Liquids on the Mechanical Properties of Allografts in Bone Impaction Grafting.
Putzer, David; Ammann, Christoph Gert; Coraça-Huber, Débora; Lechner, Ricarda; Schmölz, Werner; Nogler, Michael
2017-10-01
Allografts are used to compensate for bone defects resulting from revision surgery, tumor surgery, and reconstructive bone surgery. Although it is well known that reducing the fat content of allografts improves their mechanical properties, the effect of liquid content on allografts with a known grain size distribution has not been assessed so far. The aim of the study was to compare the mechanical properties of dried allografts (DA) with allografts mixed with a saline solution (ASS) and with allografts mixed with blood (AB), all having a similar grain size distribution. Fresh-frozen morselized bone chips were cleaned chemically, sieved, and reassembled in specific portions with a known grain size distribution. Uniaxial compression was used to assess the yield limit, initial density, density at yield limit, and flowability of the three groups before and after compaction with a fall hammer apparatus. No statistically significant difference could be found for the yield limit between DA and ASS (p = 0.339) or between ASS and AB (p = 0.554). DA showed a statistically significantly higher yield limit than AB (p = 0.022). Excluding the effect of the grain size distribution on the mechanical properties, it was shown that allografts have a lower yield limit when lipids are present. The liquid content of allografts seems to play a minor role, as no statistically significant difference could be found between DA and ASS. It is suggested, in accordance with other studies, to chemically clean allografts before implantation to reduce the contamination risk and the fat content.
Ozaki, Vitor A.; Ghosh, Sujit K.; Goodwin, Barry K.; Shirota, Ricardo
2009-01-01
This article presents a statistical model of agricultural yield data based on a set of hierarchical Bayesian models that allows joint modeling of temporal and spatial autocorrelation. This method captures a comprehensive range of the various uncertainties involved in predicting crop insurance premium rates as opposed to the more traditional ad hoc, two-stage methods that are typically based on independent estimation and prediction. A panel data set of county-average yield data was analyzed for 290 counties in the State of Paraná (Brazil) for the period of 1990 through 2002. Posterior predictive criteria are used to evaluate different model specifications. This article provides substantial improvements in the statistical and actuarial methods often applied to the calculation of insurance premium rates. These improvements are especially relevant to situations where data are limited. PMID:19890450
Data Analysis and Statistical Methods for the Assessment and Interpretation of Geochronologic Data
NASA Astrophysics Data System (ADS)
Reno, B. L.; Brown, M.; Piccoli, P. M.
2007-12-01
Ages are traditionally reported as a weighted mean with an uncertainty based on least squares analysis of the analytical error on individual dates. This method does not take into account geological uncertainties and cannot accommodate asymmetries in the data. In most instances, this method will understate the uncertainty on a given age, which may lead to over-interpretation of age data. Geologic uncertainty is difficult to quantify, but is typically greater than analytical uncertainty. These factors make traditional statistical approaches inadequate to fully evaluate geochronologic data. We propose a protocol to assess populations within multi-event datasets and to calculate age and uncertainty from each population of dates interpreted to represent a single geologic event using robust and resistant statistical methods. To assess whether populations thought to represent different events are statistically separate, exploratory data analysis is undertaken using a box plot, where the range of the data is represented by a 'box' of length given by the interquartile range, divided at the median of the data, with 'whiskers' that extend to the furthest datapoint that lies within 1.5 times the interquartile range beyond the box. If the boxes representing the populations do not overlap, they are interpreted to represent statistically different sets of dates. Ages are calculated from statistically distinct populations using a robust tool such as the tanh method of Kelsey et al. (2003, CMP, 146, 326-340), which is insensitive to any assumptions about the underlying probability distribution from which the data are drawn. This method therefore takes into account the full range of the data and is not drastically affected by outliers. The interquartile range of each population of dates gives a first pass at expressing uncertainty, which accommodates asymmetry in the dataset; outliers have a minor effect on the uncertainty. To better quantify the uncertainty, a resistant tool that is insensitive to local misbehavior of data is preferred, such as the normalized median absolute deviation proposed by Powell et al. (2002, Chem Geol, 185, 191-204). We illustrate the method using a dataset of 152 monazite dates determined using EPMA chemical data from a single sample from the Neoproterozoic Brasília Belt, Brazil. Results are compared with ages and uncertainties calculated using traditional methods to demonstrate the differences. The dataset was manually culled into three populations representing discrete compositional domains within chemically-zoned monazite grains. The weighted mean ages and least squares uncertainties for these populations are 633±6 (2σ) Ma for a core domain, 614±5 (2σ) Ma for an intermediate domain and 595±6 (2σ) Ma for a rim domain. Probability distribution plots indicate asymmetric distributions for all populations, which cannot be accounted for with traditional statistical tools. The three domains record distinct, non-overlapping interquartile ranges: the core domain lies in the subrange 642-624 Ma, the intermediate domain 617-609 Ma and the rim domain 606-589 Ma. The tanh estimator yields ages of 631±7 (2σ) Ma for the core domain, 616±7 (2σ) Ma for the intermediate domain and 601±8 (2σ) Ma for the rim domain.
Whereas the uncertainties derived using a resistant statistical tool are larger than those derived from traditional statistical tools, the method yields more realistic uncertainties that better address the spread in the dataset and account for asymmetry in the data.
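The robust summaries advocated here are straightforward to compute. A sketch for a small hypothetical population of monazite dates (not the study's 152-date dataset):

```python
# Sketch of the robust summaries discussed above: median, interquartile range,
# and the normalized median absolute deviation (nMAD) of a population of dates.
import numpy as np

dates = np.array([642, 638, 635, 633, 631, 630, 628, 626, 624, 655.0])  # Ma, hypothetical

median = np.median(dates)
q1, q3 = np.percentile(dates, [25, 75])
mad = np.median(np.abs(dates - median))
nmad = 1.4826 * mad   # scaled to be consistent with 1 sigma for normal data

print(f"median = {median} Ma, IQR = [{q1}, {q3}] Ma, nMAD = {nmad:.1f} Ma")
```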
Reynolds number dependence of relative dispersion statistics in isotropic turbulence
NASA Astrophysics Data System (ADS)
Sawford, Brian L.; Yeung, P. K.; Hackl, Jason F.
2008-06-01
Direct numerical simulation results for a range of relative dispersion statistics over Taylor-scale Reynolds numbers up to 650 are presented in an attempt to observe and quantify inertial subrange scaling and, in particular, Richardson's t³ law. The analysis includes the mean-square separation and a range of important but less-studied differential statistics for which the motion is defined relative to that at time t = 0. It seeks to unambiguously identify and quantify the Richardson scaling by demonstrating convergence with both Reynolds number and initial separation. According to these criteria, the standard compensated plots for these statistics in inertial subrange scaling show clear evidence of a Richardson range, but with an imprecise estimate for the Richardson constant. A modified version of the cube-root plots introduced by Ott and Mann [J. Fluid Mech. 422, 207 (2000)] confirms such convergence. It has been used to yield more precise estimates for Richardson's constant g, which decrease with Taylor-scale Reynolds number over the range 140-650. Extrapolation to the large Reynolds number limit gives an asymptotic value for Richardson's constant in the range g = 0.55-0.57, depending on the functional form used to make the extrapolation.
What Your Yield Says about You
ERIC Educational Resources Information Center
Hoover, Eric
2009-01-01
The recession has turned Americans into numbers addicts. Seemingly endless supplies of statistics--stock prices, retail sales, and the gross domestic product--offer various views about the health of the nation's economy. Higher education has its own economic indicators. Among the most important is "yield," the percentage of admitted students who…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-04
... provides useful insights on perceptions and opinions, but are not statistical surveys that yield quantitative results that can be generalized to the population of study. This feedback will provide insights... used for quantitative information collections that are designed to yield reliably actionable results...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-11
... target population to which generalizations will be made, the sampling frame, the sample design (including... for submission for other generic mechanisms that are designed to yield quantitative results. The MSPB... insights on perceptions and opinions, but are not statistical surveys that yield quantitative results that...
Chevance, Aurélie; Schuster, Tibor; Steele, Russell; Ternès, Nils; Platt, Robert W
2015-10-01
Robustness of an existing meta-analysis can justify decisions on whether to conduct an additional study addressing the same research question. We illustrate the graphical assessment of the potential impact of an additional study on an existing meta-analysis using published data on statin use and the risk of acute kidney injury. A previously proposed graphical augmentation approach is used to assess the sensitivity of the current test and heterogeneity statistics extracted from existing meta-analysis data. In addition, we extended the graphical augmentation approach to assess potential changes in the pooled effect estimate after updating a current meta-analysis, and applied the three graphical contour definitions to data from meta-analyses on statin use and acute kidney injury risk. In the example data considered, the pooled effect estimates and heterogeneity indices proved considerably robust to the addition of a future study. Notably, for some previously inconclusive meta-analyses, a study update might yield a statistically significant increase in kidney injury risk associated with higher statin exposure. The illustrated contour approach should become a standard tool for assessing the robustness of meta-analyses. It can guide decisions on whether to conduct additional studies addressing a relevant research question. Copyright © 2015 Elsevier Inc. All rights reserved.
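The underlying computation, re-pooling a fixed-effect meta-analysis after appending a candidate future study, can be sketched as follows; the effect sizes are invented, not the statin data:

```python
# Sketch of probing meta-analysis robustness to one added study, using
# inverse-variance fixed-effect pooling on the log relative-risk scale.
import numpy as np
from scipy.stats import norm

log_rr = np.array([0.10, 0.25, 0.05, 0.18])   # hypothetical existing studies
se = np.array([0.12, 0.15, 0.20, 0.10])

def pool(log_rr, se):
    w = 1 / se**2
    est = np.sum(w * log_rr) / np.sum(w)      # pooled log relative risk
    se_pool = np.sqrt(1 / np.sum(w))
    p = 2 * norm.sf(abs(est / se_pool))       # two-sided z-test
    return est, p

print(pool(log_rr, se))
# Augment with a plausible future study and see whether conclusions change.
print(pool(np.append(log_rr, 0.20), np.append(se, 0.08)))
```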
Yuan, Ke-Hai; Jiang, Ge; Cheng, Ying
2017-11-01
Data in psychology are often collected using Likert-type scales, and it has been shown that factor analysis of Likert-type data is better performed on the polychoric correlation matrix than on the product-moment covariance matrix, especially when the distributions of the observed variables are skewed. In theory, factor analysis of the polychoric correlation matrix is best conducted using generalized least squares with an asymptotically correct weight matrix (AGLS). However, simulation studies showed that both least squares (LS) and diagonally weighted least squares (DWLS) perform better than AGLS, and thus LS or DWLS is routinely used in practice. In either LS or DWLS, the associations among the polychoric correlation coefficients are completely ignored. To mend such a gap between statistical theory and empirical work, this paper proposes new methods, called ridge GLS, for factor analysis of ordinal data. Monte Carlo results show that, for a wide range of sample sizes, ridge GLS methods yield uniformly more accurate parameter estimates than existing methods (LS, DWLS, AGLS). A real-data example indicates that estimates by ridge GLS are 9-20% more efficient than those by existing methods. Rescaled and adjusted test statistics as well as sandwich-type standard errors following the ridge GLS methods also perform reasonably well. © 2017 The British Psychological Society.
Brennan, Sandra B; Corben, Adriana; Liberman, Laura; Dershaw, D David; Brogi, Edi; Van Zee, Kimberly J; Morris, Elizabeth
2012-10-01
The objective of our study was to determine the frequency of cancer at surgery in breast lesions yielding papilloma at MRI-guided 9-gauge vacuum-assisted biopsy (VAB) and to determine whether any features are associated with cancer upgrade. For this study, 1487 MRI-guided vacuum-assisted biopsies performed from January 2004 to March 2011 were reviewed. Lesions yielding papilloma were identified and classified as papilloma with or without atypia. Surgical findings were reviewed to determine the cancer rate. Statistical analysis was performed and 95% CIs were calculated. Papilloma was identified in 75 of the 1487 MRI-guided vacuum-assisted biopsies (5%). These 75 papillomas occurred in 73 women with a median age of 49 years (age range, 27-70 years). Of the 75 papillomas, 25 (33%) had atypia and 50 (67%) did not on core needle biopsy. Subsequent surgery of 67 of the 75 papillomas (89%) yielded ductal carcinoma in situ (DCIS) in four (6%; 95% CI, 2-15%). Surgery yielded DCIS in two of 23 papillomas with atypia (9%; 95% CI, 1-28%) at MRI-guided VAB and in two of 44 papillomas without atypia (5%; 95% CI, 0.4-16%) at MRI-guided VAB; these cancer rates did not differ significantly (p=0.6). Postmenopausal status (p=0.04) and histologic size of less than 0.2 cm (p=0.04) had a significant association with the cancer upgrade rate. Papilloma with or without atypia was found in 5% of patients who underwent MRI-guided VAB during the study period. Surgery revealed cancer in 6%. DCIS was found at surgery in 9% of lesions yielding papilloma with atypia versus 5% of lesions yielding papilloma without atypia. For lesions yielding papilloma with or without atypia at MRI-guided VAB, surgical excision is warranted.
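The intervals quoted above are consistent with exact binomial (Clopper-Pearson) confidence intervals; a sketch reproducing them in kind, where the counts follow the abstract and the choice of statsmodels is ours:

```python
# Sketch of exact (Clopper-Pearson) 95% CIs for the cancer-upgrade rates;
# counts mirror those quoted in the abstract above.
from statsmodels.stats.proportion import proportion_confint

for label, k, n in [("all surgically excised", 4, 67),
                    ("papilloma with atypia", 2, 23),
                    ("papilloma without atypia", 2, 44)]:
    lo, hi = proportion_confint(k, n, alpha=0.05, method="beta")  # "beta" = exact
    print(f"{label}: {k}/{n} = {k/n:.1%}, 95% CI [{lo:.1%}, {hi:.1%}]")
```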
Motorist actions at a crosswalk with an in-pavement flashing light system.
Karkee, Ganesh J; Nambisan, Shashi S; Pulugurtha, Srinivas S
2010-12-01
An in-pavement flashing light system is used at crosswalks to alert motorists and pedestrians to possible conflicts and to influence their behavior to enhance safety. The relative behaviors of drivers and pedestrians affect safety. An evaluation of motorist behavior at a pedestrian crosswalk with an in-pavement flashing light system is presented in this manuscript. Field observations provide the basis to evaluate motorist behavior at a crosswalk with an in-pavement flashing light system. Outcomes of pedestrian and motorist actions were observed to quantify measures of effectiveness (MOEs) such as the yielding behavior of motorists, vehicle speeds, and yielding distance from the crosswalk. A before-and-after study design was used: the before condition was prior to the activation of the in-pavement flashing light system, and the after condition followed its activation. The study was conducted on a relatively low-volume roadway located in Henderson, Nevada. The significance of the differences in the MOEs between the 2 study periods was evaluated using statistical analysis tools such as a one-tailed test for proportions and the Welch-Satterthwaite t-test. The results show that the installation of the in-pavement flashing light system increased the yielding behavior of motorists significantly (P < 0.001). Vehicular speeds decreased when pedestrians were waiting at the curb to cross and when they were crossing (P < 0.001). Motorists yielded to pedestrians on average about 3 m (∼10 feet) upstream from the yield markings, and the yielding distances were consistent in both directions. The in-pavement flashing light system is thus seen to be effective in improving motorists' yielding behavior, and vehicle speeds were also observed to decrease in the presence of pedestrians.
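The two statistical tools named here are standard; a sketch with invented before/after counts and speeds (not the Henderson field data):

```python
# Sketch of the before/after comparisons: a one-tailed two-proportion z-test
# for the increase in yielding, and a Welch t-test for speeds; data invented.
import numpy as np
from scipy.stats import ttest_ind
from statsmodels.stats.proportion import proportions_ztest

yielded = np.array([95, 62])     # motorists yielding: after vs before
observed = np.array([180, 175])  # crossing events observed in each period
stat, p = proportions_ztest(yielded, observed, alternative="larger")
print(f"one-tailed proportions test: z = {stat:.2f}, p = {p:.4f}")

rng = np.random.default_rng(8)
speed_before = rng.normal(42, 6, 120)   # km/h, hypothetical
speed_after = rng.normal(38, 7, 130)
t, p = ttest_ind(speed_before, speed_after, equal_var=False)  # Welch-Satterthwaite
print(f"Welch t-test: t = {t:.2f}, p = {p:.4f}")
```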
Statistics of Data Fitting: Flaws and Fixes of Polynomial Analysis of Channeled Spectra
NASA Astrophysics Data System (ADS)
Karstens, William; Smith, David
2013-03-01
Starting from general statistical principles, we have critically examined Baumeister's procedure* for determining the refractive index of thin films from channeled spectra. Briefly, the method assumes that the index and interference fringe order may be approximated by polynomials quadratic and cubic in photon energy, respectively. The coefficients of the polynomials are related by differentiation, which is equivalent to comparing energy differences between fringes. However, we find that when the fringe order is calculated from the published IR index for silicon* and then analyzed with Baumeister's procedure, the results do not reproduce the original index. This problem has been traced to 1. Use of unphysical powers in the polynomials (e.g., time-reversal invariance requires that the index is an even function of photon energy), and 2. Use of insufficient terms of the correct parity. Exclusion of unphysical terms and addition of quartic and quintic terms to the index and order polynomials yields significantly better fits with fewer parameters. This represents a specific example of using statistics to determine if the assumed fitting model adequately captures the physics contained in experimental data. The use of analysis of variance (ANOVA) and the Durbin-Watson statistic to test criteria for the validity of least-squares fitting will be discussed. *D.F. Edwards and E. Ochoa, Appl. Opt. 19, 4130 (1980). Supported in part by the US Department of Energy, Office of Nuclear Physics under contract DE-AC02-06CH11357.
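A sketch of the proposed fix, restricting the index polynomial to even powers of photon energy (the parity constraint argued for above) and checking residual autocorrelation with the Durbin-Watson statistic; the "measured" index curve is synthetic:

```python
# Sketch: fit an index model using even powers of photon energy only, then
# compute the Durbin-Watson statistic of the residuals; the data are synthetic.
import numpy as np

rng = np.random.default_rng(9)
E = np.linspace(0.5, 3.0, 60)                    # photon energy (eV)
n_meas = 3.42 + 0.05 * E**2 + 0.004 * E**4 + rng.normal(0, 1e-3, E.size)

X = np.column_stack([E**0, E**2, E**4])          # even powers only (parity constraint)
coef, *_ = np.linalg.lstsq(X, n_meas, rcond=None)
resid = n_meas - X @ coef

dw = np.sum(np.diff(resid)**2) / np.sum(resid**2)  # ~2 means uncorrelated residuals
print(coef, round(dw, 2))
```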
Sun, Jin; Rutkoski, Jessica E; Poland, Jesse A; Crossa, José; Jannink, Jean-Luc; Sorrells, Mark E
2017-07-01
High-throughput phenotyping (HTP) platforms can be used to measure traits that are genetically correlated with wheat (Triticum aestivum L.) grain yield across time. Incorporating such secondary traits in multivariate pedigree and genomic prediction models would be desirable to improve indirect selection for grain yield. In this study, we evaluated three statistical models, simple repeatability (SR), multitrait (MT), and random regression (RR), for the longitudinal data of secondary traits and compared the impact of the proposed models for secondary traits on their predictive abilities for grain yield. Grain yield and secondary traits, canopy temperature (CT) and normalized difference vegetation index (NDVI), were collected in five diverse environments for 557 wheat lines with available pedigree and genomic information. A two-stage analysis was applied for pedigree and genomic selection (GS). First, secondary traits were fitted by SR, MT, or RR models, separately, within each environment. Then, best linear unbiased predictions (BLUPs) of secondary traits from the above models were used in the multivariate prediction models to compare predictive abilities for grain yield. Predictive ability was substantially improved, by 70% on average, for multivariate pedigree and genomic models when secondary traits were included in both training and test populations. Additionally, (i) predictive abilities varied only slightly between the MT, RR, and SR models in this data set, (ii) results indicated that including BLUPs of secondary traits from the MT model was best under severe drought, and (iii) the RR model was slightly better than the SR and MT models under drought environments. Copyright © 2017 Crop Science Society of America.
Delving deeper into technological innovations to understand differences in rice quality.
Calingacion, Mariafe; Fang, Lu; Quiatchon-Baeza, Lenie; Mumm, Roland; Riedel, Arthur; Hall, Robert D; Fitzgerald, Melissa
2015-12-01
Increasing demand for better quality rice varieties, which are also more suited to growth under sub-optimal cultivation conditions, is driving innovation in rice research. Here we have used a multi-disciplinary approach, involving SNP-based genotyping together with phenotyping based on yield analysis, metabolomic analysis of grain volatiles, and sensory panel analysis to determine differences between two contrasting rice varieties, Apo and IR64. Plants were grown under standard and drought-induced conditions. Results revealed important differences between the volatile profiles of the two rice varieties and we relate these differences to those perceived by the sensory panel. Apo, which is the more drought tolerant variety, was less affected by the drought condition concerning both sensory profile and yield; IR64, which has higher quality but is drought sensitive, showed greater differences in these characteristics in response to the two growth conditions. Metabolomics analyses using GCxGC-MS, followed by multivariate statistical analyses of the data, revealed a number of discriminatory compounds between the varieties, but also effects of the difference in cultivation conditions. Results indicate the complexity of rice volatile profile, even of non-aromatic varieties, and how metabolomics can be used to help link changes in aroma profile with the sensory phenotype. Our outcomes also suggest valuable multi-disciplinary approaches which can be used to help define the aroma profile in rice, and its underlying genetic background, in order to support breeders in the generation of improved rice varieties combining high yield with high quality, and tolerance of both these traits to climate change.
NASA Astrophysics Data System (ADS)
Farshadfar, M.; Farshadfar, E.
The present research was conducted to determine the genetic variability of 18 lucerne cultivars based on morphological and biochemical markers. The traits studied were plant height, tiller number, biomass, dry yield, dry yield/biomass, dry leaf/dry yield, macro- and microelements, crude protein, dry matter, crude fiber and ash percentage, and SDS-PAGE in seed and leaf samples. Field experiments included 18 plots of two-meter rows. Data based on morphological, chemical and SDS-PAGE markers were analyzed using SPSSWIN software and the multivariate statistical procedures cluster analysis (UPGMA) and principal component analysis. Analysis of variance and mean comparison for morphological traits reflected significant differences among genotypes. Genotypes 13 and 15 had the greatest values for most traits. The genotypic coefficient of variation (GCV), phenotypic coefficient of variation (PCV) and heritability (Hb) parameters for different characters ranged from 12.49 to 26.58% for PCV, while the GCV ranged from 6.84 to 18.84%. The greatest value of Hb was 0.94, for stem number. Based on morphological traits, the lucerne genotypes could be classified into four clusters, and 94% of the variance among the genotypes was explained by two principal components. Based on chemical traits, they were classified into five groups, with 73.492% of the variance explained by four principal components; dry matter, protein, fiber, P, K, Na, Mg and Zn had higher variance. Based on the SDS-PAGE patterns, all genotypes were classified into three clusters. The greatest genetic distance was between cultivar 10 and the others; it would therefore be a suitable parent in a breeding program.
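The UPGMA clustering used here corresponds to average-linkage hierarchical clustering; a sketch on standardized synthetic trait data (18 genotypes as in the study, but invented values):

```python
# Sketch of UPGMA (average-linkage) clustering of genotypes from trait data;
# the trait matrix is synthetic, standing in for the measured traits.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import pdist

rng = np.random.default_rng(11)
traits = rng.normal(size=(18, 6))                    # 18 genotypes x 6 traits
traits = (traits - traits.mean(0)) / traits.std(0)   # standardize each trait

Z = linkage(pdist(traits), method="average")         # UPGMA = average linkage
clusters = fcluster(Z, t=4, criterion="maxclust")    # cut the tree into 4 clusters
print(clusters)
```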
DOE Office of Scientific and Technical Information (OSTI.GOV)
Noo, F; Guo, Z
2016-06-15
Purpose: Penalized weighted least-square reconstruction has become an important research topic in CT, to reduce dose without affecting image quality. Two components impact image quality in this reconstruction: the statistical weights and the use of an edge-preserving penalty term. We are interested in assessing the influence of statistical weights on their own, without the edge-preserving feature. Methods: The influence of statistical weights on image quality was assessed in terms of low-contrast detail detection using LROC analysis. The task amounted to detecting and localizing a 6-mm lesion with random contrast inside the FORBILD head phantom. A two-alternative forced-choice experiment was used, with two human observers performing the task. Reconstructions without and with statistical weights were compared, both using the same quadratic penalty term. The beam energy was set to 30 keV to amplify spatial differences in attenuation and thereby the role of statistical weights. A fan-beam data acquisition geometry was used. Results: Visual inspection of images clearly showed a difference in noise between the two reconstruction methods. As expected, the reconstruction without statistical weights exhibited noise streaks. The other reconstruction appeared better in this respect, but presented other disturbing noise patterns and artifacts induced by the weights. The LROC analysis yielded the following 95-percent confidence interval for the difference in reader-averaged AUC (reconstruction without weights minus reconstruction with weights): [0.0026, 0.0599]. The mean AUC value was 0.9094. Conclusion: We have investigated the impact of statistical weights without the use of an edge-preserving penalty in penalized weighted least-square reconstruction. A decrease rather than an increase in image quality was observed when using statistical weights. Thus, the observers were better able to cope with the noise streaks than with the noise patterns and artifacts induced by the statistical weights. It may be that different results would be obtained if the penalty term was used with a pixel-dependent weight. F. Noo receives research support from Siemens Healthcare GmbH.
The impact exploration of agricultural drought on winter wheat yield in the North China Plain
NASA Astrophysics Data System (ADS)
Yang, Jianhua; Wu, Jianjun; Han, Xinyi; Zhou, Hongkui
2017-04-01
Drought is one of the most serious agro-climatic disasters in the North China Plain and has a great influence on winter wheat yield. Global warming exacerbates the drought trend of this region, so it is important to study the effect of drought on winter wheat yield. In order to assess drought-induced winter wheat yield losses, the SPEI (standardized precipitation evapotranspiration index), a widely used drought index, was selected to quantify drought from 1981 to 2013. Additionally, the EPIC (Environmental Policy Integrated Climate) crop model was used to simulate winter wheat yield at 47 stations in this region from 1981 to 2013. We analyzed the relationship between winter wheat yield and the SPEI at different time scales in each month of the growing season. The trends of the SPEI and of winter wheat yield at the 47 stations over the past 32 years were compared with each other. To further quantify the effect of drought on winter wheat yield, we defined years in which the SPEI varied from -0.5 to 0.5 as normal years, calculated the average winter wheat yield of the normal years as a reference yield, and then calculated the reduction ratios of winter wheat in severe drought years based on the yields mentioned above. As a reference, we compared the results with the reduction ratios calculated from the statistical yield data. The results showed that the 9- to 12-month-scale SPEI in April, May and June had a high correlation with winter wheat yield. The trends of the SPEI and of winter wheat yield over the past 32 years showed a positive correlation (p<0.01) and have similar spatial distributions. The proportion of stations with the same trend in the SPEI and winter wheat yield was 70%, indicating that drought was the main factor leading to a decline in winter wheat yield in this region. The reduction ratios based on the simulated yield and those calculated from the statistical yield data have a high positive correlation (p<0.01), which may provide a way to quantitatively evaluate winter wheat yield losses caused by drought. Key words: drought, winter wheat yield, SPEI, EPIC, the North China Plain
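The reference-yield bookkeeping described above reduces to a few lines. The sketch below follows the abstract's definition of normal years (SPEI between -0.5 and 0.5); the severe-drought cutoff of SPEI <= -1.5 is an assumed illustrative threshold, not a value taken from the paper.

```python
import numpy as np

def yield_reduction_ratio(spei, yields, drought_cutoff=-1.5):
    """Yield reduction in drought years relative to a normal-year reference.

    Normal years follow the abstract's definition (-0.5 <= SPEI <= 0.5);
    drought_cutoff marks severe drought and is an assumed value.
    """
    spei = np.asarray(spei, float)
    yields = np.asarray(yields, float)
    normal = (spei >= -0.5) & (spei <= 0.5)
    reference = yields[normal].mean()  # reference yield of normal years
    drought = spei <= drought_cutoff
    return (reference - yields[drought]) / reference
```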
Using structural equation modeling for network meta-analysis.
Tu, Yu-Kang; Wu, Yun-Chun
2017-07-14
Network meta-analysis overcomes the limitations of traditional pair-wise meta-analysis by incorporating all available evidence into a general statistical framework for simultaneous comparisons of several treatments. Currently, network meta-analyses are undertaken either within Bayesian hierarchical linear models or within frequentist generalized linear mixed models. Structural equation modeling (SEM) is a statistical method originally developed for modeling causal relations among observed and latent variables. As the random effect is explicitly modeled as a latent variable in SEM, it is very flexible for analysts to specify complex random effect structures and to place linear and nonlinear constraints on parameters. The aim of this article is to show how to undertake a network meta-analysis within the statistical framework of SEM. We used an example dataset to demonstrate that the standard fixed and random effect network meta-analysis models can be easily implemented in SEM. It contains results of 26 studies that directly compared three treatment groups A, B and C for prevention of first bleeding in patients with liver cirrhosis. We also showed that a new approach to network meta-analysis based on the unrestricted weighted least squares (UWLS) method can be undertaken using SEM. For both the fixed and random effect network meta-analysis, SEM yielded similar coefficients and confidence intervals to those reported in the previous literature. The point estimates of the two UWLS models were identical to those of the fixed effect model, but the confidence intervals were wider. This is consistent with results from the traditional pairwise meta-analyses. Compared to the UWLS model with a common variance adjustment factor, the UWLS model with unique variance adjustment factors has wider confidence intervals when the heterogeneity is larger in the pairwise comparison. The UWLS model with unique variance adjustment factors reflects the difference in heterogeneity within each comparison. SEM provides a very flexible framework for univariate and multivariate meta-analysis, and its potential as a powerful tool for advanced meta-analysis is still to be explored.
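For orientation, the sketch below implements a generic fixed-effect network meta-analysis as a weighted least-squares fit on study contrasts. It is not the SEM implementation the authors describe, but it reproduces the same fixed-effect point estimates; the design-matrix convention (treatment A as reference) and all data are hypothetical.

```python
import numpy as np

def network_meta_fixed(design, effects, variances):
    """Fixed-effect network meta-analysis as a weighted least-squares fit.

    Each row of `design` encodes a study's contrast in the basic
    parameters, e.g. [1, 0] for B vs A and [0, 1] for C vs A with A as
    reference. Returns estimates and standard errors of the basic
    parameters; indirect contrasts follow by differencing.
    """
    X = np.asarray(design, float)
    y = np.asarray(effects, float)
    W = np.diag(1.0 / np.asarray(variances, float))  # inverse-variance weights
    cov = np.linalg.inv(X.T @ W @ X)
    estimates = cov @ X.T @ W @ y
    std_err = np.sqrt(np.diag(cov))
    return estimates, std_err
```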
Shi, Weiwei; Bugrim, Andrej; Nikolsky, Yuri; Nikolskya, Tatiana; Brennan, Richard J
2008-01-01
ABSTRACT The ideal toxicity biomarker combines the properties of prediction (detection prior to traditional pathological signs of injury), accuracy (high sensitivity and specificity), and a mechanistic relationship to the endpoint measured (biological relevance). Gene expression-based toxicity biomarkers ("signatures") have shown good predictive power and accuracy, but are difficult to interpret biologically. We have compared different statistical methods of feature selection with knowledge-based approaches, using GeneGo's database of canonical pathway maps, to generate gene sets for the classification of renal tubule toxicity. The gene set selection algorithms include four univariate analyses: t-statistics, fold-change, B-statistics, and RankProd, and their combination and overlap for the identification of differentially expressed probes. Enrichment analysis following the results of the four univariate analyses, the Hotelling T-square test, and, finally, out-of-bag selection, a variant of cross-validation, were used to identify canonical pathway maps (sets of genes coordinately involved in key biological processes) with classification power. Differentially expressed genes identified by the different statistical univariate analyses all generated reasonably performing classifiers of tubule toxicity. Maps identified by enrichment analysis or the Hotelling T-square test had lower classification power, but highlighted perturbed lipid homeostasis as a common discriminator of nephrotoxic treatments. The out-of-bag method yielded the best functionally integrated classifier. The map "ephrins signaling" performed comparably to a classifier derived using sparse linear programming, a machine learning algorithm, and represents a signaling network specifically involved in renal tubule development and integrity. Such functional descriptors of toxicity promise to better integrate predictive toxicogenomics with mechanistic analysis, facilitating the interpretation and risk assessment of predictive genomic investigations.
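A minimal sketch of two of the univariate criteria named above (t-statistics and fold-change) and their overlap, applied to a samples-by-probes expression matrix. The cutoff n_top and the assumption of log-scale expression values are illustrative choices, not the study's settings.

```python
import numpy as np
from scipy import stats

def univariate_probe_selection(expr_treated, expr_control, n_top=50):
    """Rank probes by t-statistic and by fold-change, then overlap them.

    Both arrays are (samples x probes) and assumed to hold log-scale
    values, so fold-change is a difference of group means.
    """
    t_stat, _ = stats.ttest_ind(expr_treated, expr_control, axis=0)
    fold = expr_treated.mean(axis=0) - expr_control.mean(axis=0)
    top_t = set(np.argsort(-np.abs(t_stat))[:n_top])
    top_fold = set(np.argsort(-np.abs(fold))[:n_top])
    return sorted(top_t & top_fold)  # probes selected by both criteria
```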
Coherent instability in wall-bounded turbulence
NASA Astrophysics Data System (ADS)
Hack, M. J. Philipp
2017-11-01
Hairpin vortices are commonly considered one of the major classes of coherent fluid motions in shear layers, even as their significance in the grand scheme of turbulence has remained an openly debated question. The statistical prevalence of the dynamic process that gives rise to the hairpins across different types of flows suggests an origin in a robust common mechanism triggered by conditions widespread in wall-bounded shear layers. This study seeks to shed light on the physical process which drives the generation of hairpin vortices. It is primarily facilitated through an algorithm based on concepts developed in the field of computer vision which allows the topological identification and analysis of coherent flow processes across multiple scales. Application to direct numerical simulations of boundary layers enables the time-resolved sampling and exploration of the hairpin process in natural flow. The analysis yields rich statistical results which lead to a refined characterization of the hairpin process. Linear stability theory offers further insight into the flow physics and especially into the connection between the hairpin and exponential amplification mechanisms. The results also provide a sharpened understanding of the underlying causality of events.
treespace: Statistical exploration of landscapes of phylogenetic trees.
Jombart, Thibaut; Kendall, Michelle; Almagro-Garcia, Jacob; Colijn, Caroline
2017-11-01
The increasing availability of large genomic data sets as well as the advent of Bayesian phylogenetics facilitates the investigation of phylogenetic incongruence, which can result in the impossibility of representing phylogenetic relationships using a single tree. While sometimes considered as a nuisance, phylogenetic incongruence can also reflect meaningful biological processes as well as relevant statistical uncertainty, both of which can yield valuable insights in evolutionary studies. We introduce a new tool for investigating phylogenetic incongruence through the exploration of phylogenetic tree landscapes. Our approach, implemented in the R package treespace, combines tree metrics and multivariate analysis to provide low-dimensional representations of the topological variability in a set of trees, which can be used for identifying clusters of similar trees and group-specific consensus phylogenies. treespace also provides a user-friendly web interface for interactive data analysis and is integrated alongside existing standards for phylogenetics. It fills a gap in the current phylogenetics toolbox in R and will facilitate the investigation of phylogenetic results. © 2017 The Authors. Molecular Ecology Resources Published by John Wiley & Sons Ltd.
Predictor of increase in caregiver burden for disabled elderly at home.
Okamoto, Kazushi; Harasawa, Yuko
2009-01-01
To identify early those caregivers at high risk of an increase in burden, linear discriminant analysis was performed to obtain an effective discriminant model for differentiating the presence or absence of an increase in caregiver burden. The data, obtained by self-administered questionnaire from 193 caregivers of frail elderly from January to February of 2005, were used. The discriminant analysis yielded a statistically significant function explaining 35.0% of the variance (Rc=0.59; d.f.=6; p=0.0001). The configuration indicated that the psychological predictors perceived stress (1.47), caregiver burden at baseline (1.28), emotional control (0.75), effort to achieve (-0.28), symptomatic depression (0.20) and "ikigai" (purpose in life) (0.18) made statistically significant contributions to the differentiation between no increase and increase in caregiver burden. The discriminant function showed a sensitivity of 86% and a specificity of 81%, and successfully classified 83% of the caregivers. The function at baseline is a simple and useful method for screening for an increase in burden among caregivers of the frail elderly at home.
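A minimal sketch of the kind of discriminant screening described here, using scikit-learn's linear discriminant analysis and reporting the three figures such studies quote: sensitivity, specificity and overall classification rate. The data and predictors are whatever the caller supplies; this is not the study's model or sample.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def burden_discriminant(X, increased):
    """Fit a linear discriminant function and report sensitivity,
    specificity, and the overall classification rate.

    X holds the predictor scores; `increased` is 0/1 for absence or
    presence of an increase in caregiver burden.
    """
    y = np.asarray(increased, int)
    lda = LinearDiscriminantAnalysis().fit(X, y)
    pred = lda.predict(X)
    sensitivity = (pred[y == 1] == 1).mean()  # true increases caught
    specificity = (pred[y == 0] == 0).mean()  # non-increases kept
    return sensitivity, specificity, (pred == y).mean()
```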
Anderson localization of shear waves observed by magnetic resonance imaging
NASA Astrophysics Data System (ADS)
Papazoglou, S.; Klatt, D.; Braun, J.; Sack, I.
2010-07-01
In this letter we present for the first time an experimental investigation of shear wave localization using motion-sensitive magnetic resonance imaging (MRI). Shear wave localization was studied in gel phantoms containing arrays of randomly positioned parallel glass rods. The phantoms were exposed to continuous harmonic vibrations in a frequency range from 25 to 175 Hz, yielding wavelengths on the order of the elastic mean free path, i.e. the Ioffe-Regel criterion of Anderson localization was satisfied. The experimental setup was further chosen such that purely shear horizontal waves were induced to avoid effects due to mode conversion and pressure waves. Analysis of the distribution of shear wave intensity in experiments and simulations revealed a significant deviation from Rayleigh statistics indicating that shear wave energy is localized. This observation is further supported by experiments on weakly scattering samples exhibiting Rayleigh statistics and an analysis of the multifractality of wave functions. Our results suggest that motion-sensitive MRI is a promising tool for studying Anderson localization of time-harmonic shear waves, which are increasingly used in dynamic elastography.
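The Rayleigh-statistics check mentioned above can be illustrated compactly: for non-localized diffuse waves, the normalized intensity should be exponentially distributed. The sketch below runs a Kolmogorov-Smirnov test against that null; it is a simplified stand-in for the authors' analysis, not their estimator.

```python
import numpy as np
from scipy import stats

def rayleigh_statistics_test(intensity):
    """Kolmogorov-Smirnov test of normalized intensities against the
    exponential law implied by Rayleigh statistics.

    A small p-value indicates the kind of deviation from Rayleigh
    statistics used as evidence of localization.
    """
    I = np.asarray(intensity, float)
    I = I / I.mean()  # normalize so the null distribution is Exp(1)
    return stats.kstest(I, "expon")
```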
Shakeri, Heman; Volkova, Victoriya; Wen, Xuesong; Deters, Andrea; Cull, Charley; Drouillard, James; Müller, Christian; Moradijamei, Behnaz; Jaberi-Douraki, Majid
2018-05-01
To assess phenotypic bacterial antimicrobial resistance (AMR) in different strata (e.g., host populations, environmental areas, manure, or sewage effluents) for epidemiological purposes, isolates of target bacteria can be obtained from a stratum using various sample types. Also, different sample processing methods can be applied. The MIC of each target antimicrobial drug for each isolate is measured. Statistical equivalence testing of the MIC data for the isolates allows evaluation of whether different sample types or sample processing methods yield equivalent estimates of the bacterial antimicrobial susceptibility in the stratum. We demonstrate this approach on the antimicrobial susceptibility estimates for (i) nontyphoidal Salmonella spp. from ground or trimmed meat versus cecal content samples of cattle in processing plants in 2013-2014 and (ii) nontyphoidal Salmonella spp. from urine, fecal, and blood human samples in 2015 (U.S. National Antimicrobial Resistance Monitoring System data). We found that the sample types for cattle yielded nonequivalent susceptibility estimates for several antimicrobial drug classes and thus may gauge distinct subpopulations of salmonellae. The quinolone and fluoroquinolone susceptibility estimates for nontyphoidal salmonellae from human blood are nonequivalent to those from urine or feces, conjecturally due to the fluoroquinolone (ciprofloxacin) use to treat infections caused by nontyphoidal salmonellae. We also demonstrate statistical equivalence testing for comparing sample processing methods for fecal samples (culturing one versus multiple aliquots per sample) to assess AMR in fecal Escherichia coli. These methods yield equivalent results, except for tetracyclines. Importantly, statistical equivalence testing provides the MIC difference at which the data from two sample types or sample processing methods differ statistically. Data users (e.g., microbiologists and epidemiologists) may then interpret the practical relevance of the difference. IMPORTANCE Bacterial antimicrobial resistance (AMR) needs to be assessed in different populations or strata for the purposes of surveillance and determination of the efficacy of interventions to halt AMR dissemination. To assess phenotypic antimicrobial susceptibility, isolates of target bacteria can be obtained from a stratum using different sample types or employing different sample processing methods in the laboratory. The MIC of each target antimicrobial drug for each of the isolates is measured, yielding the MIC distribution across the isolates from each sample type or sample processing method. We describe statistical equivalence testing for the MIC data for evaluating whether two sample types or sample processing methods yield equivalent estimates of the bacterial phenotypic antimicrobial susceptibility in the stratum. This includes estimating the MIC difference at which the data from the two approaches differ statistically. Data users (e.g., microbiologists, epidemiologists, and public health professionals) can then interpret whether the difference is practically relevant. Copyright © 2018 Shakeri et al.
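A minimal sketch of equivalence testing on MIC data using the standard two one-sided tests (TOST) procedure on log2-transformed MICs. The one-dilution-step equivalence margin is an assumed illustrative bound, not the paper's, and the exact test the authors used may differ.

```python
import numpy as np
from scipy import stats

def tost_log2_mic(mic_a, mic_b, margin=1.0):
    """Two one-sided tests (TOST) for equivalence of mean log2 MICs.

    `margin` is the equivalence bound in two-fold dilution steps; the
    overall TOST p-value is the larger of the two one-sided p-values,
    and a small value supports equivalence of the two sample types.
    """
    a, b = np.log2(mic_a), np.log2(mic_b)
    # H1 for test 1: the mean difference is below the upper margin
    p_upper = stats.ttest_ind(a - margin, b, alternative="less").pvalue
    # H1 for test 2: the mean difference is above the lower margin
    p_lower = stats.ttest_ind(a + margin, b, alternative="greater").pvalue
    return max(p_upper, p_lower)
```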
Improved Hierarchical Optimization-Based Classification of Hyperspectral Images Using Shape Analysis
NASA Technical Reports Server (NTRS)
Tarabalka, Yuliya; Tilton, James C.
2012-01-01
A new spectral-spatial method for classification of hyperspectral images is proposed. The HSegClas method is based on the integration of probabilistic classification and shape analysis within the hierarchical step-wise optimization algorithm. First, probabilistic support vector machine classification is applied. Then, at each iteration, the two neighboring regions with the smallest Dissimilarity Criterion (DC) are merged and classification probabilities are recomputed. An important contribution of this work consists in estimating the DC between regions as a function of statistical, classification and geometrical (area and rectangularity) features. Experimental results are presented on a 102-band ROSIS image of the Center of Pavia, Italy. The developed approach yields more accurate classification results when compared to previously proposed methods.
Development and Validation of the Caring Loneliness Scale.
Karhe, Liisa; Kaunonen, Marja; Koivisto, Anna-Maija
2016-12-01
The Caring Loneliness Scale (CARLOS) includes 5 categories derived from earlier qualitative research. This article assesses the reliability and construct validity of a scale designed to measure patient experiences of loneliness in a professional caring relationship. Statistical analysis with 4 different sample sizes included Cronbach's alpha and exploratory factor analysis with principal axis factoring extraction. The sample size of 250 gave the most useful and comprehensible structure, but all 4 samples yielded the underlying content of loneliness experiences. The initial 5 categories were reduced to 4 factors with 24 items, with Cronbach's alpha ranging from .77 to .90. The findings support the reliability and validity of CARLOS for the assessment of Finnish breast cancer and heart surgery patients' experiences but, as with all instruments, further validation is needed.
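Cronbach's alpha, the reliability statistic reported here, is short enough to show in full; the sketch below computes it from a respondents-by-items score matrix using the standard formula. Applied to each of the 4 factors separately, it would yield figures comparable to the .77 to .90 range reported.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a (respondents x items) score matrix."""
    scores = np.asarray(scores, float)
    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1).sum()
    total_variance = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1.0) * (1.0 - item_variances / total_variance)
```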
Identification of atypical flight patterns
NASA Technical Reports Server (NTRS)
Statler, Irving C. (Inventor); Ferryman, Thomas A. (Inventor); Amidan, Brett G. (Inventor); Whitney, Paul D. (Inventor); White, Amanda M. (Inventor); Willse, Alan R. (Inventor); Cooley, Scott K. (Inventor); Jay, Joseph Griffith (Inventor); Lawrence, Robert E. (Inventor); Mosbrucker, Chris (Inventor)
2005-01-01
Method and system for analyzing aircraft data, including multiple selected flight parameters for a selected phase of a selected flight, and for determining when the selected phase of the selected flight is atypical when compared with corresponding data for the same phase of other similar flights. A flight signature is computed from the continuous-valued and discrete-valued selected flight parameters and is optionally compared with a statistical distribution of observed flight signatures for the same phase of other similar flights, yielding atypicality scores. A cluster analysis is optionally applied to the flight signatures to define an optimal collection of clusters. A level of atypicality for a selected flight is estimated, based upon an index associated with the cluster analysis.
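The patent does not prescribe a specific atypicality statistic, so the sketch below uses the Mahalanobis distance of each flight signature from the empirical distribution of all signatures as one simple stand-in for comparing a signature against a statistical distribution of observed signatures.

```python
import numpy as np

def atypicality_scores(signatures):
    """Mahalanobis distance of each flight signature from the empirical
    distribution of all signatures; larger scores mean more atypical.
    """
    X = np.asarray(signatures, float)
    mu = X.mean(axis=0)
    cov_inv = np.linalg.pinv(np.cov(X, rowvar=False))
    d = X - mu
    return np.einsum("ij,jk,ik->i", d, cov_inv, d)
```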
NASA Astrophysics Data System (ADS)
Preibus-Norquist, R. N. C.-Grover; Bush-Romney, G. W.-Willard-Mitt; Dimon, J. P.; Adelson-Koch, Sheldon-Charles-David-Sheldon; Krugman-Axelrod, Paul-David; Siegel, Edward Carl-Ludwig; D. N. C./O. F. P./''47''%/50% Collaboration; R. N. C./G. O. P./''53''%/49% Collaboration; Nyt/Wp/Cnn/Msnbc/Pbs/Npr/Ft Collaboration; Ftn/Fnc/Fox/Wsj/Fbn Collaboration; Lb/Jpmc/Bs/Boa/Ml/Wamu/S&P/Fitch/Moodys/Nmis Collaboration
2013-03-01
``Models''? CAVEAT EMPTOR!!!: ``Toy Models Too-Often Yield Toy-Results''!!!: Goldenfeld[``The Role of Models in Physics'', in Lects.on Phase-Transitions & R.-G.(92)-p.32-33!!!]: statistics(Silver{[NYTimes; Bensinger, ``Math-Geerks Clearly-Defeated Pundits'', LATimes, (11/9/12)])}, polls, politics, economics, elections!!!: GRAPH/network/net/...-PHYSICS Barabasi-Albert[RMP (02)] (r,t)-space VERSUS(???) [Where's the Inverse/ Dual/Integral-Transform???] (Benjamin)Franklin(1795)-Fourier(1795; 1897;1822)-Laplace(1850)-Mellin (1902) Brillouin(1922)-...(k,)-space, {Hubbard [The World According to Wavelets,Peters (96)-p.14!!!/p.246: refs.-F2!!!]},and then (2) Albert-Barabasi[]Bose-Einstein quantum-statistics(BEQS) Bose-Einstein CONDENSATION (BEC) versus Bianconi[pvt.-comm.; arXiv:cond-mat/0204506; ...] -Barabasi [???] Fermi-Dirac
Fade durations in satellite-path mobile radio propagation
NASA Technical Reports Server (NTRS)
Schmier, Robert G.; Bostian, Charles W.
1986-01-01
Fades on satellite to land mobile radio links are caused by several factors, the most important of which are multipath propagation and vegetative shadowing. Designers of vehicular satellite communications systems require information about the statistics of fade durations in order to overcome or compensate for the fades. Except for a few limiting cases, only the mean fade duration can be determined analytically; all other statistics must be obtained experimentally or via simulation. This report describes and presents results from a computer program developed at Virginia Tech to simulate satellite-path propagation for a mobile station in a rural area. It generates rapidly-fading and slowly-fading signals by separate processes that yield correct cumulative signal distributions and then combines these to simulate the overall signal, which is then analyzed to yield the statistics of fade duration.
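Extracting fade-duration statistics from a simulated signal is essentially a run-length computation. The sketch below returns the durations of all below-threshold runs in a sampled signal level, which can then be summarized into the statistics discussed; it is a generic sketch, not the Virginia Tech program.

```python
import numpy as np

def fade_durations(level_db, threshold_db, sample_dt):
    """Durations of all fades (runs below threshold) in a sampled signal."""
    below = np.asarray(level_db) < threshold_db
    edges = np.diff(below.astype(int))       # +1 at fade start, -1 at end
    starts = np.where(edges == 1)[0] + 1
    ends = np.where(edges == -1)[0] + 1
    if below[0]:
        starts = np.r_[0, starts]            # fade in progress at t = 0
    if below[-1]:
        ends = np.r_[ends, below.size]       # fade still open at the end
    return (ends - starts) * sample_dt
```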
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mayer, B. P.; Mew, D. A.; DeHope, A.
Attribution of the origin of an illicit drug relies on identification of compounds indicative of its clandestine production and is a key component of many modern forensic investigations. The results of these studies can yield detailed information on method of manufacture, starting material source, and final product, all critical forensic evidence. In the present work, chemical attribution signatures (CAS) associated with the synthesis of the analgesic fentanyl, N-(1-phenylethylpiperidin-4-yl)-N-phenylpropanamide, were investigated. Six synthesis methods, all previously published fentanyl synthetic routes or hybrid versions thereof, were studied in an effort to identify and classify route-specific signatures. In total, 160 distinct compounds and inorganic species were identified using gas and liquid chromatographies combined with mass spectrometric methods (GC-MS and LC-MS/MS-TOF) in conjunction with inductively coupled plasma mass spectrometry (ICP-MS). The complexity of the resultant data matrix motivated the use of multivariate statistical analysis. Using partial least squares discriminant analysis (PLS-DA), 87 route-specific CAS were classified, and a statistical model capable of predicting the method of fentanyl synthesis was validated and tested against CAS profiles from crude fentanyl products deposited on and later extracted from two operationally relevant surfaces: stainless steel and vinyl tile. This work provides the most detailed fentanyl CAS investigation to date, using orthogonal mass spectral data to identify CAS of forensic significance for illicit drug detection, profiling, and attribution.
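A minimal PLS-DA sketch in the spirit of the analysis described: regress one-hot route labels on CAS profiles with scikit-learn and assign test samples to the route with the largest predicted response. The feature matrices, the component count, and the assumption of three or more routes (so labels binarize to one-hot) are illustrative.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.preprocessing import LabelBinarizer

def plsda_predict_route(X_train, routes_train, X_test, n_components=5):
    """PLS-DA: regress one-hot route labels on CAS profiles, then assign
    each test sample to the route with the largest predicted response.

    Assumes three or more routes so the labels binarize to one-hot.
    """
    binarizer = LabelBinarizer()
    Y = binarizer.fit_transform(routes_train)
    pls = PLSRegression(n_components=n_components).fit(X_train, Y)
    responses = pls.predict(X_test)
    return binarizer.classes_[np.argmax(responses, axis=1)]
```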
A systematic review and meta-analysis of music therapy for the older adults with depression.
Zhao, K; Bai, Z G; Bo, A; Chi, I
2016-11-01
To determine the efficacy of music therapy in the management of depression in the elderly, we conducted a systematic review and meta-analysis of randomized controlled trials. Change in depressive symptoms was measured with various scales. Standardized mean differences were calculated for each therapy-control contrast. A comprehensive search yielded 2,692 citations; 19 articles met inclusion criteria. Meta-analysis suggests that music therapy plus standard treatment significantly reduces depressive symptoms among older adults (standardized mean difference = 1.02; 95% CI = 0.87, 1.17). This systematic review and meta-analysis suggests that music therapy has an effect on reducing depressive symptoms to some extent. However, high-quality trials evaluating the effects of music therapy on depression are required. Copyright © 2016 John Wiley & Sons, Ltd.
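The inverse-variance pooling behind a summary standardized mean difference is compact; the sketch below shows the fixed-effect version with a 95% confidence interval. A random-effects model of the kind often used in such reviews would add a between-study variance (tau-squared) term to the weights.

```python
import numpy as np

def pooled_smd(d, var_d):
    """Inverse-variance (fixed-effect) pooled standardized mean
    difference with a 95% confidence interval.
    """
    d = np.asarray(d, float)
    w = 1.0 / np.asarray(var_d, float)
    estimate = np.sum(w * d) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    return estimate, (estimate - 1.96 * se, estimate + 1.96 * se)
```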
Plausible rice yield losses under future climate warming.
Zhao, Chuang; Piao, Shilong; Wang, Xuhui; Huang, Yao; Ciais, Philippe; Elliott, Joshua; Huang, Mengtian; Janssens, Ivan A; Li, Tao; Lian, Xu; Liu, Yongwen; Müller, Christoph; Peng, Shushi; Wang, Tao; Zeng, Zhenzhong; Peñuelas, Josep
2016-12-19
Rice is the staple food for more than 50% of the world's population (refs 1-3). Reliable prediction of changes in rice yield is thus central for maintaining global food security. This is an extraordinary challenge. Here, we compare the sensitivity of rice yield to temperature increase derived from field warming experiments and three modelling approaches: statistical models, local crop models and global gridded crop models. Field warming experiments produce a substantial rice yield loss under warming, with an average temperature sensitivity of -5.2 ± 1.4% K⁻¹. Local crop models give a similar sensitivity (-6.3 ± 0.4% K⁻¹), but statistical and global gridded crop models both suggest less negative impacts of warming on yields (-0.8 ± 0.3% and -2.4 ± 3.7% K⁻¹, respectively). Using data from field warming experiments, we further propose a conditional probability approach to constrain the large range of global gridded crop model results for the future yield changes in response to warming by the end of the century (from -1.3% to -9.3% K⁻¹). The constraint implies a more negative response to warming (-8.3 ± 1.4% K⁻¹) and reduces the spread of the model ensemble by 33%. This yield reduction exceeds that estimated by the International Food Policy Research Institute assessment (-4.2 to -6.4% K⁻¹) (ref. 4). Our study suggests that without CO₂ fertilization, effective adaptation and genetic improvement, severe rice yield losses are plausible under intensive climate warming scenarios.
Study on paddy rice yield estimation based on multisource data and the Grey system theory
NASA Astrophysics Data System (ADS)
Deng, Wensheng; Wang, Wei; Liu, Hai; Li, Chen; Ge, Yimin; Zheng, Xianghua
2009-10-01
Paddy rice is an important crop. In studies of paddy rice yield estimation, scholars usually take only remote sensing data or only meteorology as the influence factors; here we combine remote sensing and meteorological data to bring the monitoring result closer to reality. Although grey system theory has been used in many fields, it has rarely been applied to paddy rice yield estimation. This study introduces it to paddy rice yield estimation and builds a yield estimation model, which can resolve the small-data-set problem that cannot be solved by deterministic models. Some regions in the Jianghan plain were selected as the study area. The data include multi-temporal remote sensing images, meteorological data and statistical data. The remote sensing data are the 16-day composite images (250-m spatial resolution) of MODIS. The meteorological data include monthly average temperature, sunshine duration and rainfall amount. The statistical data are the long-term paddy rice yields of the study area. Firstly, the paddy rice planting area is extracted from the multi-temporal MODIS images with the help of GIS and RS. Then, taking the paddy rice yield as the reference sequence and the MODIS and meteorological data as the comparative sequences, the grey correlation coefficients are computed and the yield estimation factors are selected based on grey system theory. Finally, using these factors, the yield estimation model is established and the result is tested. The results indicated that the method is feasible and the conclusions are credible. It can provide a scientific method and reference value for regional paddy rice remote sensing yield estimation.
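The grey relational step described above can be sketched directly: normalize the reference (yield) and comparative (MODIS, meteorological) sequences, then compute grey relational coefficients and average them into a grade per factor. The resolution coefficient rho = 0.5 is the customary choice; the data are placeholders.

```python
import numpy as np

def grey_relational_grades(reference, comparatives, rho=0.5):
    """Grey relational grade of each comparative sequence against the
    reference sequence, after min-max normalization of every series.
    """
    def normalize(x):
        return (x - x.min()) / (x.max() - x.min())

    ref = normalize(np.asarray(reference, float))
    cmp_ = np.apply_along_axis(normalize, 1,
                               np.atleast_2d(np.asarray(comparatives, float)))
    delta = np.abs(cmp_ - ref)
    d_min, d_max = delta.min(), delta.max()
    coeff = (d_min + rho * d_max) / (delta + rho * d_max)
    return coeff.mean(axis=1)  # one grade per comparative sequence
```

Factors with the highest grades would be retained as the yield estimation factors.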
Robertson, Dale M.; Schwarz, Gregory E.; Saad, David A.; Alexander, Richard B.
2009-01-01
Excessive loads of nutrients transported by tributary rivers have been linked to hypoxia in the Gulf of Mexico. Management efforts to reduce the hypoxic zone in the Gulf of Mexico and improve the water quality of rivers and streams could benefit from targeting nutrient reductions toward watersheds with the highest nutrient yields delivered to sensitive downstream waters. One challenge is that most conventional watershed modeling approaches (e.g., mechanistic models) used in these management decisions do not consider uncertainties in the predictions of nutrient yields and their downstream delivery. The increasing use of parameter estimation procedures to statistically estimate model coefficients, however, allows uncertainties in these predictions to be reliably estimated. Here, we use a robust bootstrapping procedure applied to the results of a previous application of the hybrid statistical/mechanistic watershed model SPARROW (Spatially Referenced Regression On Watershed attributes) to develop a statistically reliable method for identifying “high priority” areas for management, based on a probabilistic ranking of delivered nutrient yields from watersheds throughout a basin. The method is designed to be used by managers to prioritize watersheds where additional stream monitoring and evaluations of nutrient-reduction strategies could be undertaken. Our ranking procedure incorporates information on the confidence intervals of model predictions and the corresponding watershed rankings of the delivered nutrient yields. From this quantified uncertainty, we estimate the probability that individual watersheds are among a collection of watersheds that have the highest delivered nutrient yields. We illustrate the application of the procedure to 818 eight-digit Hydrologic Unit Code watersheds in the Mississippi/Atchafalaya River basin by identifying 150 watersheds having the highest delivered nutrient yields to the Gulf of Mexico. Highest delivered yields were from watersheds in the Central Mississippi, Ohio, and Lower Mississippi River basins. With 90% confidence, only a few watersheds can be reliably placed into the highest 150 category; however, many more watersheds can be removed from consideration as not belonging to the highest 150 category. Results from this ranking procedure provide robust information on watershed nutrient yields that can benefit management efforts to reduce nutrient loadings to downstream coastal waters, such as the Gulf of Mexico, or to local receiving streams and reservoirs.
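The bootstrap ranking idea reduces to counting how often each watershed falls in the top k across replicates. The sketch below assumes a replicates-by-watersheds matrix of bootstrapped delivered yields; it mirrors the procedure's logic, not the SPARROW code itself.

```python
import numpy as np

def top_k_membership_probability(boot_yields, k=150):
    """Probability that each watershed ranks among the k highest
    delivered yields, over a (replicates x watersheds) bootstrap matrix.
    """
    B = np.asarray(boot_yields, float)
    n_boot, n_sites = B.shape
    counts = np.zeros(n_sites)
    for replicate in B:
        counts[np.argsort(replicate)[-k:]] += 1  # members of this top k
    return counts / n_boot
```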
Miyamoto, Suzanne; Taylor, Sandra L.; Barupal, Dinesh K.; Taguchi, Ayumu; Wohlgemuth, Gert; Wikoff, William R.; Yoneda, Ken Y.; Gandara, David R.; Hanash, Samir M.; Kim, Kyoungmi; Fiehn, Oliver
2015-01-01
Lung cancer is a leading cause of cancer deaths worldwide. Metabolic alterations in tumor cells, coupled with systemic indicators of the host response to tumor development, have the potential to yield blood profiles with clinical utility for diagnosis and monitoring of treatment. We report results from two separate studies using gas chromatography time-of-flight mass spectrometry (GC-TOF MS) to profile metabolites in human blood samples that significantly differ between non-small cell lung cancer (NSCLC) adenocarcinoma and other lung cancer cases. Metabolomic analysis of blood samples from the two studies yielded a total of 437 metabolites, of which 148 were identified as known compounds and 289 as unknown compounds. Differential analysis identified 15 known metabolites in one study and 18 in a second study that were statistically different (p-values <0.05). Levels of maltose, palmitic acid, glycerol, ethanolamine, glutamic acid, and lactic acid were increased in cancer samples, while the amino acids tryptophan, lysine and histidine decreased. Many of the metabolites were found to be significantly different in both studies, suggesting that metabolomics is robust enough to find systemic changes from lung cancer, thus showing the potential of this type of analysis for lung cancer detection. PMID:25859693
Babies and math: A meta-analysis of infants' simple arithmetic competence.
Christodoulou, Joan; Lac, Andrew; Moore, David S
2017-08-01
Wynn's (1992) seminal research reported that infants looked longer at stimuli representing "incorrect" versus "correct" solutions of basic addition and subtraction problems and concluded that infants have innate arithmetical abilities. Since then, infancy researchers have attempted to replicate this effect, yielding mixed findings. The present meta-analysis aimed to systematically compile and synthesize all of the primary replications and extensions of Wynn (1992) that have been conducted to date. The synthesis included 12 studies consisting of 26 independent samples and 550 unique infants. The summary effect, computed using a random-effects model, was statistically significant, d = +0.34, p < .001, suggesting that the phenomenon Wynn originally reported is reliable. Five different tests of publication bias yielded mixed results, suggesting that while a moderate level of publication bias is probable, the summary effect would be positive even after accounting for this issue. Out of the 10 metamoderators tested, none were found to be significant, but most of the moderator subgroups were significantly different from a null effect. Although this meta-analysis provides support for Wynn's original findings, further research is warranted to understand the underlying mechanisms responsible for infants' visual preferences for "mathematically incorrect" test stimuli. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
NASA Astrophysics Data System (ADS)
Eliazar, Iddo I.; Shlesinger, Michael F.
2012-01-01
We introduce and explore a Stochastic Flow Cascade (SFC) model: A general statistical model for the unidirectional flow through a tandem array of heterogeneous filters. Examples include the flow of: (i) liquid through heterogeneous porous layers; (ii) shocks through tandem shot noise systems; (iii) signals through tandem communication filters. The SFC model combines together the Langevin equation, convolution filters and moving averages, and Poissonian randomizations. A comprehensive analysis of the SFC model is carried out, yielding closed-form results. Lévy laws are shown to universally emerge from the SFC model, and characterize both heavy tailed retention times (Noah effect) and long-ranged correlations (Joseph effect).
Personality characteristics of hospice volunteers as measured by Myers-Briggs Type Indicator.
Mitchell, C W; Shuff, I M
1995-12-01
A sample of hospice volunteers (n = 99) was administered the Myers-Briggs Type Indicator (Myers & McCaulley, 1985). Frequencies of the types observed were compared to population sample (n = 1,105) frequencies. Results indicated that, as a whole, hospice volunteers preferred extraversion over introversion, intuition over sensing, and feeling over thinking. Analysis of four- and two-letter preference combinations also yielded statistically significant differences. Most notably, the sensing-intuition function appeared pivotal in determining hospice volunteering. Suggestions are offered as to why the sensing-intuition function appeared central to hospice volunteering. The results appeared consistent with Jungian personality theory.
Time series regression-based pairs trading in the Korean equities market
NASA Astrophysics Data System (ADS)
Kim, Saejoon; Heo, Jun
2017-07-01
Pairs trading is an instance of statistical arbitrage that relies on heavy quantitative data analysis to profit by capitalising on low-risk trading opportunities provided by anomalies of related assets. A key element in pairs trading is the rule by which open and close trading triggers are defined. This paper investigates the use of time series regression to define this rule, which has previously been identified with fixed threshold-based approaches. Empirical results indicate that our approach may yield significantly increased excess returns compared to those obtained by previous approaches on large capitalisation stocks in the Korean equities market.
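One plausible reading of a regression-based trigger rule is sketched below: regress one log price on the other over a rolling window, form the z-score of the current spread, and open or close positions on threshold crossings. The window length and thresholds are illustrative assumptions, not the paper's calibrated rule.

```python
import numpy as np

def pairs_signals(price_a, price_b, window=60, entry=2.0, exit_=0.5):
    """Daily position (+1 long spread, -1 short, 0 flat) from the z-score
    of a rolling-regression spread between two related assets.
    """
    a, b = np.log(price_a), np.log(price_b)
    signals = np.zeros(a.size)
    position = 0
    for t in range(window, a.size):
        x, y = b[t - window:t], a[t - window:t]
        beta, alpha = np.polyfit(x, y, 1)          # rolling regression
        spread = y - (alpha + beta * x)
        z = (a[t] - (alpha + beta * b[t]) - spread.mean()) / spread.std()
        if position == 0 and abs(z) > entry:       # open on a wide spread
            position = -int(np.sign(z))
        elif position != 0 and abs(z) < exit_:     # close on reversion
            position = 0
        signals[t] = position
    return signals
```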
Van Metre, Peter C.; Reutter, David C.
1995-01-01
Only limited suspended-sediment data were available. Four sites had daily sediment-discharge records for three or more water years (October 1 to September 30) between 1974 and 1985. An additional three sites had periodic measurements of suspended-sediment concentrations. There are differences in concentrations and yields among sites; however, the limited amount of data precludes developing statistical or cause-and-effect relations with environmental factors such as land use, soil, and geology. Data are sufficient, and the relation is pronounced enough, to indicate trapping of suspended sediment by Livingston Reservoir.
The attractor dimension of solar decimetric radio pulsations
NASA Technical Reports Server (NTRS)
Kurths, J.; Benz, A. O.; Aschwanden, M. J.
1991-01-01
The temporal characteristics of decimetric pulsations and related radio emissions during solar flares are analyzed using statistical methods recently developed for nonlinear dynamic systems. The results of the analysis are consistent with earlier reports on low-dimensional attractors of such events and yield a quantitative description of their temporal characteristics and hidden order. The estimated dimensions of typical decimetric pulsations are generally in the range of 3.0 ± 0.5. Quasi-periodic oscillations and sudden reductions may have dimensions as low as 2. Pulsations of decimetric type IV continua typically have a dimension of about 4.
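Attractor-dimension estimates of this kind typically come from the correlation-integral family of methods; the sketch below is a bare Grassberger-Procaccia estimator via delay embedding, offered as a generic illustration rather than the authors' procedure. The embedding dimension, lag and radii are illustrative, and the full pairwise-distance matrix limits it to short series.

```python
import numpy as np

def correlation_dimension(series, dim=5, lag=1, n_radii=10):
    """Grassberger-Procaccia estimate of the correlation dimension from
    a scalar time series via delay embedding.

    The slope of log C(r) versus log r approximates the dimension.
    """
    x = np.asarray(series, float)
    n = x.size - (dim - 1) * lag
    emb = np.stack([x[i * lag:i * lag + n] for i in range(dim)], axis=1)
    dists = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    dists = dists[np.triu_indices(n, k=1)]
    radii = np.logspace(np.log10(dists.min() + 1e-12),
                        np.log10(dists.max()), n_radii)
    corr_sum = np.array([(dists < r).mean() for r in radii])
    slope, _ = np.polyfit(np.log(radii), np.log(corr_sum + 1e-12), 1)
    return slope
```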
76 FR 41756 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-15
... materials and supplies used in production. The economic census will produce basic statistics by kind of business on number of establishments, sales, payroll, employment, inventories, and operating expenses. It also will yield a variety of subject statistics, including sales by product line; sales by class of...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aaltonen, T.; Maki, T.; Mehtala, P.
2009-02-01
This article presents the first measurement of the ratio of branching fractions B(Λb⁰ → Λc⁺μ⁻νμ)/B(Λb⁰ → Λc⁺π⁻). Measurements in two control samples using the same technique, B(B⁰ → D⁺μ⁻νμ)/B(B⁰ → D⁺π⁻) and B(B⁰ → D*(2010)⁺μ⁻νμ)/B(B⁰ → D*(2010)⁺π⁻), are also reported. The analysis uses data from an integrated luminosity of approximately 172 pb⁻¹ of pp̄ collisions at √s = 1.96 TeV, collected with the CDF II detector at the Fermilab Tevatron. The relative branching fractions are measured to be B(Λb⁰ → Λc⁺μ⁻νμ)/B(Λb⁰ → Λc⁺π⁻) = 16.6 ± 3.0(stat) ± 1.0(syst) (+2.6/−3.4)(PDG) ± 0.3(EBR), B(B⁰ → D⁺μ⁻νμ)/B(B⁰ → D⁺π⁻) = 9.9 ± 1.0(stat) ± 0.6(syst) ± 0.4(PDG) ± 0.5(EBR), and B(B⁰ → D*(2010)⁺μ⁻νμ)/B(B⁰ → D*(2010)⁺π⁻) = 16.5 ± 2.3(stat) ± 0.6(syst) ± 0.5(PDG) ± 0.8(EBR). The uncertainties are from statistics (stat), internal systematics (syst), world averages of measurements published by the Particle Data Group or subsidiary measurements in this analysis (PDG), and unmeasured branching fractions estimated from theory (EBR), respectively. This article also presents measurements of the branching fractions of four new Λb⁰ semileptonic decays: Λb⁰ → Λc(2595)⁺μ⁻νμ, Λb⁰ → Λc(2625)⁺μ⁻νμ, Λb⁰ → Σc(2455)⁰π⁺μ⁻νμ, and Λb⁰ → Σc(2455)⁺⁺π⁻μ⁻νμ, relative to the branching fraction of the Λb⁰ → Λc⁺μ⁻νμ decay. Finally, the transverse-momentum distribution of Λb⁰ baryons produced in pp̄ collisions is measured and found to be significantly different from that of B⁰ mesons, which results in a modification of the production cross-section ratio σ(Λb⁰)/σ(B⁰) with respect to the CDF I measurement.
Combined slope ratio analysis and linear-subtraction: An extension of the Pearce ratio method
NASA Astrophysics Data System (ADS)
De Waal, Sybrand A.
1996-07-01
A new technique, called combined slope ratio analysis, has been developed by extending the Pearce element ratio or conserved-denominator method (Pearce, 1968) to its logical conclusions. If two stoichiometric substances are mixed and certain chemical components are uniquely contained in either one of the two mixing substances, then by treating these unique components as conserved, the composition of the substance not containing the relevant component can be accurately calculated within the limits allowed by analytical and geological error. The calculated composition can then be subjected to rigorous statistical testing using the linear-subtraction method recently advanced by Woronow (1994). Application of combined slope ratio analysis to the rocks of the Uwekahuna Laccolith, Hawaii, USA, and the lavas of the 1959-summit eruption of Kilauea Volcano, Hawaii, USA, yields results that are consistent with field observations.
Zhu, Xudong; Arman, Bessembayev; Chu, Ju; Wang, Yonghong; Zhuang, Yingping
2017-05-01
To develop an efficient, cost-effective screening process to improve production of glucoamylase in Aspergillus niger. The cultivation of A. niger was achieved with well-dispersed morphology in 48-deep-well microtiter plates, which increased the throughput of the samples compared to traditional flask cultivation. There was a close negative correlation between glucoamylase activity and the pH of the fermentation broth. A novel high-throughput analysis method using Methyl Orange was developed. When compared to the conventional analysis method using 4-nitrophenyl α-D-glucopyranoside as substrate, a correlation coefficient of 0.96 was obtained by statistical analysis. Using this novel screening method, we acquired a strain with an activity of 2.2 × 10³ U ml⁻¹, a 70% higher glucoamylase yield than its parent strain.
Analysis and Assistant Planning System of Regional Agricultural Economic Information
NASA Astrophysics Data System (ADS)
Han, Jie; Zhang, Junfeng
For the common problems existing in regional development and planning, we design a decision support system for assisting regional agricultural development and planning, as a decision-making tool for local government and decision makers. The analysis methods of forecasting, comparative advantage, linear programming and statistical analysis are adopted. According to comparative advantage theory, the regional advantage can be determined by calculating and comparing the yield advantage index (YAI), scale advantage index (SAI) and complicated advantage index (CAI). Combining with GIS, agricultural data are presented in the form of graphs such as area, bar and pie charts to uncover principles and trends for decision-making that cannot be found in data tables. This system provides assistant decisions for agricultural structure adjustment and agro-forestry development and planning, and can be integrated with information technologies such as RS, AI and so on.
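The advantage indices can be sketched as simple ratios of regional to national quantities. YAI and SAI below follow the usual yield-share and area-share definitions; taking the combined index as the geometric mean of the two is an assumed definition for illustration, since the abstract does not define CAI.

```python
import numpy as np

def advantage_indices(region_yield, nation_yield,
                      region_area_share, nation_area_share):
    """Comparative-advantage indices as ratios of regional to national
    quantities; the combined index is taken as the geometric mean of
    YAI and SAI (an assumed definition).
    """
    yai = np.asarray(region_yield, float) / nation_yield
    sai = np.asarray(region_area_share, float) / nation_area_share
    cai = np.sqrt(yai * sai)
    return yai, sai, cai
```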
Woldegebriel, Michael; Vivó-Truyols, Gabriel
2016-10-04
A novel method for compound identification in liquid chromatography-high resolution mass spectrometry (LC-HRMS) is proposed. The method, based on Bayesian statistics, accommodates all possible uncertainties involved, from instrumentation up to data analysis, into a single model yielding the probability of the compound of interest being present/absent in the sample. This approach differs from the classical methods in two ways. First, it is probabilistic (instead of deterministic); hence, it computes the probability that the compound is (or is not) present in a sample. Second, it addresses the hypothesis "the compound is present", as opposed to answering the question "the compound feature is present". This second difference implies a shift in the way data analysis is tackled, since the probability of interfering compounds (i.e., isomers and isobaric compounds) is also taken into account.
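At its core, the presence/absence decision is a Bayes-rule computation; the sketch below combines the likelihoods of the observed data under the two hypotheses with a prior probability of presence. The likelihood values would come from the paper's instrument-to-data-analysis model and are simply inputs here.

```python
def compound_posterior(likelihood_present, likelihood_absent, prior=0.5):
    """Posterior probability that the compound is present, by Bayes'
    rule, given the likelihood of the observed data under each
    hypothesis and a prior probability of presence.
    """
    numerator = likelihood_present * prior
    denominator = numerator + likelihood_absent * (1.0 - prior)
    return numerator / denominator
```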
Evaluation of the Williams-type spring wheat model in North Dakota and Minnesota
NASA Technical Reports Server (NTRS)
Leduc, S. (Principal Investigator)
1982-01-01
The Williams-type model, developed similarly to previous models of C.V.D. Williams, uses monthly temperature and precipitation data as well as soil and topological variables to predict the yield of the spring wheat crop. The models are developed statistically using the regression technique. Eight model characteristics are examined in the evaluation of the model. Evaluation is at the crop reporting district level, the state level, and for the entire region. A ten-year bootstrap test was the basis of the statistical evaluation. The accuracy of the model and its current indication of yield reliability could be improved. There is great variability in the bias measured over the districts, but a slight overall positive bias. The model estimates for the east central crop reporting district in Minnesota are not accurate, and the estimates of yield for 1974 were inaccurate for all of the models.
NASA Astrophysics Data System (ADS)
Abdelaziz, Chebboubi; Grégoire, Kessedjian; Olivier, Serot; Sylvain, Julien-Laferriere; Christophe, Sage; Florence, Martin; Olivier, Méplan; David, Bernard; Olivier, Litaize; Aurélien, Blanc; Herbert, Faust; Paolo, Mutti; Ulli, Köster; Alain, Letourneau; Thomas, Materna; Michal, Rapala
2017-09-01
The study of fission yields has a major impact on the characterization and understanding of the fission process and is mandatory for reactor applications. In the past, priority at the LOHENGRIN spectrometer of the ILL has been given to studies in the light fission-fragment mass range. The LPSC, in collaboration with the ILL and CEA, has developed a measurement program on symmetric and heavy-mass fission fragment distributions. The combination of measurements with an ionisation chamber and Ge detectors is necessary to describe precisely the heavy fission-fragment region in mass and charge. Recently, new measurements of fission yields and kinetic energy distributions have been made on the 233U(nth,f) reaction. The focus of this work has been on the new optical and statistical methodology and on the self-normalization of the data, which provides new absolute measurements, independent of any libraries, together with the associated experimental covariance matrix.
Linking crop yield anomalies to large-scale atmospheric circulation in Europe.
Ceglar, Andrej; Turco, Marco; Toreti, Andrea; Doblas-Reyes, Francisco J
2017-06-15
Understanding the effects of climate variability and extremes on crop growth and development represents a necessary step to assess the resilience of agricultural systems to changing climate conditions. This study investigates the links between the large-scale atmospheric circulation and crop yields in Europe, providing the basis to develop seasonal crop yield forecasting and thus enabling a more effective and dynamic adaptation to climate variability and change. Four dominant modes of large-scale atmospheric variability have been used: North Atlantic Oscillation, Eastern Atlantic, Scandinavian and Eastern Atlantic-Western Russia patterns. Large-scale atmospheric circulation explains on average 43% of inter-annual winter wheat yield variability, ranging between 20% and 70% across countries. As for grain maize, the average explained variability is 38%, ranging between 20% and 58%. Spatially, the skill of the developed statistical models strongly depends on the large-scale atmospheric variability impact on weather at the regional level, especially during the most sensitive growth stages of flowering and grain filling. Our results also suggest that preceding atmospheric conditions might provide an important source of predictability especially for maize yields in south-eastern Europe. Since the seasonal predictability of large-scale atmospheric patterns is generally higher than the one of surface weather variables (e.g. precipitation) in Europe, seasonal crop yield prediction could benefit from the integration of derived statistical models exploiting the dynamical seasonal forecast of large-scale atmospheric circulation.
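A statistical model of the kind described, relating yield anomalies to circulation indices, can be sketched as an ordinary least-squares fit that returns the explained-variance fraction; the index set (e.g. NAO, EA, SCA, EA-WR) and the data are placeholders, not the study's regressions.

```python
import numpy as np

def circulation_yield_model(indices, yield_anomalies):
    """Least-squares fit of yield anomalies on seasonal circulation
    indices, returning coefficients and the explained-variance fraction.
    """
    y = np.asarray(yield_anomalies, float)
    X = np.column_stack([np.ones(y.size), np.asarray(indices, float)])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    residuals = y - X @ coef
    r_squared = 1.0 - residuals.var() / y.var()
    return coef, r_squared
```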
Lennon, Jay T
2011-06-01
A recent analysis revealed that most environmental microbiologists neglect replication in their science (Prosser, 2010). Of all peer-reviewed papers published during 2009 in the field's leading journals, slightly more than 70% lacked replication when it came to analyzing microbial community data. The paucity of replication is viewed as an 'endemic' and 'embarrassing' problem that amounts to 'bad science', or worse yet, as the title suggests, lying (Prosser, 2010). Although replication is an important component of experimental design, it is possible to do good science without replication. There are various quantitative techniques - some old, some new - that, when used properly, will allow environmental microbiologists to make strong statistical conclusions from experimental and comparative data. Here, I provide examples where unreplicated data can be used to test hypotheses and yield novel information in a statistically robust manner. © 2011 Society for Applied Microbiology and Blackwell Publishing Ltd.
A variation of the Davis-Smith method for in-flight determination of spacecraft magnetic fields.
NASA Technical Reports Server (NTRS)
Belcher, J. W.
1973-01-01
A variation of a procedure developed by Davis and Smith (1968) is presented for the in-flight determination of spacecraft magnetic fields. Both methods take statistical advantage of the observation that fluctuations in the interplanetary magnetic field over short periods of time are primarily changes in direction rather than in magnitude. During typical solar wind conditions between 0.8 and 1.0 AU, a statistical analysis of 2-3 days of continuous interplanetary field measurements yields an estimate of a constant spacecraft field with an uncertainty of ±0.25 gamma in the direction radial to the sun and ±15 gammas in the directions transverse to the radial. The method is also of use in estimating variable spacecraft fields with gradients of the order of 0.1 gamma/day and less, and in other special circumstances.
Britz, Juliane; Pitts, Michael A
2011-11-01
We used an intermittent stimulus presentation to investigate event-related potential (ERP) components associated with perceptual reversals during binocular rivalry. The combination of spatiotemporal ERP analysis with source imaging and statistical parametric mapping of the concomitant source differences yielded differences in three time windows: reversals showed increased activity in early visual (∼120 ms) and in inferior frontal and anterior temporal areas (∼400-600 ms) and decreased activity in the ventral stream (∼250-350 ms). The combination of source imaging and statistical parametric mapping suggests that these differences were due to differences in generator strength and not generator configuration, unlike the initiation of reversals in right inferior parietal areas. These results are discussed within the context of the extensive network of brain areas that has been implicated in the initiation, implementation, and appraisal of bistable perceptual reversals. Copyright © 2011 Society for Psychophysiological Research.
Identifying trends in sediment discharge from alterations in upstream land use
Parker, R.S.; Osterkamp, W.R.
1995-01-01
Environmental monitoring is a primary reason for collecting sediment data. One emphasis of this monitoring is the identification of trends in suspended sediment discharge. A stochastic equation was used to generate time series of annual suspended sediment discharges using statistics from gaging stations with drainage areas between 1606 and 1 805 230 km². Annual sediment discharge was increased linearly to yield a given increase at the end of a fixed period, and trend statistics were computed for each simulation series using Kendall's tau (at the 0.05 significance level). A parameter was calculated from two factors that control trend detection time: (a) the magnitude of change in sediment discharge, and (b) the natural variability of sediment discharge. In this analysis, the time to detect a trend at most stations is well over 100 years for a 20% increase in sediment discharge. Further research is needed to assess the sensitivity of detecting trends at sediment stations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bochicchio, Davide; Panizon, Emanuele; Ferrando, Riccardo
2015-10-14
We compare the performance of two well-established computational algorithms for the calculation of free-energy landscapes of biomolecular systems, umbrella sampling and metadynamics. We look at benchmark systems composed of polyethylene and polypropylene oligomers interacting with lipid (phosphatidylcholine) membranes, aiming at the calculation of the oligomer water-membrane free energy of transfer. We model our test systems at two different levels of description, united-atom and coarse-grained. We provide optimized parameters for the two methods at both resolutions. We devote special attention to the analysis of statistical errors in the two different methods and propose a general procedure for the error estimation in metadynamics simulations. Metadynamics and umbrella sampling yield the same estimates for the water-membrane free energy profile, but metadynamics can be more efficient, providing lower statistical uncertainties within the same simulation time.
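One generic ingredient of such error analyses is block averaging over correlated simulation output; the sketch below estimates the standard error of a mean that way. It is a stand-in illustration, not the specific error-estimation procedure the authors propose for metadynamics.

```python
import numpy as np

def block_average_error(samples, n_blocks=10):
    """Block-averaging estimate of the standard error of the mean of
    correlated simulation output.
    """
    blocks = np.array_split(np.asarray(samples, float), n_blocks)
    block_means = np.array([block.mean() for block in blocks])
    return block_means.std(ddof=1) / np.sqrt(n_blocks)
```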
NASA Astrophysics Data System (ADS)
Szücs, T.; Kiss, G. G.; Gyürky, Gy.; Halász, Z.; Fülöp, Zs.; Rauscher, T.
2018-01-01
The stellar reaction rates of radiative α-capture reactions on heavy isotopes are of crucial importance for γ process network calculations. These rates are usually derived from statistical model calculations, which need to be validated, but the experimental database is very scarce. This paper presents the results of α-induced reaction cross section measurements on iridium isotopes carried out, for the first time, close to the astrophysically relevant energy region. Thick-target yields of the 191Ir(α,γ)195Au, 191Ir(α,n)194Au, 193Ir(α,n)196mAu, and 193Ir(α,n)196Au reactions have been measured with the activation technique between Eα = 13.4 MeV and 17 MeV. For the first time, the thick-target yield was determined with X-ray counting, which led to unprecedented sensitivity. From the measured thick-target yields, reaction cross sections are derived and compared with statistical model calculations. The recently suggested energy-dependent modification of the α + nucleus optical potential gives a good description of the experimental data.
Statistical modeling of SRAM yield performance and circuit variability
NASA Astrophysics Data System (ADS)
Cheng, Qi; Chen, Yijian
2015-03-01
In this paper, we develop statistical models to investigate SRAM yield performance and circuit variability in the presence of a self-aligned multiple patterning (SAMP) process. It is assumed that SRAM fins are fabricated by a positive-tone (spacer-is-line) self-aligned sextuple patterning (SASP) process which accommodates two types of spacers, while gates are fabricated by a more pitch-relaxed self-aligned quadruple patterning (SAQP) process which only allows one type of spacer. A number of possible inverter and SRAM structures are identified and the related circuit multi-modality is studied using the developed failure-probability and yield models. It is shown that SRAM circuit yield is significantly impacted by the multi-modality of fins' spatial variations in a SRAM cell. The sensitivity of 6-transistor SRAM read/write failure probability to SASP process variations is calculated, and the specific circuit type with the highest probability of failing in the reading/writing operation is identified. Our study suggests that the 6-transistor SRAM configuration may not be scalable to 7-nm half pitch and that more robust SRAM circuit designs need to be researched.
Field warming experiments shed light on the wheat yield response to temperature in China
Zhao, Chuang; Piao, Shilong; Huang, Yao; Wang, Xuhui; Ciais, Philippe; Huang, Mengtian; Zeng, Zhenzhong; Peng, Shushi
2016-01-01
Wheat growth is sensitive to temperature, but the effect of future warming on yield is uncertain. Here, focusing on China, we compiled 46 observations of the sensitivity of wheat yield to temperature change (S_Y,T, yield change per °C) from field warming experiments and 102 S_Y,T estimates from local process-based and statistical models. The average S_Y,T from field warming experiments, local process-based models, and statistical models is −0.7±7.8 (±s.d.)% per °C, −5.7±6.5% per °C, and 0.4±4.4% per °C, respectively. Moreover, S_Y,T differs across regions: warming experiments indicate positive S_Y,T values in regions where growing-season mean temperature is low and water supply is not limiting, and negative values elsewhere. Gridded crop model simulations from the Inter-Sectoral Impact Model Intercomparison Project appear to capture the spatial pattern of S_Y,T deduced from warming observations. These results from local manipulative experiments could be used to improve crop models in the future. PMID:27853151
Relative mass distributions of neutron-rich thermally fissile nuclei within a statistical model
NASA Astrophysics Data System (ADS)
Kumar, Bharat; Kannan, M. T. Senthil; Balasubramaniam, M.; Agrawal, B. K.; Patra, S. K.
2017-09-01
We study the binary mass distribution for the recently predicted thermally fissile neutron-rich uranium and thorium nuclei using a statistical model. The level density parameters needed for the study are evaluated from the excitation energies of the temperature-dependent relativistic mean field formalism. The excitation energy and the level density parameter for a given temperature are employed in the convolution integral method to obtain the probability of the particular fragmentation. As representative cases, we present the results for the binary yields of 250U and 254Th. The relative yields are presented for three different temperatures: T =1 , 2, and 3 MeV.
Genome-wide network-based pathway analysis of CSF t-tau/Aβ1-42 ratio in the ADNI cohort.
Cong, Wang; Meng, Xianglian; Li, Jin; Zhang, Qiushi; Chen, Feng; Liu, Wenjie; Wang, Ying; Cheng, Sipu; Yao, Xiaohui; Yan, Jingwen; Kim, Sungeun; Saykin, Andrew J; Liang, Hong; Shen, Li
2017-05-30
The cerebrospinal fluid (CSF) levels of total tau (t-tau) and Aβ1-42 are potential early diagnostic markers for probable Alzheimer's disease (AD). The influence of genetic variation on these CSF biomarkers has been investigated in candidate or genome-wide association studies (GWAS). However, the investigation of statistically modest associations in GWAS in the context of biological networks is still an under-explored topic in AD studies. The main objective of this study is to gain further biological insights via the integration of statistical gene associations in AD with physical protein interaction networks. The CSF and genotyping data of 843 study subjects (199 CN, 85 SMC, 239 EMCI, 207 LMCI, 113 AD) from the Alzheimer's Disease Neuroimaging Initiative (ADNI) were analyzed. PLINK was used to perform GWAS on the t-tau/Aβ1-42 ratio using quality-controlled genotype data, including 563,980 single nucleotide polymorphisms (SNPs), with age, sex and diagnosis as covariates. Gene-level p-values were obtained by VEGAS2. Genes with p-value ≤ 0.05 were mapped onto a protein-protein interaction (PPI) network (9,617 nodes, 39,240 edges, from the HPRD Database). We integrated a consensus model strategy into the iPINBPA network analysis framework, and named it CM-iPINBPA. Four consensus modules (CMs) were discovered by CM-iPINBPA and functionally annotated using the pathway analysis tool Enrichr. The intersection of the four CMs forms a common subnetwork of 29 genes, including those related to tau phosphorylation (GSK3B, SUMO1, AKAP5, CALM1 and DLG4), amyloid beta production (CASP8, PIK3R1, PPA1, PARP1, CSNK2A1, NGFR, and RHOA), and AD (BCL3, CFLAR, SMAD1, and HIF1A). This study coupled a consensus module (CM) strategy with the iPINBPA network analysis framework, and applied it to the GWAS of the CSF t-tau/Aβ1-42 ratio in an AD study. The genome-wide network analysis yielded four enriched CMs that share not only genes related to tau phosphorylation or amyloid beta production but also multiple genes enriching several KEGG pathways such as Alzheimer's disease, colorectal cancer, gliomas, renal cell carcinoma, Huntington's disease, and others. This study demonstrated that integration of gene-level associations with CMs could yield statistically significant findings to offer valuable biological insights (e.g., functional interaction among the protein products of these genes) and suggest high-confidence candidates for subsequent analyses.
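A hedged sketch of the network-mapping step using networkx: genes passing a p-value cutoff are projected onto a small toy PPI graph and connected modules are extracted. The edges and p-values are placeholders, not HPRD data, and this is not the CM-iPINBPA algorithm itself.

```python
import networkx as nx

# toy PPI graph; gene names reused from the abstract, edges invented
ppi = nx.Graph([("GSK3B", "SUMO1"), ("SUMO1", "CALM1"), ("CASP8", "CFLAR"),
                ("PIK3R1", "RHOA"), ("RHOA", "CASP8"), ("AKAP5", "CALM1")])
gene_p = {"GSK3B": 0.01, "SUMO1": 0.04, "CALM1": 0.03, "CASP8": 0.02,
          "CFLAR": 0.20, "PIK3R1": 0.04, "RHOA": 0.01, "AKAP5": 0.30}

# keep genes passing the cutoff, then read off connected modules
significant = [g for g, p in gene_p.items() if p <= 0.05]
sub = ppi.subgraph(significant)
modules = [sorted(c) for c in nx.connected_components(sub)]
print(modules)
```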
Libiger, Ondrej; Schork, Nicholas J.
2015-01-01
It is now feasible to examine the composition and diversity of microbial communities (i.e., “microbiomes”) that populate different human organs and orifices using DNA sequencing and related technologies. To explore the potential links between changes in microbial communities and various diseases in the human body, it is essential to test associations involving different species within and across microbiomes, environmental settings and disease states. Although a number of statistical techniques exist for carrying out relevant analyses, it is unclear which of these techniques exhibit the greatest statistical power to detect associations given the complexity of most microbiome datasets. We compared the statistical power of principal component regression, partial least squares regression, regularized regression, distance-based regression, Hill's diversity measures, and a modified test implemented in the widely used microbiome analysis methodology “Metastats” across a wide range of simulated scenarios involving changes in feature abundance between two sets of metagenomic samples. For this purpose, simulation studies were used to change the abundance of microbial species in a real dataset from a published study examining human hands. Each technique was applied to the same data, and its ability to detect the simulated change in abundance was assessed. We hypothesized that a small subset of methods would outperform the rest in terms of the statistical power. Indeed, we found that the Metastats technique modified to accommodate multivariate analysis and partial least squares regression yielded high power under the models and data sets we studied. The statistical power of diversity measure-based tests, distance-based regression and regularized regression was significantly lower. Our results provide insight into powerful analysis strategies that utilize information on species counts from large microbiome data sets exhibiting skewed frequency distributions obtained on a small to moderate number of samples. PMID:26734061
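The general strategy of estimating power by simulation can be sketched as follows; a plain t-test stands in for the compared methods (Metastats, PLS regression, etc.), and the negative-binomial count model is an assumption chosen to mimic skewed species abundances.

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(14)

def power(effect, n=30, n_sim=1000, alpha=0.05):
    """Fraction of simulations in which a spiked abundance change is detected."""
    hits = 0
    for _ in range(n_sim):
        a = rng.negative_binomial(2, 0.1, n)                   # skewed counts
        b = rng.negative_binomial(2, 0.1, n) + rng.poisson(effect, n)
        if ttest_ind(a, b).pvalue < alpha:
            hits += 1
    return hits / n_sim

for effect in (0, 2, 5, 10):        # effect 0 gives the type I error rate
    print(effect, power(effect))
```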
NASA Astrophysics Data System (ADS)
Bierstedt, Svenja E.; Hünicke, Birgit; Zorita, Eduardo; Ludwig, Juliane
2017-07-01
We statistically analyse the relationship between the structure of migrating dunes in the southern Baltic and the driving wind conditions over the past 26 years, with the long-term aim of using migrating dunes as a proxy for past wind conditions at an interannual resolution. The present analysis is based on the dune record derived from geo-radar measurements by Ludwig et al. (2017). The dune system is located at the Baltic Sea coast of Poland and is migrating from west to east along the coast. The dunes present layers of different thicknesses that can be assigned absolute dates at interannual timescales and related to seasonal wind conditions. To statistically analyse this record and calibrate it as a wind proxy, we used a gridded regional meteorological reanalysis data set (coastDat2) covering recent decades. The identified link between the dune annual layers and wind conditions was additionally supported by the co-variability between dune layers and observed sea level variations in the southern Baltic Sea. We include precipitation and temperature in our analysis, in addition to wind, to learn more about the dependency between these three atmospheric factors and their common influence on the dune system. We set up a statistical linear model based on the correlation between the frequency of days with specific wind conditions in a given season and dune migration velocities derived for that season. To some extent, the dune records can be seen as analogous to tree-ring width records, and hence we use a proxy validation method usually applied in dendrochronology, cross-validation with the leave-one-out method, when the observational record is short. The correlations between the wind record from the reanalysis and the wind record derived from the dune structure are in the range of 0.28 to 0.63, yielding statistical validation skill similar to that of dendroclimatological records.
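A minimal sketch of leave-one-out cross-validation of a linear wind-proxy calibration, the validation scheme named above; both series are synthetic stand-ins for the reanalysis wind frequencies and the dune-derived migration record.

```python
import numpy as np

rng = np.random.default_rng(13)
n = 26                                             # ~26 annual layers
wind_days = rng.normal(40, 8, n)                   # days with strong westerlies
migration = 0.5 * wind_days + rng.normal(0, 5, n)  # dune migration proxy

preds = np.empty(n)
for i in range(n):
    keep = np.arange(n) != i                       # leave year i out
    b1, b0 = np.polyfit(wind_days[keep], migration[keep], 1)
    preds[i] = b1 * wind_days[i] + b0              # predict the held-out year
print("LOO r:", round(np.corrcoef(preds, migration)[0, 1], 2))
```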
Jiang, Honghua; Ni, Xiao; Huster, William; Heilmann, Cory
2015-01-01
Hypoglycemia has long been recognized as a major barrier to achieving normoglycemia with intensive diabetic therapies. It is a common safety concern for diabetes patients. Therefore, it is important to apply appropriate statistical methods when analyzing hypoglycemia data. Here, we carried out bootstrap simulations to investigate the performance of four commonly used statistical models (Poisson, negative binomial, analysis of covariance [ANCOVA], and rank ANCOVA) based on data from a diabetes clinical trial. A zero-inflated Poisson (ZIP) model and a zero-inflated negative binomial (ZINB) model were also evaluated. Simulation results showed that the Poisson model inflated type I error, while the negative binomial model was overly conservative. However, after adjusting for dispersion, both Poisson and negative binomial models yielded slightly inflated type I errors, which were close to the nominal level, and reasonable power. The ANCOVA model provided reasonable control of type I error. The rank ANCOVA model was associated with the greatest power and with reasonable control of type I error. Inflated type I error was observed with the ZIP and ZINB models.
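The type I error comparison can be sketched along these lines with statsmodels: two identically distributed, overdispersed count arms are repeatedly generated, and the rejection rate of each model is recorded. The data-generating parameters are illustrative assumptions, not the trial data.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n_per_arm, n_sim, alpha = 100, 500, 0.05
rejections = {"poisson": 0, "negbin": 0}

for _ in range(n_sim):
    # two arms with identical distributions -> any rejection is a type I error
    group = np.repeat([0, 1], n_per_arm)
    lam = rng.gamma(shape=0.5, scale=4.0, size=2 * n_per_arm)  # overdispersion
    y = rng.poisson(lam)
    X = sm.add_constant(group.astype(float))
    for name, family in [("poisson", sm.families.Poisson()),
                         ("negbin", sm.families.NegativeBinomial(alpha=1.0))]:
        res = sm.GLM(y, X, family=family).fit()
        if res.pvalues[1] < alpha:
            rejections[name] += 1

print({k: v / n_sim for k, v in rejections.items()})
```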
Statistical Evaluations of Variations in Dairy Cows’ Milk Yields as a Precursor of Earthquakes
Yamauchi, Hiroyuki; Hayakawa, Masashi; Asano, Tomokazu; Ohtani, Nobuyo; Ohta, Mitsuaki
2017-01-01
Simple Summary There are many reports of abnormal changes occurring in various natural systems prior to earthquakes. Unusual animal behavior is one of these abnormalities; however, there are few objective indicators and, to date, their reliability has remained uncertain. We found that milk yields of dairy cows decreased prior to an earthquake in our previous case study. In this study, we examined the reliability of decreases in milk yields as a precursor for earthquakes using long-term observation data. The results showed that milk yields decreased approximately three weeks before earthquakes. We conclude that dairy cow milk yields have applicability as an objectively observable unusual animal behavior prior to earthquakes, and that dairy cows respond to some physical or chemical precursors of earthquakes. Abstract Previous studies have provided quantitative data regarding unusual animal behavior prior to earthquakes; however, few studies include long-term, observational data. Our previous study revealed that the milk yields of dairy cows decreased prior to an extremely large earthquake. To clarify whether milk yields decrease prior to earthquakes, we examined the relationship between earthquakes of various magnitudes and daily milk yields. The observation period was one year. Cross-correlation analyses revealed a significant negative correlation between earthquake occurrence and milk yields approximately three weeks beforehand. Approximately a week and a half beforehand, a positive correlation was revealed, and the correlation gradually receded to zero as the day of the earthquake approached. Future studies that use data from a longer observation period are needed because this study only considered ten earthquakes and therefore does not have strong statistical power. Additionally, we compared the milk yields with the subionospheric very low frequency/low frequency (VLF/LF) propagation data indicating ionospheric perturbations. The results showed that anomalies of the VLF/LF propagation data emerged prior to all of the earthquakes following decreases in milk yields; the milk yields decreased earlier than the propagation anomalies appeared. We discuss ultralow-frequency magnetic fields as a stimulus that could reduce milk yields. This study suggests that dairy cow milk yields decrease prior to earthquakes, and that they might respond to stimuli emerging earlier than ionospheric perturbations. PMID:28282889
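A toy version of the lagged cross-correlation analysis: a synthetic daily milk-yield series with a dip hand-coded 21 days before each of ten synthetic earthquakes, scanned across lags. The series and effect size are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)
days = 365
quake = np.zeros(days)
quake[rng.choice(days - 30, size=10, replace=False) + 30] = 1.0  # 10 events
milk = 30 + rng.normal(0, 1, days)
for d in np.flatnonzero(quake):
    milk[d - 21] -= 3.0            # imposed yield dip ~3 weeks before each event

def lagged_corr(x, y, lag):
    """Correlation of x(t) with y(t + lag)."""
    if lag > 0:
        return np.corrcoef(x[:-lag], y[lag:])[0, 1]
    return np.corrcoef(x, y)[0, 1]

for lag in (7, 14, 21, 28):        # the dip shows up as a trough near lag 21
    print(lag, round(lagged_corr(milk, quake, lag), 3))
```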
Pfeifle, Mark; Ma, Yong-Tao; Jasper, Ahren W; Harding, Lawrence B; Hase, William L; Klippenstein, Stephen J
2018-05-07
Ozonolysis produces chemically activated carbonyl oxides (Criegee intermediates, CIs) that are either stabilized or decompose directly. This branching has an important impact on atmospheric chemistry. Prior theoretical studies have employed statistical models for energy partitioning to the CI arising from dissociation of the initially formed primary ozonide (POZ). Here, we used direct dynamics simulations to explore this partitioning for decomposition of c-C2H4O3, the POZ in ethylene ozonolysis. A priori estimates for the overall stabilization probability were then obtained by coupling the direct dynamics results with master equation simulations. Trajectories were initiated at the concerted cycloreversion transition state, as well as the second transition state of a stepwise dissociation pathway, both leading to a CI (H2COO) and formaldehyde (H2CO). The resulting CI energy distributions were incorporated in master equation simulations of CI decomposition to obtain channel-specific stabilized CI (sCI) yields. Master equation simulations of POZ formation and decomposition, based on new high-level electronic structure calculations, were used to predict yields for the different POZ decomposition channels. A non-negligible contribution of stepwise POZ dissociation was found, and new mechanistic aspects of this pathway were elucidated. By combining the trajectory-based channel-specific sCI yields with the channel branching fractions, an overall sCI yield of (48 ± 5)% was obtained. Non-statistical energy release was shown to measurably affect sCI formation, with statistical models predicting significantly lower overall sCI yields (∼30%). Within the range of experimental literature values (35%-54%), our trajectory-based calculations favor those clustered at the upper end of the spectrum.
Biophysical and Economic Uncertainty in the Analysis of Poverty Impacts of Climate Change
NASA Astrophysics Data System (ADS)
Hertel, T. W.; Lobell, D. B.; Verma, M.
2011-12-01
This paper seeks to understand the main sources of uncertainty in assessing the impacts of climate change on agricultural output, international trade, and poverty. We incorporate biophysical uncertainty by sampling from a distribution of global climate model predictions for temperature and precipitation for 2050. The implications of these realizations for crop yields around the globe are estimated using the recently published statistical crop yield functions provided by Lobell, Schlenker and Costa-Roberts (2011). By comparing these yields to those predicted under current climate, we obtain the likely change in crop yields owing to climate change. The economic uncertainty in our analysis relates to the response of the global economic system to these biophysical shocks. We use a modified version of the GTAP model to elicit the impact of the biophysical shocks on global patterns of production, consumption, trade and poverty. Uncertainty in these responses is reflected in the econometrically estimated parameters governing the responsiveness of international trade, consumption, production (and hence the intensive margin of supply response), and factor supplies (which govern the extensive margin of supply response). We sample from the distributions of these parameters as specified by Hertel et al. (2007) and Keeney and Hertel (2009). We find that, even though it is difficult to predict where in the world agricultural crops will be favorably affected by climate change, the responses of economic variables, including output and exports can be far more robust (Table 1). This is due to the fact that supply and demand decisions depend on relative prices, and relative prices depend on productivity changes relative to other crops in a given region, or relative to similar crops in other parts of the world. We also find that uncertainty in poverty impacts of climate change appears to be almost entirely driven by biophysical uncertainty.
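The two-source sampling idea can be caricatured in a few lines: draw biophysical yield shocks and an econometric response parameter separately, push both through a toy price response, and compare variance shares. The toy model and distributions are assumptions; this is not the GTAP model.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 10000
yield_shock = rng.normal(-0.05, 0.10, n)           # climate-driven yield change
supply_elast = rng.lognormal(np.log(0.4), 0.2, n)  # econometric parameter

price_change = -yield_shock / supply_elast         # toy partial-equilibrium response

# crude variance decomposition: freeze one source at its mean at a time
v_bio = np.var(-yield_shock / supply_elast.mean())
v_econ = np.var(-yield_shock.mean() / supply_elast)
print("biophysical share:", v_bio / (v_bio + v_econ))
print("economic share:   ", v_econ / (v_bio + v_econ))
```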
Ozone and sulfur dioxide effects on three tall fescue cultivars
DOE Office of Scientific and Technical Information (OSTI.GOV)
Flagler, R.B.; Youngner, V.B.
Although many reports have been published concerning differential susceptibility of various crops and/or cultivars to air pollutants, most have used foliar injury instead of the marketable yield as the factor that determined susceptibility for the crop. In an examination of screening in terms of marketable yield, three cultivars of tall fescue (Festuca arundinacea Schreb.), 'Alta,' 'Fawn,' and 'Kentucky 31,' were exposed to 0-0.40 ppm O3 or 0-0.50 ppm SO2 6 h/d, once a week, for 7 and 9 weeks, respectively. Experimental design was a randomized complete block with three replications. Statistical analysis was by standard analysis of variance and regression techniques. Three variables were analyzed: top dry weight (yield), tiller number, and weight per tiller. Ozone had a significant effect on all three variables. Significant linear decreases in yield and weight per tiller occurred with increasing O3 concentrations. Linear regressions of these variables on O3 concentration produced significantly different regression coefficients. The coefficient for Kentucky 31 was significantly greater than Alta or Fawn, which did not differ from each other. This indicated that Kentucky 31 was more susceptible to O3 than either of the other cultivars. Percent reductions in dry weight for the three cultivars at the highest O3 level were 35, 44, and 53%, respectively, for Fawn, Alta, and Kentucky 31. For weight per tiller, Kentucky 31 had a higher percent reduction than the other cultivars (59 vs. 46 and 44%). Tiller number was generally increased by O3, but this variable was not useful for determining differential susceptibility to the pollutant. Sulfur dioxide treatments produced no significant effects on any of the variables analyzed.
NASA Astrophysics Data System (ADS)
Karuppiah, R.; Faldi, A.; Laurenzi, I.; Usadi, A.; Venkatesh, A.
2014-12-01
An increasing number of studies are focused on assessing the environmental footprint of different products and processes, especially using life cycle assessment (LCA). This work shows how combining statistical methods and Geographic Information Systems (GIS) with environmental analyses can help improve the quality of results and their interpretation. Most environmental assessments in the literature yield single numbers that characterize the environmental impact of a process or product - typically global or country averages, often unchanging in time. In this work, we show how statistical analysis and GIS can help address these limitations. For example, we demonstrate a method to separately quantify uncertainty and variability in the results of LCA models using a power generation case study. This is important for rigorous comparisons between the impacts of different processes. Another challenge is lack of data, which can affect the rigor of LCAs. We have developed an approach to estimate the environmental impacts of incompletely characterized processes using predictive statistical models. This method is applied to estimate unreported coal power plant emissions in several world regions. There is also a general lack of spatio-temporal characterization of the results in environmental analyses. For instance, studies that focus on water usage do not put into context where and when water is withdrawn. Through the use of hydrological modeling combined with GIS, we quantify water stress on a regional and seasonal basis to understand water supply and demand risks for multiple users. Another example where it is important to consider the regional dependency of impacts is when characterizing how agricultural land occupation affects biodiversity in a region. We developed a data-driven methodology used in conjunction with GIS to determine if there is a statistically significant difference between the impacts of growing different crops on different species in various biomes of the world.
Statistically Valid Planting Trials
C. B. Briscoe
1961-01-01
More than 100 million tree seedlings are planted each year in Latin America, and at least ten times that many should be planted. Rational control and development of a program of such magnitude require establishing and interpreting carefully planned trial plantings which will yield statistically valid answers to real and important questions. Unfortunately, many...
Yang, Jun-Ho; Yoh, Jack J
2018-01-01
A novel technique is reported for separating overlapping latent fingerprints using chemometric approaches that combine laser-induced breakdown spectroscopy (LIBS) and multivariate analysis. The LIBS technique provides the capability of real-time analysis and high-frequency scanning as well as data regarding the chemical composition of overlapping latent fingerprints. These spectra offer valuable information for the classification and reconstruction of overlapping latent fingerprints by implementing appropriate statistical multivariate analysis. The current study employs principal component analysis and partial least squares methods for the classification of latent fingerprints from the LIBS spectra. This technique was successfully demonstrated through a classification study of four distinct latent fingerprints using classification methods such as soft independent modeling of class analogy (SIMCA) and partial least squares discriminant analysis (PLS-DA). The novel method yielded an accuracy of more than 85% and was proven to be sufficiently robust. Furthermore, through laser scanning analysis at a spatial interval of 125 µm, the overlapping fingerprints were reconstructed as separate two-dimensional forms.
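A minimal sketch of the chemometric pipeline named above (PCA for inspection, PLS-DA via one-hot regression for classification), using random spectra as stand-ins for LIBS data; class count and spectrum length are assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)
n_per_class, n_channels = 50, 200
X, y = [], []
for c in range(4):                       # four distinct fingerprints
    center = rng.normal(0, 1, n_channels)
    X.append(center + 0.5 * rng.normal(0, 1, (n_per_class, n_channels)))
    y += [c] * n_per_class
X, y = np.vstack(X), np.array(y)

pca = PCA(n_components=2).fit(X)         # quick look at class separability
print("variance in 2 PCs:", pca.explained_variance_ratio_.sum().round(2))

# PLS-DA: regress one-hot class labels on spectra, predict the argmax
Y = np.eye(4)[y]
Xtr, Xte, Ytr, Yte, ytr, yte = train_test_split(X, Y, y, random_state=0)
pls = PLSRegression(n_components=5).fit(Xtr, Ytr)
pred = pls.predict(Xte).argmax(axis=1)
print("accuracy:", (pred == yte).mean())
```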
Solvent refined coal (SRC) process. Annual technical progress report, January 1979-December 1979
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1980-11-01
A set of statistically designed experiments was used to study the effects of several important operating variables on coal liquefaction product yield structures. These studies used a Continuous Stirred-Tank Reactor to provide a hydrodynamically well-defined system from which kinetic data could be extracted. An analysis of the data shows that product yield structures can be adequately represented by a correlative model. It was shown that second-order effects (interaction and squared terms) are necessary to provide a good model fit of the data throughout the range studied. Three reports were issued covering the SRC-II database and yields as functions of operating variables. The results agree well with the generally-held concepts of the SRC reaction process, i.e., liquid phase hydrogenolysis of liquid coal which is time-dependent, thermally activated, catalyzed by recycle ash, and reaction rate-controlled. Four reports were issued summarizing the comprehensive SRC reactor thermal response models and reporting the results of several studies made with the models. Analytical equipment for measuring SRC off-gas composition and simulated distillation of coal liquids and appropriate procedures have been established.
Comparative analysis of protocols for DNA extraction from soybean caterpillars.
Palma, J; Valmorbida, I; da Costa, I F D; Guedes, J V C
2016-04-07
Genomic DNA extraction is crucial for molecular research, including diagnostics and genome characterization of different organisms. The aim of this study was to comparatively analyze DNA extraction protocols based on cell lysis by sarcosyl, cetyltrimethylammonium bromide, and sodium dodecyl sulfate, and to determine the most efficient method applicable to soybean caterpillars. DNA was extracted from specimens of Chrysodeixis includens and Spodoptera eridania using the aforementioned three methods. DNA quantification was performed using spectrophotometry and high molecular weight DNA ladders. The purity of the extracted DNA was determined by calculating the A260/A280 ratio. Cost and time for each DNA extraction method were estimated and analyzed statistically. The amount of DNA extracted by these three methods was sufficient for PCR amplification. The sarcosyl method yielded DNA of higher purity, because it generated a clearer pellet without viscosity, and yielded high-quality amplification products of the COI gene. The sarcosyl method showed lower cost per extraction and did not differ from the other methods with respect to preparation time. Cell lysis by sarcosyl represents the best method for DNA extraction in terms of yield, quality, and cost effectiveness.
A hierarchical spatial model for well yield in complex aquifers
NASA Astrophysics Data System (ADS)
Montgomery, J.; O'sullivan, F.
2017-12-01
Efficiently siting and managing groundwater wells requires reliable estimates of the amount of water that can be produced, or the well yield. This can be challenging to predict in highly complex, heterogeneous fractured aquifers due to the uncertainty around local hydraulic properties. Promising statistical approaches have been advanced in recent years. For instance, kriging and multivariate regression analysis have been applied to well test data with limited but encouraging levels of prediction accuracy. Additionally, some analytical solutions to diffusion in homogeneous porous media have been used to infer "effective" properties consistent with observed flow rates or drawdown. However, this is an under-specified inverse problem with substantial and irreducible uncertainty. We describe a flexible machine learning approach capable of combining diverse datasets with constraining physical and geostatistical models for improved well yield prediction accuracy and uncertainty quantification. Our approach can be implemented within a hierarchical Bayesian framework using Markov Chain Monte Carlo, which allows for additional sources of information to be incorporated in priors to further constrain and improve predictions and reduce the model order. We demonstrate the usefulness of this approach using data from over 7,000 wells in a fractured bedrock aquifer.
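A toy Metropolis sampler for a hierarchical mean model of log well yield, sketching the hierarchical Bayesian MCMC machinery described above; a real implementation would add spatial covariates and use a dedicated MCMC library.

```python
import numpy as np

rng = np.random.default_rng(11)
# synthetic log-yields from 3 sub-regions sharing a common hyper-mean
true_region_means = [1.0, 1.5, 2.2]
data = [rng.normal(m, 0.5, 40) for m in true_region_means]

def log_post(mu0, mus):
    """Unnormalized log-posterior: vague prior, hierarchy, Gaussian likelihood."""
    lp = -0.5 * (mu0 / 10.0) ** 2
    lp += sum(-0.5 * ((m - mu0) / 1.0) ** 2 for m in mus)
    lp += sum(np.sum(-0.5 * ((d - m) / 0.5) ** 2) for d, m in zip(data, mus))
    return lp

mu0, mus = 0.0, [0.0, 0.0, 0.0]
chain = []
for step in range(5000):
    prop0 = mu0 + rng.normal(0, 0.1)                 # joint random-walk proposal
    props = [m + rng.normal(0, 0.1) for m in mus]
    if np.log(rng.uniform()) < log_post(prop0, props) - log_post(mu0, mus):
        mu0, mus = prop0, props
    chain.append(mu0)
print("posterior mean of hyper-mean:", np.mean(chain[1000:]))
```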
Statistical evaluation of nutritional components impacting phycocyanin production by Synechocystis sp.
Deshmukh, Devendra V.; Puranik, Pravin R.
2012-01-01
Alkaliphilic cyanobacterial cultures were isolated from Lonar lake (MS, India). Among this set of cultures, a Synechocystis sp. was studied for phycocyanin production. A maximum yield was obtained in BG-11 medium at optimized conditions (pH 10 and 16 h light). To increase the phycocyanin yield, media optimization over eight media components was carried out using a Plackett-Burman design with 12 experimental trials. The analysis identified CaCl2·2H2O and Na2CO3 as the most influential media components at 95% significance. The optimum concentrations of these components were then estimated using a Box-Wilson central composite design (CCD) with four star points and five replicates at the center point for each of the two factors. The results indicated an interlinked influence of CaCl2·2H2O and Na2CO3 at 98% significance. The maximum yield of phycocyanin (12% of dry wt) was obtained at 0.058 g/L CaCl2·2H2O and 0.115 g/L Na2CO3. PMID:24031838
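The CCD analysis ultimately reduces to fitting a second-order response surface; the sketch below fits such a quadratic in two coded factors by least squares. Design points and responses are invented, not the study's data.

```python
import numpy as np

# coded levels for factor 1 (e.g., CaCl2·2H2O) and factor 2 (e.g., Na2CO3),
# in a CCD layout with star points and center replicates; responses are fake
x1 = np.array([-1, -1, 1, 1, -1.41, 1.41, 0, 0, 0, 0, 0, 0, 0])
x2 = np.array([-1, 1, -1, 1, 0, 0, -1.41, 1.41, 0, 0, 0, 0, 0])
y = (12 - 2 * x1**2 - 1.5 * x2**2 + 0.8 * x1 * x2
     + np.random.default_rng(2).normal(0, 0.2, 13))

# full quadratic model: b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print(dict(zip(["b0", "b1", "b2", "b11", "b22", "b12"], coef.round(3))))
```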
Liu, Shuli; Zhang, Guangming; Li, Jianzheng; Li, Xiangkun; Zhang, Jie
2016-06-01
Microbial 5-aminolevulinic acid (ALA) produced from wastewater is considered a potential renewable energy source. However, many hurdles need to be overcome, such as the regulation of key factors influencing ALA yield. Biomass and ALA production by Rhodobacter sphaeroides was optimized using response surface methodology. The culture medium was artificial volatile fatty acids wastewater. Three additives were optimized, namely succinate and glycine, which are precursors of ALA biosynthesis, and D-glucose, which is an inhibitor of ALA dehydratase. The optimal conditions were identified by analyzing the response surface plots. Statistical analysis showed that succinate at 8.56 mmol/L, glycine at 5.06 mmol/L, and D-glucose at 7.82 mmol/L were the best conditions. Under these optimal conditions, the highest biomass production of 3.55 g/L and ALA yield of 5.49 mg/g-biomass were achieved. Subsequent verification experiments at the optimal values gave a maximum biomass production of 3.41 ± 0.002 g/L and an ALA yield of 5.78 ± 0.08 mg/g-biomass.
Frojo, Gianfranco; Tadisina, Kashyap Komarraju; Pressman, Zachary; Chibnall, John T; Lin, Alexander Y; Kraemer, Bruce A
2016-12-01
The integrated plastic surgery match is a competitive process not only for applicants but also for programs vying for highly qualified candidates. Interactions between applicants and program constituents are limited to a single interview visit. The authors aimed to identify components of the interview visit that influence applicant decision making when determining a final program rank list. Thirty-six applicants who were interviewed (100% response) completed the survey. Applicants rated the importance of 20 elements of the interview visit regarding future ranking of the program on a 1 to 5 Likert scale. Data were analyzed using descriptive statistics, hierarchical cluster analysis, analysis of variance, and Pearson correlations. A literature review was performed regarding the plastic surgery integrated residency interview process. Survey questions were categorized into four groups based on mean survey responses:
1. Interactions with faculty and residents (mean response > 4)
2. Information about the program (3.5-4)
3. Ancillaries (food, amenities, stipends) (3-3.5)
4. Hospital tour, hotel (<3)
Hierarchical item cluster analysis and analysis of variance testing validated these groupings. Average summary scores were calculated for the items representing Interactions, Information, and Ancillaries. Correlation analysis between clusters yielded no significant correlations. A review of the literature yielded a paucity of data on analysis of the interview visit. The interview visit consists of a discrete hierarchy of perceived importance by applicants. The strongest independent factor in determining future program ranking is the quality of interactions between applicants and program constituents on the interview visit. This calls for further investigation and optimization of the interview visit experience.
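The item-clustering step can be sketched with scipy: hierarchical clustering of Likert items by correlation distance. The simulated responses mirror the study's 36-respondent, 20-item shape but not its data.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(9)
n_resp, n_items = 36, 20
latent = rng.normal(size=(n_resp, 4))            # four underlying themes
loadings = np.repeat(np.eye(4), 5, axis=0)       # 5 items per theme
responses = latent @ loadings.T + 0.7 * rng.normal(size=(n_resp, n_items))

dist = 1 - np.corrcoef(responses.T)              # item-by-item distance matrix
condensed = dist[np.triu_indices(n_items, k=1)]  # condensed form for linkage
Z = linkage(condensed, method="average")
print(fcluster(Z, t=4, criterion="maxclust"))    # recovered item groupings
```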
Estimating variability in grain legume yields across Europe and the Americas
NASA Astrophysics Data System (ADS)
Cernay, Charles; Ben-Ari, Tamara; Pelzer, Elise; Meynard, Jean-Marc; Makowski, David
2015-06-01
Grain legume production in Europe has recently come under scrutiny. Although legume crops are often promoted to provide environmental services, European farmers tend to turn to non-legume crops. It is assumed that high variability in legume yields explains this aversion, but so far this hypothesis has not been tested. Here, we estimate the variability of major grain legume and non-legume yields in Europe and the Americas from yield time series over 1961-2013. Results show that grain legume yields are significantly more variable than non-legume yields in Europe. These differences are smaller in the Americas. Our results are robust to the choice of statistical method. In all regions, crops with high yield variability are allocated to less than 1% of cultivated areas. Although the expansion of grain legumes in Europe may be hindered by high yield variability, some species display risk levels compatible with the development of specialized supply chains.
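One simple way to quantify yield variability from such time series is the coefficient of variation of residuals around a linear trend; the sketch below applies it to two synthetic series. Real input would come from FAOSTAT, and the CV-of-residuals choice is an assumption, not necessarily the paper's exact estimator.

```python
import numpy as np

def detrended_cv(years, yields):
    """CV of residuals around a linear trend, a simple variability measure."""
    fit = np.polyval(np.polyfit(years, yields, 1), years)
    return np.std(yields - fit, ddof=1) / np.mean(yields)

years = np.arange(1961, 2014)
rng = np.random.default_rng(4)
pea = 2.0 + 0.01 * (years - 1961) + rng.normal(0, 0.35, years.size)   # legume
wheat = 3.0 + 0.04 * (years - 1961) + rng.normal(0, 0.25, years.size)
print("pea CV:  ", round(detrended_cv(years, pea), 3))
print("wheat CV:", round(detrended_cv(years, wheat), 3))
```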
Feeney, Daniel A; Ober, Christopher P; Snyder, Laura A; Hill, Sara A; Jessen, Carl R
2013-01-01
Peritoneal, mesenteric, and omental diseases are important causes of morbidity and mortality in humans and animals, although information in the veterinary literature is limited. The purposes of this retrospective study were to determine whether objectively applied ultrasound interpretive criteria are statistically useful in differentiating among cytologically defined normal, inflammatory, and neoplastic peritoneal conditions in dogs and cats. A second goal was to determine the cytologically interpretable yield on ultrasound-guided, fine-needle sampling of peritoneal, mesenteric, or omental structures. Sonographic criteria agreed upon by the authors were retrospectively and independently applied by two radiologists to the available ultrasound images without knowledge of the cytologic diagnosis and statistically compared to the ultrasound-guided, fine-needle aspiration cytologic interpretations. A total of 72 dogs and 49 cats with abdominal peritoneal, mesenteric, or omental (peritoneal) surface or effusive disease and 17 dogs and 3 cats with no cytologic evidence of inflammation or neoplasia were included. The optimized, ultrasound criteria-based statistical model created independently for each radiologist yielded an equation-based diagnostic category placement accuracy of 63.2-69.9% across the two involved radiologists. Regional organ-associated masses or nodules as well as aggregated bowel and peritoneal thickening were more associated with peritoneal neoplasia whereas localized, severely complex fluid collections were more associated with inflammatory peritoneal disease. The cytologically interpretable yield for ultrasound-guided fine-needle sampling was 72.3% with no difference between species, making this a worthwhile clinical procedure. © 2013 Veterinary Radiology & Ultrasound.
NASA Astrophysics Data System (ADS)
Nizamuddin, Mohammad; Akhand, Kawsar; Roytman, Leonid; Kogan, Felix; Goldberg, Mitch
2015-06-01
Rice is the dominant food crop of Bangladesh, accounting for about 75 percent of agricultural land use, and Bangladesh is currently the world's fourth largest rice-producing country. Rice provides about two-thirds of the total calorie supply, about one-half of the agricultural GDP, and one-sixth of the national income in Bangladesh. Aus is one of the main rice varieties in Bangladesh. Crop production, especially of rice, the main food staple, is highly susceptible to climate change and variability; any change in climate will thus increase uncertainty regarding rice production, as climate is a major cause of year-to-year variability in rice productivity. This paper shows the application of remote sensing data for estimating Aus rice yield in Bangladesh, combining official statistics of rice yield with satellite data acquired in real time from the Advanced Very High Resolution Radiometer (AVHRR) sensor; a Principal Component Regression (PCR) method was used to construct a model. The simulated result was compared with official agricultural statistics, showing that the error of estimation of Aus rice yield was less than 10%. Remote sensing, therefore, is a valuable tool for estimating crop yields well in advance of harvest, and at a low cost.
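Principal component regression, as named above, can be sketched with scikit-learn: regress yield on the leading principal components of a satellite-derived predictor matrix. The predictors and yields below are simulated placeholders.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(6)
n_years, n_features = 25, 40               # e.g., weekly vegetation indices
X = rng.normal(size=(n_years, n_features))
yield_t = 2.0 + 0.05 * X[:, :5].sum(axis=1) + rng.normal(0, 0.05, n_years)

# PCA compresses the correlated predictors, then OLS regresses on the scores
pcr = make_pipeline(PCA(n_components=5), LinearRegression()).fit(X, yield_t)
pred = pcr.predict(X)
print("mean relative error:", np.mean(np.abs(pred - yield_t) / yield_t))
```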
Ojha, Nupur; Das, Nilanjana
2018-02-01
Polyhydroxyalkanoates (PHAs) are a group of biodegradable polymers and attractive substitutes for conventional plastics that can help avoid pollution problems. A yeast strain isolated from sugarcane juice, identified as Wickerhamomyces anomalus VIT-NN01, was used for the production of PHA. Response surface methodology (RSM) with a three-level, six-variable Box-Behnken design (BBD) was employed to optimize factors such as pH 8.0, temperature 37°C, and sugarcane molasses (35 g/L) supplemented with the co-substrate palm oil (0.5%) and corn steep liquor (2%), after a period of 96 h of incubation, for the maximum yield (19.50±0.3 g/L) of PHA. This was in close agreement with the predicted yield obtained from the RSM model (19.55±0.1 g/L). Characterization of the extracted polymer was done using FTIR, GC-MS, XRD, TGA, and AFM analyses. NMR spectroscopic analysis revealed that the biopolymer was poly(3-hydroxybutyrate-co-3-hydroxyvalerate), a copolymer of PHA. This is the first report on optimization of PHA production using a yeast strain isolated from natural sources. Copyright © 2017 Elsevier B.V. All rights reserved.
Optimization of grapevine yield by applying mathematical models to obtain quality wine products
NASA Astrophysics Data System (ADS)
Alina, Dobrei; Alin, Dobrei; Eleonora, Nistor; Teodor, Cristea; Marius, Boldea; Florin, Sala
2016-06-01
The relationship between crop load and grape yield and quality is a dynamic process, specific to wine cultivars and fresh-consumption varieties. Modeling these relations is important for the improvement of technological works. This study evaluated the interrelationship of crop load (B - buds number) and several production parameters (Y - yield; S - sugar; A - acidity; GaI - glucoacidimetric index; AP - alcoholic potential; F - flavorings; WA - wine alcohol; SR - sugar residue in the Muscat Ottonel wine cultivar, and Y - yield; S - sugar; A - acidity; GaI - glucoacidimetric index; CP - commercial production; BS - berries size in the Victoria table grape cultivar). In both varieties, correlations were identified between the independent variable (B - buds number, as a result of pruning and training practices) and the quality parameters analyzed (r = -0.699 for the B vs Y relationship; r = 0.961 for B vs S; r = -0.959 for B vs AP; r = 0.743 for Y vs S, p < 0.01, in the Muscat Ottonel cultivar; and r = -0.907 for B vs Y; r = -0.975 for B vs CP; r = -0.971 for B vs BS; r = 0.990 for CP vs BS in the Victoria cultivar). Regression analysis yielded models that describe the variation in production and quality parameters in relation to the independent variable (B - buds number), with statistically significant results.
Harden, Stephen L.; Cuffney, Thomas F.; Terziotti, Silvia; Kolb, Katharine R.
2013-01-01
Data collected between 1997 and 2008 at 48 stream sites were used to characterize relations between watershed settings and stream nutrient yields throughout central and eastern North Carolina. The focus of the investigation was to identify environmental variables in watersheds that influence nutrient export for supporting the development and prioritization of management strategies for restoring nutrient-impaired streams. Nutrient concentration data and streamflow data compiled for the 1997 to 2008 study period were used to compute stream yields of nitrate, total nitrogen (N), and total phosphorus (P) for each study site. Compiled environmental data (including variables for land cover, hydrologic soil groups, base-flow index, streams, wastewater treatment facilities, and concentrated animal feeding operations) were used to characterize the watershed settings for the study sites. Data for the environmental variables were analyzed in combination with the stream nutrient yields to explore relations based on watershed characteristics and to evaluate whether particular variables were useful indicators of watersheds having relatively higher or lower potential for exporting nutrients. Data evaluations included an examination of median annual nutrient yields based on a watershed land-use classification scheme developed as part of the study. An initial examination of the data indicated that the highest median annual nutrient yields occurred at both agricultural and urban sites, especially for urban sites having large percentages of point-source flow contributions to the streams. The results of statistical testing identified significant differences in annual nutrient yields when sites were analyzed on the basis of watershed land-use category. When statistical differences in median annual yields were noted, the results for nitrate, total N, and total P were similar in that highly urbanized watersheds (greater than 30 percent developed land use) and (or) watersheds with greater than 10 percent point-source flow contributions to streamflow had higher yields relative to undeveloped watersheds (having less than 10 and 15 percent developed and agricultural land uses, respectively) and watersheds with relatively low agricultural land use (between 15 and 30 percent). The statistical tests further indicated that the median annual yields for total P were statistically higher for watersheds with high agricultural land use (greater than 30 percent) compared to the undeveloped watersheds and watersheds with low agricultural land use. The total P yields also were higher for watersheds with low urban land use (between 10 and 30 percent developed land) compared to the undeveloped watersheds. The study data indicate that grouping and examining stream nutrient yields based on the land-use classifications used in this report can be useful for characterizing relations between watershed settings and nutrient yields in streams located throughout central and eastern North Carolina. Compiled study data also were analyzed with four regression tree models as a means of determining which watershed environmental variables or combination of variables result in basins that are likely to have high or low nutrient yields. The regression tree analyses indicated that some of the environmental variables examined in this study were useful for predicting yields of nitrate, total N, and total P. 
When the median annual nutrient yields for all 48 sites were evaluated as a group (Model 1), annual point-source flow yields had the greatest influence on nitrate and total N yields observed in streams, and annual streamflow yields had the greatest influence on yields of total P. The Model 1 results indicated that watersheds with higher annual point-source flow yields had higher annual yields of nitrate and total N, and watersheds with higher annual streamflow yields had higher annual yields of total P. When sites with high point-source flows (greater than 10 percent of total streamflow) were excluded from the regression tree analyses (Models 2–4), the percentage of forested land in the watersheds was identified as the primary environmental variable influencing stream yields for both total N and total P. Models 2, 3 and 4 did not identify any watershed environmental variables that could adequately explain the observed variability in the nitrate yields among the set of sites examined by each of these models. The results for Models 2, 3, and 4 indicated that watersheds with higher percentages of forested land had lower annual total N and total P yields compared to watersheds with lower percentages of forested land, which had higher median annual total N and total P yields. Additional environmental variables determined to further influence the stream nutrient yields included median annual percentage of point-source flow contributions to the streams, variables of land cover (percentage of forested land, agricultural land, and (or) forested land plus wetlands) in the watershed and (or) in the stream buffer, and drainage area. The regression tree models can serve as a tool for relating differences in select watershed attributes to differences in stream yields of nitrate, total N, and total P, which can provide beneficial information for improving nutrient management in streams throughout North Carolina and for reducing nutrient loads to coastal waters.
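A minimal regression-tree sketch paralleling this analysis: predict an annual total N yield from two watershed attributes and print the fitted splits. The 48 synthetic records are stand-ins for the study's site data.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor, export_text

rng = np.random.default_rng(15)
n = 48
forest_pct = rng.uniform(5, 90, n)          # percent forested land
pointsource = rng.uniform(0, 20, n)         # percent point-source flow
total_n = 20 - 0.15 * forest_pct + 0.8 * pointsource + rng.normal(0, 2, n)

X = np.column_stack([forest_pct, pointsource])
tree = DecisionTreeRegressor(max_depth=2, min_samples_leaf=5).fit(X, total_n)
print(export_text(tree, feature_names=["forest_pct", "pointsource_flow"]))
```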
Perry, Charles A.; Wolock, David M.; Artman, Joshua C.
2004-01-01
Streamflow statistics of flow duration and peak-discharge frequency were estimated for 4,771 individual locations on streams listed on the 1999 Kansas Surface Water Register. These statistics included the flow-duration values of 90, 75, 50, 25, and 10 percent, as well as the mean flow value. Peak-discharge frequency values were estimated for the 2-, 5-, 10-, 25-, 50-, and 100-year floods. Least-squares multiple regression techniques were used, along with Tobit analyses, to develop equations for estimating flow-duration values of 90, 75, 50, 25, and 10 percent and the mean flow for uncontrolled flow stream locations. The contributing-drainage areas of 149 U.S. Geological Survey streamflow-gaging stations in Kansas and parts of surrounding States that had flow uncontrolled by Federal reservoirs and used in the regression analyses ranged from 2.06 to 12,004 square miles. Logarithmic transformations of climatic and basin data were performed to yield the best linear relation for developing equations to compute flow durations and mean flow. In the regression analyses, the significant climatic and basin characteristics, in order of importance, were contributing-drainage area, mean annual precipitation, mean basin permeability, and mean basin slope. The analyses yielded a model standard error of prediction range of 0.43 logarithmic units for the 90-percent duration analysis to 0.15 logarithmic units for the 10-percent duration analysis. The model standard error of prediction was 0.14 logarithmic units for the mean flow. Regression equations used to estimate peak-discharge frequency values were obtained from a previous report, and estimates for the 2-, 5-, 10-, 25-, 50-, and 100-year floods were determined for this report. The regression equations and an interpolation procedure were used to compute flow durations, mean flow, and estimates of peak-discharge frequency for locations along uncontrolled flow streams on the 1999 Kansas Surface Water Register. Flow durations, mean flow, and peak-discharge frequency values determined at available gaging stations were used to interpolate the regression-estimated flows for the stream locations where available. Streamflow statistics for locations that had uncontrolled flow were interpolated using data from gaging stations weighted according to the drainage area and the bias between the regression-estimated and gaged flow information. On controlled reaches of Kansas streams, the streamflow statistics were interpolated between gaging stations using only gaged data weighted by drainage area.
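The regression step reduces to ordinary least squares on log-transformed variables; the sketch below recovers invented exponents for drainage area and precipitation from synthetic basins. The Tobit handling of censored low flows is omitted.

```python
import numpy as np

rng = np.random.default_rng(8)
n = 149
area = rng.uniform(2, 12000, n)            # contributing drainage area
precip = rng.uniform(15, 45, n)            # mean annual precipitation
# synthetic median flow with power-law dependence and lognormal scatter
q50 = 0.02 * area**0.95 * (precip / 30.0)**2.5 * rng.lognormal(0, 0.3, n)

X = np.column_stack([np.ones(n), np.log10(area), np.log10(precip)])
b, *_ = np.linalg.lstsq(X, np.log10(q50), rcond=None)
print("recovered exponents (area, precip):", b[1].round(2), b[2].round(2))
```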
Digestible lysine requirements of male broilers from 1 to 42 days of age reassessed.
Cemin, Henrique Scher; Vieira, Sergio Luiz; Stefanello, Catarina; Kipper, Marcos; Kindlein, Liris; Helmbrecht, Ariane
2017-01-01
Three experiments were conducted separately to estimate the digestible Lys (dig. Lys) requirements of Cobb × Cobb 500 male broilers using different statistical models. For each experiment, 1,200 chicks were housed in 48 floor pens in a completely randomized design with 6 treatments and 8 replicates. Broilers were fed diets with increasing dig. Lys levels from 1 to 12 d (Exp. 1), from 12 to 28 d (Exp. 2), and 28 to 42 d (Exp. 3). Increasing dig. Lys levels were equally spaced from 0.97 to 1.37% in Exp. 1, 0.77 to 1.17% in Exp. 2, and 0.68 to 1.07% in Exp. 3. The lowest dig. Lys diets were not supplemented with L-Lysine and all other essential AA met or exceeded recommendations. In Exp. 3, six birds per pen were randomly selected from each replication to evaluate carcass and breast yields. Digestible Lys requirements were estimated by quadratic polynomial (QP), linear broken-line (LBL), quadratic broken-line (QBL), and exponential asymptotic (EA) models. Overall, dig. Lys requirements varied among response variables and statistical models. Increasing dietary dig. Lys had a positive effect on BW, carcass and breast yields. Levels of dig. Lys that optimized performance using QP, LBL, QBL, and EA models were 1.207, 1.036, 1.113, and 1.204% for BWG and 1.190, 1.027, 1.100, and 1.172% for FCR in Exp. 1; 1.019, 0.853, 0.944, and 1.025% for BWG and 1.050, 0.879, 1.032, and 1.167% for FCR in Exp. 2; and 0.960, 0.835, 0.933, and 1.077% for BWG, 0.981, 0.857, 0.963, and 1.146% for FCR in Exp. 3. The QP, LBL, QBL, and EA also estimated dig. Lys requirements as 0.941, 0.846, 0.925, and 1.070% for breast meat yield in Exp. 3. In conclusion, Lys requirements vary greatly according to the statistical analysis utilized; therefore, the origin of requirement estimation must be taken into account in order to allow adequate comparisons between references.
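The linear broken-line (LBL) model, one of the four compared above, can be fitted with scipy as below; the dose-response points are invented, and the breakpoint returned is the estimated requirement.

```python
import numpy as np
from scipy.optimize import curve_fit

def broken_line(x, plateau, slope, breakpoint):
    """Response rises linearly until the breakpoint, then stays at the plateau."""
    return np.where(x < breakpoint, plateau - slope * (breakpoint - x), plateau)

lys = np.array([0.68, 0.76, 0.84, 0.91, 0.99, 1.07])      # dig. Lys level, %
bwg = np.array([2350, 2480, 2570, 2615, 2620, 2625])      # invented BWG, g

popt, _ = curve_fit(broken_line, lys, bwg, p0=[2620, 1000, 0.85])
print("estimated requirement (breakpoint): %.3f%%" % popt[2])
```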
Personal use of hair dyes and the risk of bladder cancer: results of a meta-analysis.
Huncharek, Michael; Kupelnick, Bruce
2005-01-01
OBJECTIVE: This study examined the methodology of observational studies that explored an association between personal use of hair dye products and the risk of bladder cancer. METHODS: Data were pooled from epidemiological studies using a general variance-based meta-analytic method that employed confidence intervals. The outcome of interest was a summary relative risk (RR) reflecting the risk of bladder cancer development associated with use of hair dye products vs. non-use. Sensitivity analyses were performed to explain any observed statistical heterogeneity and to explore the influence of specific study characteristics on the summary estimate of effect. RESULTS: Initially combining homogeneous data from six case-control studies and one cohort study yielded a non-significant RR of 1.01 (0.92, 1.11), suggesting no association between hair dye use and bladder cancer development. Sensitivity analyses examining the influence of hair dye type, color, and study design on this suspected association showed that uncontrolled confounding and design limitations contributed to a spurious non-significant summary RR. The sensitivity analyses yielded statistically significant RRs ranging from 1.22 (1.11, 1.51) to 1.50 (1.30, 1.98), indicating that personal use of hair dye products increases bladder cancer risk by 22% to 50% vs. non-use. CONCLUSION: The available epidemiological data suggest an association between personal use of hair dye products and increased risk of bladder cancer. PMID:15736329
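The variance-based pooling used in such meta-analyses can be sketched as inverse-variance weighting of log relative risks reconstructed from 95% confidence intervals; the three study rows below are placeholders, not the actual pooled studies.

```python
import numpy as np

studies = [            # (RR, lower 95% CI, upper 95% CI) - invented values
    (1.10, 0.90, 1.34),
    (0.95, 0.80, 1.13),
    (1.05, 0.88, 1.25),
]

log_rr = np.array([np.log(rr) for rr, lo, hi in studies])
# standard error recovered from the CI width on the log scale
se = np.array([(np.log(hi) - np.log(lo)) / (2 * 1.96) for _, lo, hi in studies])
w = 1.0 / se**2                                  # inverse-variance weights
pooled = np.sum(w * log_rr) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))
ci = np.exp([pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se])
print("summary RR %.2f (%.2f, %.2f)" % (np.exp(pooled), ci[0], ci[1]))
```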
Hui, Ferdinand K; Schuette, Albert J; Lieber, Michael; Spiotta, Alejandro M; Moskowitz, Shaye I; Barrow, Daniel L; Cawley, C Michael
2012-03-01
ε-Aminocaproic acid (EACA) has been used to reduce the rate of cerebral aneurysm rerupture before definitive treatment. In centers administering EACA to patients with a subarachnoid hemorrhage (SAH), patients eventually diagnosed with angiographically negative subarachnoid hemorrhage (ANSAH) may also initially receive EACA, perhaps placing them at increased risk for ischemic complications. Our objective was to evaluate the effect of short-term EACA on outcomes and secondary measures in patients with ANSAH. We conducted a retrospective study of 454 consecutive SAH patients over a 2-year period under a current protocol for EACA use. Patients were excluded if a source for the SAH was discovered, yielding a total of 83 ANSAH patients. The patients were assigned to groups that did or did not receive EACA. The primary end points of the study were ischemic complications, pulmonary emboli, vasospasm, ventriculoperitoneal shunting rates, and outcomes. Statistical analysis yielded no significant difference between the 2 arms for any of the end points: vasospasm (P = .65), deep vein thrombosis (P = .51), pulmonary embolism (P = 1.0), stroke (P = 1.0), myocardial infarction (P = 1.0), and ventriculoperitoneal shunt (P = .57). There was no statistically significant difference in outcome on the modified Rankin Scale (P = .30). Short-term (<72 hour) application of EACA does not result in an increase in adverse events in patients with ANSAH.
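With small two-arm event counts like these (83 patients total), P values of exactly 1.0 are characteristic of exact tests on contingency tables. The abstract does not name the test used, so the following Python sketch, with invented counts, only illustrates the shape of such a comparison using Fisher's exact test:

```python
# Sketch of a two-arm complication comparison (EACA vs. no EACA) using
# Fisher's exact test on event counts; all counts here are invented.
from scipy.stats import fisher_exact

# rows: EACA, no EACA; columns: event, no event (hypothetical counts)
for name, table in {
    "vasospasm": [[6, 34], [8, 35]],
    "stroke": [[1, 39], [1, 42]],
}.items():
    _, p = fisher_exact(table)
    print(f"{name}: P = {p:.2f}")
```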
On the significance of δ13C correlations in ancient sediments
NASA Astrophysics Data System (ADS)
Derry, Louis A.
2010-08-01
A graphical analysis of the correlations between δc and ɛTOC was introduced by Rothman et al. (2003) to obtain estimates of the carbon isotopic composition of inputs to the oceans and the organic carbon burial fraction. Applied to Cenozoic data, the method agrees with independent estimates, but with Neoproterozoic data the method yields results that cannot be accommodated within standard models of sedimentary carbon isotope mass balance. We explore the sensitivity of the graphical correlation method and find that the variance ratio between δc and δo is an important control on the correlation of δc and ɛ. If the variance ratio σc/σo ≥ 1, highly correlated arrays very similar to those obtained from the data are produced by independent random variables. The Neoproterozoic data show such variance patterns, and the regression parameters for the Neoproterozoic data are statistically indistinguishable from the randomized model at the 95% confidence level. Under these circumstances, the projection of the data into δc-ɛ space cannot distinguish between signal and noise, such as post-depositional alteration. There appears to be no need to invoke unusual carbon cycle dynamics to explain the Neoproterozoic δc-ɛ array. The Cenozoic data have σc/σo < 1 and the δc vs. ɛ correlation is probably geologically significant, but the analyzed sample size is too small to yield statistically significant results.
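The spurious-correlation effect is easy to reproduce. Taking ɛ ≈ δc − δo, independent draws of δc and δo correlate with ɛ at r = σc/√(σc² + σo²), so σc/σo ≥ 1 forces r ≥ 0.71 with no underlying signal. A short Python simulation with illustrative parameter values:

```python
# Sketch of the paper's sensitivity argument: when delta_c and delta_o are
# independent random variables with variance ratio sigma_c/sigma_o >= 1,
# the projection into (delta_c, epsilon) space is strongly correlated even
# though there is no signal. Parameter values are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n = 200
for ratio in (0.5, 1.0, 2.0):
    d_c = rng.normal(0.0, ratio, n)      # carbonate d13C, sd = ratio
    d_o = rng.normal(-28.0, 1.0, n)      # organic d13C, sd = 1
    eps = d_c - d_o                      # fractionation epsilon ~ d_c - d_o
    r = np.corrcoef(d_c, eps)[0, 1]
    # Expected correlation for independent inputs:
    # r = sigma_c / sqrt(sigma_c^2 + sigma_o^2)
    print(f"sigma_c/sigma_o = {ratio}: r = {r:.2f}, "
          f"expected {ratio / np.hypot(ratio, 1.0):.2f}")
```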
Building and using a statistical 3D motion atlas for analyzing myocardial contraction in MRI
NASA Astrophysics Data System (ADS)
Rougon, Nicolas F.; Petitjean, Caroline; Preteux, Francoise J.
2004-05-01
We address the issue of modeling and quantifying myocardial contraction from 4D MR sequences, and present an unsupervised approach for building and using a statistical 3D motion atlas for the normal heart. This approach relies on a state-of-the-art variational non-rigid registration (NRR) technique using generalized information measures, which allows for robust intra-subject motion estimation and inter-subject anatomical alignment. The atlas is built from a collection of jointly acquired tagged and cine MR exams in short- and long-axis views. Subject-specific nonparametric motion estimates are first obtained by incremental NRR of tagged images onto the end-diastolic (ED) frame. Individual motion data are then transformed into the coordinate system of a reference subject using subject-to-reference mappings derived by NRR of cine ED images. Finally, principal component analysis of the aligned motion data is performed for each cardiac phase, yielding a mean model and a set of eigenfields encoding kinematic variability. The latter define an organ-dedicated hierarchical motion basis which enables parametric motion measurement from arbitrary tagged MR exams. To this end, the atlas is transformed into subject coordinates by reference-to-subject NRR of ED cine frames. Atlas-based motion estimation is then achieved by parametric NRR of tagged images onto the ED frame, yielding a compact description of myocardial contraction during diastole.
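The atlas-building step (mean model plus eigenfields) is, at its core, a PCA over spatially aligned displacement fields. A minimal Python sketch with placeholder shapes and random data, assuming the fields have already been aligned by NRR:

```python
# Minimal sketch of the atlas-building step: PCA over spatially aligned
# motion fields for one cardiac phase, yielding a mean motion model and
# eigenfields. Shapes and data are placeholders.
import numpy as np

n_subjects, n_voxels = 20, 5000
# Each row: one subject's aligned 3D displacement field, flattened (x,y,z per voxel)
motions = np.random.default_rng(1).normal(size=(n_subjects, 3 * n_voxels))

mean_field = motions.mean(axis=0)
centered = motions - mean_field
# SVD of the centered data gives the principal modes (eigenfields)
u, s, vt = np.linalg.svd(centered, full_matrices=False)
eigenfields = vt                     # each row is one mode of variation
variances = s ** 2 / (n_subjects - 1)

# A new subject's motion can then be encoded compactly by its coordinates
# in this basis (parametric motion measurement):
coords = centered[0] @ eigenfields.T
print(variances[:3], coords[:3])
```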
A review of dark fermentative hydrogen production from biodegradable municipal waste fractions.
De Gioannis, G; Muntoni, A; Polettini, A; Pomi, R
2013-06-01
Hydrogen is believed to play a potentially key role in the implementation of sustainable energy production, particularly when it is produced from renewable sources and low energy-demanding processes. In the present paper, more than 80 recent publications were critically reviewed in order to harmonize and compare the available results from different studies on hydrogen production from FW and OFMSW through dark fermentation, and to derive reliable information about process yield and stability in view of building related predictive models. The review focused on the effect of factors recognized as potentially affecting process evolution (including type of substrate and co-substrate and their relative ratio, type of inoculum, food/microorganisms [F/M] ratio, applied pre-treatment, reactor configuration, temperature, and pH) on the fermentation yield and kinetics. Statistical analysis of literature data from batch experiments was also conducted, showing that the variables affecting the H2 production yield ranked in the order: type of co-substrate, type of pre-treatment, operating pH, control of initial pH, and fermentation temperature. However, due to the dispersion of data observed in some instances, the ambiguity about the presence of additional hidden variables cannot be resolved. The results of the analysis thus suggest that, for reliable predictive models of fermentative hydrogen production to be derived, a high level of consistency between data is strictly required, calling for more systematic and comprehensive studies on the subject.
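The abstract does not specify how the variables were ranked; one plausible approach for pooled categorical literature data is a one-way ANOVA per factor, ranked by F statistic. The Python sketch below is a stand-in for the review's (unspecified) analysis, with an invented dataframe:

```python
# Hypothetical sketch of one way to rank categorical factors by their
# effect on H2 yield across pooled batch studies: a one-way ANOVA per
# factor, ranked by F statistic. This is a stand-in, not the review's
# actual method; the dataframe is invented.
import pandas as pd
from scipy.stats import f_oneway

df = pd.DataFrame({
    "h2_yield":     [55, 60, 80, 95, 40, 70, 85, 65],   # mL H2/g VS, made up
    "co_substrate": ["none", "none", "sludge", "sludge",
                     "none", "sludge", "sludge", "none"],
    "pretreatment": ["heat", "acid", "heat", "heat",
                     "none", "acid", "heat", "none"],
})

ranking = []
for factor in ("co_substrate", "pretreatment"):
    groups = [g["h2_yield"].values for _, g in df.groupby(factor)]
    f_stat, p = f_oneway(*groups)
    ranking.append((factor, f_stat, p))

for factor, f_stat, p in sorted(ranking, key=lambda t: -t[1]):
    print(f"{factor}: F = {f_stat:.2f}, p = {p:.3f}")
```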
Prakash, S; Rajeswari, K; Divya, P; Ferlin, M; Rajeshwari, C T; Vanavil, B
2018-05-28
Curdlan gum is a neutral, water-insoluble bacterial exopolysaccharide composed primarily of linear β-(1,3) glycosidic linkages. Recently, there has been increasing interest in the applications of curdlan and its derivatives. Curdlan is found to inhibit tumors, and its sulfated derivative possesses anti-HIV activity. Curdlan is biodegradable, edible, and non-toxic to humans and the environment, which makes it suitable for drug-delivery vehicles for sustained drug release. The growing applications of curdlan demand an efficient, high-yield fermentation process to satisfy industrial needs. In this perspective, the present work aimed to screen and isolate an efficient curdlan gum-producing bacterium from the rhizosphere of the groundnut plant using aniline-blue agar. The highest-yielding isolate was selected based on curdlan yield and identified as Bacillus cereus using gas-chromatography fatty acid methyl ester analysis. B. cereus PR3 curdlan gum was characterized using FT-IR spectroscopy, SEM, XRD, and TGA. The fermentation time for curdlan production using B. cereus PR3 was optimized. Media constituents such as carbon, nitrogen, and mineral sources were screened using a Plackett-Burman design. Subsequent statistical analysis revealed that starch, NH4NO3, K2HPO4, Na2SO4, KH2SO4, and CaCl2 were significant media constituents, and their concentrations were optimized to enhance curdlan production up to 20.88 g/l.
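A Plackett-Burman screen of this kind can be reproduced with the classic 12-run design. The Python sketch below builds the design matrix from the standard 11-column generator row and estimates main effects; the factor names follow the abstract, while the dummy columns and yield responses are invented:

```python
# Sketch of a 12-run Plackett-Burman screening step: build the design from
# the classic 11-factor generator row, run the (here simulated)
# fermentations, and estimate main effects. Responses are invented.
import numpy as np

gen = np.array([1, 1, -1, 1, 1, 1, -1, -1, -1, 1, -1])   # classic PB12 row
design = np.array([np.roll(gen, i) for i in range(11)] + [-np.ones(11, int)])

factors = ["starch", "NH4NO3", "K2HPO4", "Na2SO4", "KH2SO4", "CaCl2",
           "d1", "d2", "d3", "d4", "d5"]                  # d* = dummy columns
yields_gl = np.random.default_rng(2).normal(15, 3, 12)    # curdlan g/l, made up

# Main effect of each factor: mean response at +1 minus mean at -1
for name, col in zip(factors, design.T):
    effect = yields_gl[col == 1].mean() - yields_gl[col == -1].mean()
    print(f"{name}: effect = {effect:+.2f} g/l")
```

Effects on the dummy columns give a noise baseline against which the real factors are judged significant.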
Post-heading heat stress and yield impact in winter wheat of China.
Liu, Bing; Liu, Leilei; Tian, Liying; Cao, Weixing; Zhu, Yan; Asseng, Senthold
2014-02-01
Wheat is sensitive to high temperatures, but the spatial and temporal variability of high temperature and its impact on yield are often not known. An analysis of historical climate and yield data was undertaken to characterize the spatial and temporal variability of heat stress between heading and maturity and its impact on wheat grain yield in China. Several heat stress indices were developed to quantify heat intensity, frequency, and duration between heading and maturity based on measured maximum temperature records of the last 50 years from 166 stations in the main wheat-growing region of China. Surprisingly, heat stress between heading and maturity was more severe in the generally cooler northern wheat-growing regions than in the generally warmer southern regions of China, because low temperatures earlier in the growing season delay heading and expose the post-heading phase to the warmer part of the year. Heat stress between heading and maturity has increased in recent decades in most of the main winter wheat production areas of China, but the rate of increase was higher in the south than in the north. Correlations between measured grain yields and both post-heading heat stress and average temperature were statistically significant in the entire wheat-producing region, and explained about 29% of the observed spatial and temporal yield variability. A heat stress index considering both the duration and the intensity of heat between heading and maturity was required to describe the correlation between heat stress and yield variability. Because heat stress is a major cause of yield loss and the number of heat events is projected to increase in the future, quantifying the future impact of heat stress on wheat production and developing appropriate adaptation and mitigation strategies are critical for developing food security policies in China and elsewhere.
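Indices of the kind described combine intensity (degrees above a threshold) and frequency (number of hot days) over the heading-to-maturity window. A minimal Python sketch; the 30 °C threshold, phenology dates, and synthetic temperature series are illustrative choices, not the study's definitions:

```python
# Sketch of a post-heading heat stress index: cumulative degree-days above
# a threshold between heading and maturity, from daily maximum temperature.
import numpy as np

def post_heading_heat_index(tmax, heading_doy, maturity_doy, threshold=30.0):
    """Sum of (Tmax - threshold) over days with Tmax > threshold,
    restricted to the heading-to-maturity window (day-of-year indices)."""
    window = tmax[heading_doy:maturity_doy + 1]
    excess = np.clip(window - threshold, 0.0, None)
    return excess.sum(), int((excess > 0).sum())   # intensity, frequency

# One synthetic year of daily Tmax (degrees C)
rng = np.random.default_rng(3)
tmax = 20 + 12 * np.sin(np.arange(365) * 2 * np.pi / 365 - 1.4) \
       + rng.normal(0, 3, 365)

hdd, n_hot_days = post_heading_heat_index(tmax, heading_doy=120, maturity_doy=160)
print(f"heat degree-days: {hdd:.1f}, hot days: {n_hot_days}")
```

Computed per station and year, such an index can then be regressed against yield anomalies to quantify the share of variability attributable to heat.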
NASA Astrophysics Data System (ADS)
Waqas, Abi; Melati, Daniele; Manfredi, Paolo; Grassi, Flavia; Melloni, Andrea
2018-02-01
The Building Block (BB) approach has recently emerged in photonics as a suitable strategy for the analysis and design of complex circuits. Each BB can be foundry related and contains a mathematical macro-model of its functionality. As is well known, statistical variations in fabrication processes can have a strong effect on functionality and ultimately affect the yield. In order to predict the statistical behavior of a circuit, a proper analysis of the effects of these uncertainties is crucial. This paper presents a method to build a novel class of Stochastic Process Design Kits for the analysis of photonic circuits. The proposed design kits directly store the information on the stochastic behavior of each building block in the form of a generalized-polynomial-chaos-based augmented macro-model obtained by properly exploiting stochastic collocation and Galerkin methods. Using this approach, we demonstrate that the augmented macro-models of the BBs can be calculated once, stored in a BB (foundry-dependent) library, and then used for the analysis of any desired circuit. The main advantage of this approach, shown here for the first time in photonics, is that the stochastic moments of an arbitrary photonic circuit can be evaluated by a single simulation only, without the need for repeated simulations. The accuracy and the significant speed-up with respect to classical Monte Carlo analysis are verified by means of a classical photonic circuit example with multiple uncertain variables.
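The core idea, stripped of the photonic macro-modeling machinery, is that a polynomial-chaos surrogate yields stochastic moments directly from its coefficients instead of from thousands of repeated runs. The Python sketch below demonstrates this on a toy scalar response of one Gaussian parameter (not a photonic building-block macro-model), using regression-based collocation on a probabilists' Hermite basis:

```python
# Minimal sketch: replace repeated Monte Carlo runs with a polynomial-chaos
# surrogate whose coefficients give the stochastic moments directly. The
# "circuit" is a toy scalar function of one Gaussian parameter.
import math
import numpy as np
from numpy.polynomial.hermite_e import hermevander

def circuit_response(xi):
    # Toy response, e.g. a transmission depending on a varying parameter
    return np.cos(0.3 + 0.5 * xi) ** 2

rng = np.random.default_rng(4)
order = 6
xi = rng.standard_normal(2000)                     # collocation samples
V = hermevander(xi, order)                         # probabilists' Hermite basis
coef, *_ = np.linalg.lstsq(V, circuit_response(xi), rcond=None)

# For standard-normal xi, E[He_n^2] = n!, so moments follow from coefficients:
fact = np.array([math.factorial(n) for n in range(order + 1)])
mean_pc = coef[0]
std_pc = np.sqrt(np.sum(coef[1:] ** 2 * fact[1:]))

mc = circuit_response(rng.standard_normal(200000)) # brute-force Monte Carlo
print(f"gPC: mean = {mean_pc:.4f}, std = {std_pc:.4f}")
print(f"MC:  mean = {mc.mean():.4f}, std = {mc.std():.4f}")
```

The surrogate needs a few thousand evaluations once, after which moments come for free; brute-force Monte Carlo re-evaluates the model for every sample, which is the speed-up the paper exploits at circuit level.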