Sample records for extremely small values

  1. Extreme event statistics in a drifting Markov chain

    NASA Astrophysics Data System (ADS)

    Kindermann, Farina; Hohmann, Michael; Lausch, Tobias; Mayer, Daniel; Schmidt, Felix; Widera, Artur

    2017-07-01

    We analyze extreme event statistics of experimentally realized Markov chains with various drifts. Our Markov chains are individual trajectories of a single atom diffusing in a one-dimensional periodic potential. Based on more than 500 individual atomic traces, we verify the applicability of the Sparre Andersen theorem to our system despite the presence of a drift. We present a detailed analysis of four different rare-event statistics for our system: the distributions of extreme values, of record values, of extreme value occurrence in the chain, and of the number of records in the chain. We observe that, for our data, the shape of the extreme event distributions is dominated by the underlying exponential distance distribution extracted from the atomic traces. Furthermore, we find that even small drifts influence the statistics of extreme events and record values, which is supported by numerical simulations, and we identify cases in which the drift can be determined without information about the underlying random variable distributions. Our results facilitate the use of extreme event statistics as a signal for small drifts in correlated trajectories.
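
    The Sparre Andersen theorem invoked above is easy to probe numerically: for any continuous, symmetric step distribution, the probability that an n-step walk stays on one side of the origin is the universal value C(2n, n)/4^n, independent of the step distribution. The sketch below is not the authors' code; the exponential step lengths are an assumption mirroring the exponential distance distribution the abstract reports.

```python
import numpy as np
from math import comb

def sparre_andersen(n):
    """Universal survival probability P(S_1 < 0, ..., S_n < 0) = C(2n, n) / 4^n."""
    return comb(2 * n, n) / 4.0 ** n

def simulated_survival(n, n_walks=200_000, seed=0):
    """Monte Carlo estimate of the survival probability for exponentially
    distributed step lengths with random sign (distribution-free by the theorem)."""
    rng = np.random.default_rng(seed)
    steps = rng.exponential(1.0, (n_walks, n)) * rng.choice([-1.0, 1.0], (n_walks, n))
    return np.mean(np.all(np.cumsum(steps, axis=1) < 0.0, axis=1))
```

    Swapping the exponential for any other continuous symmetric step distribution leaves the estimate unchanged, which is precisely the content of the theorem.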

  2. How to recover more value from small pine trees: Essential oils and resins

    Treesearch

    Vasant M. Kelkar; Brian W. Geils; Dennis R. Becker; Steven T. Overby; Daniel G. Neary

    2006-01-01

    In recent years, the young dense forests of northern Arizona have suffered extreme droughts, wildfires, and insect outbreaks. Improving forest health requires reducing forest density by cutting many small-diameter trees with the consequent production of large volumes of residual biomass. To offset the cost of handling this low-value timber, additional marketing options...

  3. Statistic analysis of annual total ozone extremes for the period 1964-1988

    NASA Technical Reports Server (NTRS)

    Krzyscin, Janusz W.

    1994-01-01

    Annual extremes of the total column amount of ozone (in the period 1964-1988) from a network of 29 Dobson stations have been examined using extreme value analysis. The extremes have been calculated as the highest deviation of daily mean total ozone from its long-term monthly mean, normalized by the monthly standard deviations. The extremes have been selected from the direct-Sun total ozone observations only. Extremes resulting from abrupt changes in ozone (day-to-day changes greater than 20 percent) have not been considered. The ordered extremes (maxima in ascending order, minima in descending order) have been fitted to one of the three forms of the Fisher-Tippett extreme value distribution by the nonlinear least squares method (Levenberg-Marquardt). We have found that the ordered extremes from a majority of Dobson stations lie close to Fisher-Tippett type III. The extreme value analysis of the composite annual extremes (combined from averages of the annual extremes selected at individual stations) has shown that the composite maxima are fitted by the Fisher-Tippett type III and the composite minima by the Fisher-Tippett type I. The difference between the Fisher-Tippett types of the composite extremes seems to be related to the ozone downward trend. Extreme value prognoses for the period 1964-2014 (derived from the data taken at all analyzed stations, the North American stations, and the European stations) have revealed that the prognostic extremes are close to the largest annual extremes in the period 1964-1988 and that there are only small regional differences in the prognoses.
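
    The Fisher-Tippett fitting step can be sketched with SciPy as a stand-in for the paper's nonlinear least-squares procedure. Note two assumptions: scipy.stats.genextreme uses maximum likelihood rather than Levenberg-Marquardt least squares, and its shape parameter c equals minus the conventional ξ, so c > 0 corresponds to the bounded-tail type III case; the parameter values below are illustrative only.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Synthetic normalized ozone extremes drawn from a type III (bounded-tail)
# GEV, which has c > 0 in SciPy's sign convention.
sample = stats.genextreme.rvs(c=0.2, loc=0.0, scale=1.0, size=5000,
                              random_state=rng)

# Fit all three Fisher-Tippett forms at once: the sign of c_hat selects the type
c_hat, loc_hat, scale_hat = stats.genextreme.fit(sample)
```

    A clearly positive fitted c_hat indicates a Fisher-Tippett type III (reversed Weibull) tail, the type the study found at the majority of Dobson stations.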

  4. Extreme events and event size fluctuations in biased random walks on networks.

    PubMed

    Kishore, Vimal; Santhanam, M S; Amritkar, R E

    2012-05-01

    Random walks on discrete lattices are important models for understanding various types of transport processes. Extreme events, defined as exceedances of the flux of walkers above a prescribed threshold, have been studied recently in the context of complex networks. This was motivated by the occurrence of rare events such as traffic jams, floods, and power blackouts, which take place on networks. In this work, we study extreme events in a generalized random walk model in which the walk is preferentially biased by the network topology. The walkers preferentially choose to hop toward the hubs or toward small-degree nodes. In this setting, we show that extremely large fluctuations in event sizes are possible on small-degree nodes when the walkers are biased toward the hubs. In particular, we obtain the distribution of event sizes on the network. Further, the probability for the occurrence of extreme events on any node in the network depends on its "generalized strength," a measure of the ability of a node to attract walkers. The generalized strength is a function of the degree of the node and that of its nearest neighbors. We obtain analytical and simulation results for the probability of occurrence of extreme events on the nodes of a network using a generalized random walk model. The results reveal that nodes with a larger value of generalized strength, on average, display a lower probability for the occurrence of extreme events than nodes with lower values of generalized strength.
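
    The degree-biased hopping rule described above, P(i→j) ∝ A_ij·k_j^α, admits a closed-form stationary occupation p_i ∝ k_i^α·Σ_j A_ij·k_j^α, which is one way to see the role of the "generalized strength". A minimal sketch on a hypothetical 5-node graph (not one of the networks used in the paper); the lazy half-step is an implementation detail added here to remove periodicity on bipartite graphs and does not change the stationary distribution:

```python
import numpy as np

def biased_walk_stationary(A, alpha, iters=5000):
    """Stationary occupation of a walk with hop rule P(i->j) ~ A[i,j] * k_j**alpha."""
    k = A.sum(axis=1)                        # node degrees
    W = A * k[None, :] ** alpha              # unnormalised hop weights
    P = W / W.sum(axis=1, keepdims=True)     # row-stochastic transition matrix
    P = 0.5 * (np.eye(len(A)) + P)           # lazy step: kills bipartite oscillation
    p = np.full(len(A), 1.0 / len(A))
    for _ in range(iters):                   # power iteration to the fixed point
        p = p @ P
    return p

# Hub (node 0) attached to nodes 1-3; node 3 also linked to leaf 4.
A = np.zeros((5, 5))
for i, j in [(0, 1), (0, 2), (0, 3), (3, 4)]:
    A[i, j] = A[j, i] = 1.0

p_hub_biased = biased_walk_stationary(A, alpha=1.0)   # walkers prefer hubs
p_anti_hub = biased_walk_stationary(A, alpha=-1.0)    # walkers prefer small degrees
```

    Comparing the two runs shows the hub attracting more walkers under the hub-seeking bias, which is the regime in which the paper finds the largest event-size fluctuations on small-degree nodes.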

  5. Rainfall extremes from TRMM data and the Metastatistical Extreme Value Distribution

    NASA Astrophysics Data System (ADS)

    Zorzetto, Enrico; Marani, Marco

    2017-04-01

    A reliable quantification of the probability of occurrence of weather extremes is essential for designing resilient water infrastructure and hazard mitigation measures. However, it is increasingly clear that the presence of inter-annual climatic fluctuations produces substantial long-term variability in the frequency of occurrence of extreme events. This circumstance calls into question the foundations of traditional extreme value theory, which hinges on stationary Poisson processes or on asymptotic assumptions to derive the Generalized Extreme Value (GEV) distribution. We illustrate here, with application to daily rainfall, a new approach to extreme value analysis, the Metastatistical Extreme Value Distribution (MEVD). The MEVD relaxes the above assumptions and is based on the whole distribution of daily rainfall events, thus allowing optimal use of all available observations. Using a global dataset of rain gauge observations, we show that the MEVD significantly outperforms the Generalized Extreme Value distribution, particularly for long average recurrence intervals and when small samples are available. The latter property suggests that the MEVD is particularly suited for application to satellite rainfall estimates, which only cover two decades, making extreme value estimation especially challenging. Here we apply the MEVD to the TRMM TMPA 3B42 product, an 18-year dataset of remotely sensed daily rainfall providing quasi-global coverage. Our analyses yield a global-scale mapping of daily rainfall extremes and of their distributional tail properties, bridging the existing large gaps in ground-based networks. Finally, we illustrate how our global-scale analysis can provide insight into how properties of local rainfall regimes affect tail estimation uncertainty under the GEV or MEVD approach. We find a dependence of the estimation uncertainty, for both the GEV- and MEV-based approaches, on the average annual number and on the inter-annual variability of rainy days. In particular, estimation uncertainty decreases 1) as the mean annual number of wet days increases, and 2) as the variability in the number of rainy days, expressed by its coefficient of variation, decreases. We tentatively explain this behavior in terms of the assumptions underlying the two approaches.
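
    The MEVD itself is compact to state: with the year-j wet-day amounts modelled by a Weibull distribution F_j and n_j wet days in year j, the distribution of the annual maximum is approximated by F(x) = (1/M)·Σ_j F_j(x)^(n_j) over the M years of record. A minimal sketch, with invented per-year parameters, not the authors' implementation:

```python
import numpy as np
from scipy import stats

def mev_cdf(x, weibull_params, n_wet):
    """Metastatistical EV cdf: average over years of F_j(x)**n_j, with F_j a
    Weibull(c_j, scale_j) distribution fitted to year j's wet-day amounts."""
    x = np.asarray(x, dtype=float)
    terms = [stats.weibull_min.cdf(x, c, scale=s) ** n
             for (c, s), n in zip(weibull_params, n_wet)]
    return np.mean(terms, axis=0)

# Hypothetical 3-year record: per-year Weibull fits and wet-day counts
params = [(0.8, 9.0), (0.7, 11.0), (0.9, 10.0)]
n_wet = [95, 110, 102]
grid = np.linspace(0.0, 300.0, 601)
F = mev_cdf(grid, params, n_wet)     # annual-maximum daily rainfall cdf (mm)
```

    Return levels follow by inverting F on the grid; because every wet day enters the per-year fits, the estimator keeps working when only a short satellite record is available.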

  6. Extreme events in total ozone over Arosa - Part 1: Application of extreme value theory

    NASA Astrophysics Data System (ADS)

    Rieder, H. E.; Staehelin, J.; Maeder, J. A.; Peter, T.; Ribatet, M.; Davison, A. C.; Stübi, R.; Weihs, P.; Holawe, F.

    2010-10-01

    In this study ideas from extreme value theory are for the first time applied in the field of stratospheric ozone research, because statistical analysis showed that previously used concepts assuming a Gaussian distribution (e.g. fixed deviations from mean values) of total ozone data do not adequately address the structure of the extremes. We show that statistical extreme value methods are appropriate to identify ozone extremes and to describe the tails of the Arosa (Switzerland) total ozone time series. In order to accommodate the seasonal cycle in total ozone, a daily moving threshold was determined and used, with tools from extreme value theory, to analyse the frequency of days with extreme low (termed ELOs) and high (termed EHOs) total ozone at Arosa. The analysis shows that the Generalized Pareto Distribution (GPD) provides an appropriate model for the frequency distribution of total ozone above or below a mathematically well-defined threshold, thus providing a statistical description of ELOs and EHOs. The results show an increase in ELOs and a decrease in EHOs during the last decades. The fitted model represents the tails of the total ozone data set with high accuracy over the entire range (including absolute monthly minima and maxima), and enables a precise computation of the frequency distribution of ozone mini-holes (using constant thresholds). Analyzing the tails instead of a small fraction of days below constant thresholds provides deeper insight into the time series properties. Fingerprints of dynamical (e.g. ENSO, NAO) and chemical features (e.g. strong polar vortex ozone loss), and major volcanic eruptions, can be identified in the observed frequency of extreme events throughout the time series. Overall the new approach to analysis of extremes provides more information on time series properties and variability than previous approaches that use only monthly averages and/or mini-holes and mini-highs.
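
    The peaks-over-threshold step behind the GPD fit can be sketched as follows. The synthetic exponential "ozone deviations" are a stand-in for the Arosa series (for exponential data the GPD shape parameter is exactly zero, so the fit should recover a value near 0); the fixed quantile threshold simplifies the study's daily moving threshold.

```python
import numpy as np
from scipy import stats

def fit_gpd_tail(x, q=0.95):
    """Fit a Generalized Pareto Distribution to exceedances over the q-quantile."""
    u = np.quantile(x, q)            # high threshold (the study used a daily
    exceed = x[x > u] - u            # moving threshold to handle seasonality)
    shape, _, scale = stats.genpareto.fit(exceed, floc=0.0)
    return shape, scale, u

rng = np.random.default_rng(7)
x = rng.exponential(10.0, size=20_000)   # stand-in daily deviations
shape, scale, u = fit_gpd_tail(x)
```

    Done on deviations above (below) the moving threshold, the fitted GPD gives the frequency model for EHOs (ELOs) described in the abstract.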

  8. Isotopic evidence for continental ice sheet in mid-latitude region in the supergreenhouse Early Cretaceous

    PubMed Central

    Yang, Wu-Bin; Niu, He-Cai; Sun, Wei-Dong; Shan, Qiang; Zheng, Yong-Fei; Li, Ning-Bo; Li, Cong-Ying; Arndt, Nicholas T.; Xu, Xing; Jiang, Yu-Hang; Yu, Xue-Yuan

    2013-01-01

    The Cretaceous represents one of the hottest greenhouse periods in the Earth's history, but some recent studies suggest that small ice caps may have been present in non-polar regions during certain intervals of the Early Cretaceous. Here we report extremely negative δ18O values of −18.12‰ to −13.19‰ for early Aptian hydrothermal zircon from an A-type granite at Baerzhe in northeastern China. Given that A-type granite is anhydrous and that magmatic zircon of the Baerzhe granite has δ18O values close to mantle values, the extremely negative δ18O values for hydrothermal zircon are attributed to the addition of meteoric water with extremely low δ18O, most likely transported by glaciers. Considering the paleoaltitude of the region, continental glaciation is suggested to have occurred in the early Aptian, indicating much larger temperature fluctuations than previously thought during the supergreenhouse Cretaceous. This may have had an impact on the evolution of major organisms in the Jehol Group during this period. PMID:24061068

  9. Climatic extremes improve predictions of spatial patterns of tree species

    USGS Publications Warehouse

    Zimmermann, N.E.; Yoccoz, N.G.; Edwards, T.C.; Meier, E.S.; Thuiller, W.; Guisan, Antoine; Schmatz, D.R.; Pearman, P.B.

    2009-01-01

    Understanding niche evolution, dynamics, and the response of species to climate change requires knowledge of the determinants of the environmental niche and species range limits. Mean values of climatic variables are often used in such analyses. In contrast, the increasing frequency of climate extremes suggests the importance of understanding their additional influence on range limits. Here, we assess how measures representing climate extremes (i.e., interannual variability in climate parameters) explain and predict spatial patterns of 11 tree species in Switzerland. We find a clear, although comparatively small, improvement (+20% in adjusted D2, +8% and +3% in cross-validated True Skill Statistic and area under the receiver operating characteristic curve values) in models that use measures of extremes in addition to means. The primary effect of including information on climate extremes is a correction of local overprediction and underprediction. Our results demonstrate that measures of climate extremes are important for understanding the climatic limits of tree species and assessing species niche characteristics. The inclusion of climate variability likely will improve models of species range limits under future conditions, where changes in mean climate and increased variability are expected.

  10. From Chebyshev to Bernstein: A Tour of Polynomials Small and Large

    ERIC Educational Resources Information Center

    Boelkins, Matthew; Miller, Jennifer; Vugteveen, Benjamin

    2006-01-01

    Consider the family of monic polynomials of degree n having zeros at -1 and +1 and all their other real zeros in between these two values. This article explores the size of these polynomials using the supremum of the absolute value on [-1, 1], showing that scaled Chebyshev and Bernstein polynomials give the extremes.
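
    The classical fact underlying the article can be checked numerically: among monic polynomials of degree n, the scaled Chebyshev polynomial T_n/2^(n-1) has the smallest supremum norm on [-1, 1], namely 2^(1-n). The sketch below compares it against a monic polynomial with equispaced roots; the comparison polynomial is purely illustrative.

```python
import numpy as np

def sup_norm_from_roots(roots, grid):
    """Sup norm on the grid of the monic polynomial with the given real roots."""
    vals = np.prod(grid[:, None] - roots[None, :], axis=1)
    return np.abs(vals).max()

n = 8
grid = np.linspace(-1.0, 1.0, 20_001)

# Chebyshev roots cos((2k-1)pi/2n): the monic polynomial with these roots
# is T_n / 2**(n-1), the minimizer of the sup norm.
cheb_roots = np.cos((2 * np.arange(1, n + 1) - 1) * np.pi / (2 * n))
equi_roots = np.linspace(-1.0, 1.0, n)    # naive competitor

cheb_norm = sup_norm_from_roots(cheb_roots, grid)
equi_norm = sup_norm_from_roots(equi_roots, grid)
```

    The Chebyshev norm lands on the theoretical minimum 2^(1-n), while the equispaced-root polynomial is markedly larger near the interval ends.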

  11. Numerical modeling of macrodispersion in heterogeneous media: a comparison of multi-Gaussian and non-multi-Gaussian models

    NASA Astrophysics Data System (ADS)

    Wen, Xian-Huan; Gómez-Hernández, J. Jaime

    1998-03-01

    The macrodispersion of an inert solute in a 2-D heterogeneous porous medium is estimated numerically in a series of fields of varying heterogeneity. Four different random function (RF) models are used to model log-transmissivity (ln T) spatial variability, and for each of these models, the ln T variance is varied from 0.1 to 2.0. The four RF models share the same univariate Gaussian histogram and the same isotropic covariance, but differ from one another in terms of the spatial connectivity patterns at extreme transmissivity values. More specifically, model A is a multivariate Gaussian model for which, by definition, extreme values (both high and low) are spatially uncorrelated. The other three models are non-multi-Gaussian: model B with high connectivity of high extreme values, model C with high connectivity of low extreme values, and model D with high connectivities of both high and low extreme values. Residence time distributions (RTDs) and macrodispersivities (longitudinal and transverse) are computed on ln T fields corresponding to the different RF models, for two different flow directions and at several scales. They are compared with each other, as well as with predicted values based on first-order analytical results. Numerically derived RTDs and macrodispersivities for the multi-Gaussian model are in good agreement with analytically derived values using first-order theories for log-transmissivity variance up to 2.0. The results from the non-multi-Gaussian models differ from each other and deviate markedly from the multi-Gaussian results even when the ln T variance is small. RTDs in non-multi-Gaussian realizations with high connectivity at high extreme values display earlier breakthrough than in multi-Gaussian realizations, whereas later breakthrough and longer tails are observed for RTDs from non-multi-Gaussian realizations with high connectivity at low extreme values.
Longitudinal macrodispersivities in the non-multi-Gaussian realizations are, in general, larger than in the multi-Gaussian ones, while transverse macrodispersivities in the non-multi-Gaussian realizations can be larger or smaller than in the multi-Gaussian ones depending on the type of connectivity at extreme values. Comparing the numerical results for different flow directions, it is confirmed that macrodispersivities in multi-Gaussian realizations with isotropic spatial correlation are not flow direction-dependent. Macrodispersivities in the non-multi-Gaussian realizations, however, are flow direction-dependent although the covariance of ln T is isotropic (the same for all four models). It is important to account for high connectivities at extreme transmissivity values, a likely situation in some geological formations. Some of the discrepancies between first-order-based analytical results and field-scale tracer test data may be due to the existence of highly connected paths of extreme conductivity values.

  12. Calculating p-values and their significances with the Energy Test for large datasets

    NASA Astrophysics Data System (ADS)

    Barter, W.; Burr, C.; Parkes, C.

    2018-04-01

    The energy test method is a multi-dimensional test of whether two samples are consistent with arising from the same underlying population, through the calculation of a single test statistic (called the T-value). The method has recently been used in particle physics to search for samples that differ due to CP violation. The generalised extreme value function has previously been used to describe the distribution of T-values under the null hypothesis that the two samples are drawn from the same underlying population. We show that, in a simple test case, the distribution is not sufficiently well described by the generalised extreme value function. We present a new method, where the distribution of T-values under the null hypothesis when comparing two large samples can be found by scaling the distribution found when comparing small samples drawn from the same population. This method can then be used to quickly calculate the p-values associated with the results of the test.
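
    The T-value itself can be sketched as follows, using the Gaussian distance function ψ(d) = exp(−d²/2σ²) common in this literature. This simplified one-dimensional version, which keeps the self-pair terms, is an illustration rather than the implementation the paper describes; with this kernel the statistic is non-negative and grows when the two samples come from different populations.

```python
import numpy as np

def energy_T(a, b, sigma=1.0):
    """Energy-test statistic between 1-D samples a and b with Gaussian psi."""
    def mean_psi(x, y):
        d2 = (x[:, None] - y[None, :]) ** 2       # all pairwise squared distances
        return np.exp(-d2 / (2.0 * sigma ** 2)).mean()
    return mean_psi(a, a) + mean_psi(b, b) - 2.0 * mean_psi(a, b)

rng = np.random.default_rng(3)
a = rng.normal(0.0, 1.0, 500)
b = rng.normal(0.0, 1.0, 500)   # same underlying population as a
c = rng.normal(1.0, 1.0, 500)   # shifted population

t_same = energy_T(a, b)
t_diff = energy_T(a, c)
```

    In practice the null distribution of T is built by permutation; the abstract's contribution is that for large samples this null distribution can be obtained by rescaling the one found for small samples, rather than by brute-force recomputation.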

  13. Improving power and robustness for detecting genetic association with extreme-value sampling design.

    PubMed

    Chen, Hua Yun; Li, Mingyao

    2011-12-01

    Extreme-value sampling designs, which sample subjects with extremely large or small quantitative trait values, are commonly used in genetic association studies. Samples in such designs are often treated as "cases" and "controls" and analyzed using logistic regression. Such a case-control analysis ignores the potential dose-response relationship between the quantitative trait and the underlying trait locus and thus may lead to loss of power in detecting genetic association. An alternative approach to analyzing such data is to model the dose-response relationship by a linear regression model. However, parameter estimation from this model can be biased, which may lead to inflated type I errors. We propose a robust and efficient approach that takes into consideration both the biased sampling design and the potential dose-response relationship. Extensive simulations demonstrate that the proposed method is more powerful than the traditional logistic regression analysis and more robust than the linear regression analysis. We applied our method to the analysis of a candidate gene association study on high-density lipoprotein cholesterol (HDL-C) that includes study subjects with extremely high or low HDL-C levels. Using our method, we identified several SNPs showing stronger evidence of association with HDL-C than the traditional case-control logistic regression analysis. Our results suggest that it is important to appropriately model the quantitative trait and to adjust for the biased sampling when a dose-response relationship exists in extreme-value sampling designs. © 2011 Wiley Periodicals, Inc.

  14. Modelling hydrological extremes under non-stationary conditions using climate covariates

    NASA Astrophysics Data System (ADS)

    Vasiliades, Lampros; Galiatsatou, Panagiota; Loukas, Athanasios

    2013-04-01

    Extreme value theory is a probabilistic theory that can interpret the future probabilities of occurrence of extreme events (e.g. extreme precipitation and streamflow) using past observed records. Traditionally, extreme value theory requires the assumption of temporal stationarity. This assumption implies that the historical patterns of recurrence of extreme events are static over time. However, the hydroclimatic system is nonstationary on time scales that are relevant to extreme value analysis, due to human-mediated and natural environmental change. In this study the generalized extreme value (GEV) distribution is used to assess nonstationarity in annual maximum daily rainfall and streamflow time series at selected meteorological and hydrometric stations in Greece and Cyprus. The GEV distribution parameters (location, scale, and shape) are specified as functions of time-varying covariates and estimated using the conditional density network (CDN) as proposed by Cannon (2010). The CDN is a probabilistic extension of the multilayer perceptron neural network. Model parameters are estimated via the generalized maximum likelihood (GML) approach using the quasi-Newton BFGS optimization algorithm, and the appropriate GEV-CDN model architecture for the selected meteorological and hydrometric stations is selected by fitting increasingly complicated models and choosing the one that minimizes the Akaike information criterion with small-sample-size correction. For all case studies in Greece and Cyprus, different formulations are tested with combinations of stationary and nonstationary parameters of the GEV distribution, linear and non-linear architectures of the CDN, and combinations of the input climatic covariates. Climatic indices such as the Southern Oscillation Index (SOI), which describes atmospheric circulation in the eastern tropical Pacific related to the El Niño Southern Oscillation (ENSO), the Pacific Decadal Oscillation (PDO) index, which varies on an interdecadal rather than interannual time scale, and the atmospheric circulation patterns expressed by the North Atlantic Oscillation (NAO) index are used to express the GEV parameters as functions of the covariates. Results show that the nonstationary GEV model can be an efficient tool to take into account the dependencies between extreme value random variables and the temporal evolution of the climate.
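
    A bare-bones sketch of the core idea, a GEV whose location parameter depends linearly on a covariate, fitted by maximum likelihood, is shown below. This is a minimal stand-in for the GEV-CDN neural-network model, with a plain time trend instead of climate indices and invented parameter values; note that SciPy's shape convention is c = −ξ.

```python
import numpy as np
from scipy import stats, optimize

def fit_gev_linear_location(y, t):
    """MLE for a GEV with mu(t) = b0 + b1*t and constant scale and shape."""
    def nll(p):
        b0, b1, log_sigma, xi = p
        return -np.sum(stats.genextreme.logpdf(
            y, -xi, loc=b0 + b1 * t, scale=np.exp(log_sigma)))
    # Start from a Gumbel-like guess (xi = 0 keeps the initial likelihood finite)
    x0 = [np.mean(y), 0.0, np.log(np.std(y)), 0.0]
    res = optimize.minimize(nll, x0, method="Nelder-Mead",
                            options={"maxiter": 10_000,
                                     "xatol": 1e-8, "fatol": 1e-10})
    return res.x

# Synthetic annual maxima with an upward trend in the location parameter
t = np.arange(100, dtype=float)
rng = np.random.default_rng(5)
y = stats.genextreme.rvs(c=-0.1, loc=10.0 + 0.05 * t, scale=2.0,
                         size=100, random_state=rng)

b0, b1, log_sigma, xi = fit_gev_linear_location(y, t)
```

    Replacing t with SOI, PDO, or NAO series (and the linear map with a neural network) recovers the structure of the GEV-CDN models the abstract describes.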

  15. On the impacts of computing daily temperatures as the average of the daily minimum and maximum temperatures

    NASA Astrophysics Data System (ADS)

    Villarini, Gabriele; Khouakhi, Abdou; Cunningham, Evan

    2017-12-01

    Daily temperature values are generally computed as the average of the daily minimum and maximum observations, which can lead to biases in the estimation of daily averaged values. This study examines the impacts of these biases on the calculation of climatology and trends in temperature extremes at 409 sites in North America with at least 25 years of complete hourly records. Our results show that the calculation of daily temperature based on the average of minimum and maximum daily readings leads to an overestimation of the daily values of 10% or more when focusing on extremes and values above (below) high (low) thresholds. Moreover, the effects of the data processing method on trend estimation are generally small, even though the use of the daily minimum and maximum readings reduces the power of trend detection (approximately 5-10% fewer trends detected in comparison with the reference data).
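
    The effect is easy to reproduce synthetically: for any diurnal cycle that does not spend equal time above and below its midpoint, the midrange (Tmin + Tmax)/2 differs from the true 24-hour average. The cycle shape below is invented purely for illustration and is not the station data used in the study.

```python
import numpy as np

hours = np.arange(24)
# Hypothetical asymmetric diurnal cycle (degC): slow build-up to a late
# afternoon peak, so the minimum is held only briefly at the start of the day.
temp = 15.0 + 8.0 * np.sin(np.pi * (hours / 24.0) ** 1.5)

true_mean = temp.mean()                      # average of all 24 hourly values
midrange = 0.5 * (temp.min() + temp.max())   # the (Tmin + Tmax) / 2 convention
bias = midrange - true_mean                  # nonzero for asymmetric cycles
```

    With hourly records available, the bias can be quantified per site exactly as here: compare the midrange against the 24-value mean day by day.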

  16. Characterization and prediction of extreme events in turbulence

    NASA Astrophysics Data System (ADS)

    Fonda, Enrico; Iyer, Kartik P.; Sreenivasan, Katepalli R.

    2017-11-01

    Extreme events in Nature such as tornadoes, large floods, and strong earthquakes are rare but can have devastating consequences. The predictability of these events is very limited at present. Extreme events in turbulence are the very large events at small scales that are intermittent in character. We examine events in the energy dissipation rate and enstrophy that are several tens to hundreds or even thousands of times the mean value. To this end we use our DNS database of homogeneous and isotropic turbulence with Taylor Reynolds numbers spanning a decade, computed with different small-scale resolutions and different box sizes, and study the predictability of these events using machine learning. We start with an aggressive data augmentation to virtually increase the number of these rare events by two orders of magnitude and train a deep convolutional neural network to predict their occurrence in an independent data set. The goal of the work is to explore whether extreme events can be predicted with greater assurance than can be done by conventional methods (e.g., D.A. Donzis & K.R. Sreenivasan, J. Fluid Mech. 647, 13-26, 2010).

  17. Natural Hazards characterisation in industrial practice

    NASA Astrophysics Data System (ADS)

    Bernardara, Pietro

    2017-04-01

    The definition of rare hydroclimatic extremes (down to 10^-4 annual probability of occurrence) is of the utmost importance for the design of high-value industrial infrastructure, such as grids, power plants, and offshore platforms. Underestimation as well as overestimation of the risk may lead to huge costs (e.g. expensive mid-life works or overdesign) and may even prevent the project from happening. Nevertheless, the uncertainties associated with extrapolation towards rare frequencies are huge and manifold. They are mainly due to the scarcity of observations, the limited quality of extreme value records, and the arbitrary choice of the models used for extrapolation. This often puts design engineers in an uncomfortable position when they must choose the design values to use. Fortunately, recent progress in earth observation techniques, information technology, historical data collection, and weather and ocean modelling is making huge datasets available. Careful use of big datasets of observations and modelled data is leading towards a better understanding of the physics of the underlying phenomena and of the complex interactions between them, and thus towards better extrapolation of extreme event frequencies. This will move engineering practice from single-site, small-sample applications of statistical analysis to a more spatially coherent, physically driven extrapolation of extreme values. A few examples from EDF industrial practice are given to illustrate these advances and their potential impact on design approaches.

  18. So Small, So Loud: Extremely High Sound Pressure Level from a Pygmy Aquatic Insect (Corixidae, Micronectinae)

    PubMed Central

    Sueur, Jérôme; Mackie, David; Windmill, James F. C.

    2011-01-01

    To communicate at long range, animals have to produce intense but intelligible signals. This task might be difficult to achieve due to mechanical constraints, in particular relating to body size. Whilst the acoustic behaviour of large marine and terrestrial animals has been thoroughly studied, very little is known about the sound produced by small arthropods living in freshwater habitats. Here we analyse for the first time the calling song produced by the male of a small insect, the water boatman Micronecta scholtzi. The song is made of three distinct parts differing in their temporal and amplitude parameters, but not in their frequency content. Sound is produced at 78.9 (63.6–82.2) dB SPL rms re 2×10−5 Pa, with a peak at 99.2 (85.7–104.6) dB SPL re 2×10−5 Pa estimated at a distance of one metre. This energy output is significant considering the small size of the insect. When scaled to body length and compared to 227 other acoustic species, the acoustic energy produced by M. scholtzi appears as an extreme value, outperforming marine and terrestrial mammal vocalisations. Such an extreme display may be interpreted as an exaggerated secondary sexual trait resulting from runaway sexual selection without predation pressure. PMID:21698252

  20. Extreme value statistics and finite-size scaling at the ecological extinction/laminar-turbulence transition

    NASA Astrophysics Data System (ADS)

    Shih, Hong-Yan; Goldenfeld, Nigel

    Experiments on transitional turbulence in pipe flow seem to show that turbulence is a transient metastable state since the measured mean lifetime of turbulence puffs does not diverge asymptotically at a critical Reynolds number. Yet measurements reveal that the lifetime scales with Reynolds number in a super-exponential way reminiscent of extreme value statistics, and simulations and experiments in Couette and channel flow exhibit directed percolation type scaling phenomena near a well-defined transition. This universality class arises from the interplay between small-scale turbulence and a large-scale collective zonal flow, which exhibit predator-prey behavior. Why is asymptotically divergent behavior not observed? Using directed percolation and a stochastic individual level model of predator-prey dynamics related to transitional turbulence, we investigate the relation between extreme value statistics and power law critical behavior, and show that the paradox is resolved by carefully defining what is measured in the experiments. We theoretically derive the super-exponential scaling law, and using finite-size scaling, show how the same data can give both super-exponential behavior and power-law critical scaling.
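
    The super-exponential lifetime law discussed above, τ(Re) ∝ exp(exp(a·Re + b)), has a simple diagnostic: log log τ is linear in the Reynolds number. A sketch with invented constants a and b (the real fit constants from the puff-lifetime literature are not reproduced here):

```python
import numpy as np

# Invented constants for illustration only
a, b = 0.01, -18.0
re = np.linspace(1900.0, 2040.0, 15)
tau = np.exp(np.exp(a * re + b))     # super-exponential lifetime law

# Under the law, log(log(tau)) is exactly linear in Re, so a straight-line
# fit on the doubly-logged lifetimes recovers a and b.
slope, intercept = np.polyfit(re, np.log(np.log(tau)), 1)
```

    On measured lifetimes, curvature in the log log τ versus Re plot is what distinguishes super-exponential scaling from a plain exponential (which would be linear after a single log).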

  1. Extreme events in total ozone over Arosa: Application of extreme value theory and fingerprints of atmospheric dynamics and chemistry and their effects on mean values and long-term changes

    NASA Astrophysics Data System (ADS)

    Rieder, Harald E.; Staehelin, Johannes; Maeder, Jörg A.; Peter, Thomas; Ribatet, Mathieu; Davison, Anthony C.; Stübi, Rene; Weihs, Philipp; Holawe, Franz

    2010-05-01

    In this study tools from extreme value theory (e.g. Coles, 2001; Ribatet, 2007) are applied for the first time in the field of stratospheric ozone research, as statistical analysis showed that previously used concepts assuming a Gaussian distribution of total ozone data (e.g. fixed deviations from mean values) do not adequately capture the internal data structure with respect to extremes. The study illustrates that tools based on extreme value theory are appropriate to identify ozone extremes and to describe the tails of the world's longest total ozone record (Arosa, Switzerland; for details see Staehelin et al., 1998a,b) (Rieder et al., 2010a). A daily moving threshold was implemented to account for the seasonal cycle in total ozone. The frequency of days with extreme low (termed ELOs) and extreme high (termed EHOs) total ozone, and their influence on mean values and trends, is analyzed for the Arosa total ozone time series. The results show (a) an increase in ELOs and (b) a decrease in EHOs during the last decades, and (c) that the overall trend in total ozone during the 1970s and 1980s is strongly dominated by changes in these extreme events. After removing the extremes, the time series shows a strongly reduced trend (a reduction by a factor of 2.5 for the trend in the annual mean). Furthermore, it is shown that the fitted model represents the tails of the total ozone data set with very high accuracy over the entire range (including absolute monthly minima and maxima). The frequency distribution of ozone mini-holes (using constant thresholds) can also be calculated with high accuracy. Analyzing the tails instead of a small fraction of days below constant thresholds provides deeper insight into time series properties. Excursions in the frequency of extreme events reveal "fingerprints" of dynamical factors, such as ENSO or NAO, and chemical factors, such as cold Arctic vortex ozone losses, as well as major volcanic eruptions of the 20th century (e.g. 
Gunung Agung, El Chichón, Mt. Pinatubo). Furthermore, the atmospheric loading of ozone-depleting substances leads to a continuous modification of column ozone in the northern hemisphere, also with respect to extreme values (partly again in connection with polar vortex contributions). It is shown that the application of extreme value theory allows the identification of many more such fingerprints than conventional time series analysis of annual and seasonal mean values. In particular, the analysis shows the strong influence of dynamics, revealing that even moderate ENSO and NAO events have a discernible effect on total ozone (Rieder et al., 2010b). Overall, the presented extremes concept provides new information on time series properties, variability, trends and the influence of dynamics and chemistry, complementing earlier analyses focusing only on monthly (or annual) mean values. References: Coles, S.: An Introduction to Statistical Modeling of Extreme Values, Springer Series in Statistics, ISBN:1852334592, Springer, Berlin, 2001. Ribatet, M.: POT: Modelling peaks over a threshold, R News, 7, 34-36, 2007. Rieder, H.E., Staehelin, J., Maeder, J.A., Ribatet, M., Stübi, R., Weihs, P., Holawe, F., Peter, T., and Davison, A.C. (2010): Extreme events in total ozone over Arosa - Part I: Application of extreme value theory, to be submitted to ACPD. Rieder, H.E., Staehelin, J., Maeder, J.A., Ribatet, M., Stübi, R., Weihs, P., Holawe, F., Peter, T., and Davison, A.C. (2010): Extreme events in total ozone over Arosa - Part II: Fingerprints of atmospheric dynamics and chemistry and effects on mean values and long-term changes, to be submitted to ACPD. Staehelin, J., Renaud, A., Bader, J., McPeters, R., Viatte, P., Hoegger, B., Bugnion, V., Giroud, M., and Schill, H.: Total ozone series at Arosa (Switzerland): Homogenization and data comparison, J. Geophys. Res., 103(D5), 5827-5842, doi:10.1029/97JD02402, 1998a. Staehelin, J., Kegel, R., and Harris, N.R.: Trend analysis of the homogenized total ozone series of Arosa (Switzerland), 1929-1996, J. Geophys. Res., 103(D7), 8389-8400, doi:10.1029/97JD03650, 1998b.
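As a rough illustration of the peaks-over-threshold approach with a seasonally moving threshold described in the abstract above, the sketch below fits a Generalized Pareto Distribution to exceedances of a day-of-year percentile threshold. The data are synthetic (a seasonal cycle plus noise), not the Arosa record, and the 95th-percentile threshold is an assumption chosen only for illustration.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(42)

# Synthetic stand-in for a daily total-ozone series (Dobson units):
# a seasonal cycle plus noise; the real Arosa record is not reproduced here.
days = np.arange(365 * 30)
seasonal = 330 + 40 * np.sin(2 * np.pi * days / 365.25)
ozone = seasonal + rng.normal(0, 15, size=days.size)

# Daily moving threshold: the 95th percentile for each calendar day,
# mimicking the seasonally varying threshold used for extreme-high events.
doy = days % 365
threshold = np.array([np.percentile(ozone[doy == d], 95) for d in range(365)])
exceed = ozone - threshold[doy]
excesses = exceed[exceed > 0]

# Fit a Generalized Pareto Distribution to the excesses (peaks over threshold).
shape, loc, scale = genpareto.fit(excesses, floc=0.0)
print(f"GPD shape={shape:.3f}, scale={scale:.2f}, n_excesses={excesses.size}")
```

In the same spirit as the paper, the fitted GPD then describes the upper tail of the distribution rather than a fixed count of days beyond a constant cutoff.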

  2. Sensitivity Analysis and Uncertainty Quantification for the LAMMPS Molecular Dynamics Simulation Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Picard, Richard Roy; Bhat, Kabekode Ghanasham

    2017-07-18

    We examine sensitivity analysis and uncertainty quantification for molecular dynamics simulation. Extreme (large or small) output values for the LAMMPS code often occur at the boundaries of input regions, and uncertainties in those boundary values are overlooked by common SA methods. Similarly, input values for which code outputs are consistent with calibration data can also occur near boundaries. Upon applying approaches in the literature for imprecise probabilities (IPs), much more realistic results are obtained than for the complacent application of standard SA and code calibration.

  3. Thermoregulatory value of cracking-clay soil shelters for small vertebrates during extreme desert conditions.

    PubMed

    Waudby, Helen P; Petit, Sophie

    2017-05-01

    Deserts exhibit extreme climatic conditions. Small desert-dwelling vertebrates have physiological and behavioral adaptations to cope with these conditions, including the ability to seek shelter. We investigated the temperature (T) and relative humidity (RH) regulating properties of the soil cracks that characterize the extensive cracking-clay landscapes of arid Australia, and the extent of their use by 2 small marsupial species: fat-tailed and stripe-faced dunnarts (Sminthopsis crassicaudata and Sminthopsis macroura). We measured hourly (over 24-h periods) the T and RH of randomly-selected soil cracks compared to outside conditions, during 2 summers and 2 winters. We tracked 17 dunnarts (8 Sminthopsis crassicaudata and 9 Sminthopsis macroura) to quantify their use of cracks. Cracks consistently moderated microclimate, providing more stable conditions than available from non-crack points, which often displayed comparatively dramatic fluctuations in T and RH. Both dunnart species used crack shelters extensively. Cracks constitute important shelter for small animals during extreme conditions by providing a stable microclimate, which is typically cooler than outside conditions in summer and warmer in winter. Cracks likely play a fundamental sheltering role by sustaining the physiological needs of small mammal populations. Globally, cracking-clay areas are dominated by agricultural land uses, including livestock grazing. Management of these systems should focus not only on vegetation condition, but also on soil integrity, to maintain shelter resources for ground-dwelling fauna. © 2016 International Society of Zoological Sciences, Institute of Zoology/Chinese Academy of Sciences and John Wiley & Sons Australia, Ltd.

  4. Statistical trend analysis and extreme distribution of significant wave height from 1958 to 1999 - an application to the Italian Seas

    NASA Astrophysics Data System (ADS)

    Martucci, G.; Carniel, S.; Chiggiato, J.; Sclavo, M.; Lionello, P.; Galati, M. B.

    2010-06-01

    The study is a statistical analysis of sea-state time series derived using the wave model WAM forced by the ERA-40 dataset in selected areas near the Italian coasts. For the period 1 January 1958 to 31 December 1999 the analysis yields: (i) the existence of a negative trend in the annual- and winter-averaged sea-state heights; (ii) the existence of a turning point in the late 1980s in the annual-averaged trend of sea-state heights at a site in the Northern Adriatic Sea; (iii) the overall absence of a significant trend in the annual-averaged mean durations of sea states over thresholds; (iv) the assessment of the extreme values on a time scale of a thousand years. The analysis uses two methods to obtain samples of extremes from the independent sea states: the r-largest annual maxima and the peak-over-threshold. The two methods show statistical differences in retrieving the return values and, more generally, in describing the significant wave field. The r-largest annual maxima method provides more reliable predictions of the extreme values, especially for small return periods (<100 years). Finally, the study statistically proves the existence of decadal negative trends in the significant wave heights and thereby conveys useful information on the wave climatology of the Italian seas during the second half of the 20th century.
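The annual-maxima idea above can be sketched in its simplest form (r = 1) by fitting a GEV distribution to yearly maxima and reading off return levels. The wave-height values below are synthetic Gumbel draws, not the WAM/ERA-40 hindcast, and note that scipy's shape parameter c is the negative of the conventional GEV shape.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)

# Synthetic annual maxima of significant wave height (m) for the 42 years
# 1958-1999; the WAM/ERA-40 hindcast itself is not reproduced here.
annual_maxima = rng.gumbel(loc=4.0, scale=0.8, size=42)

# Fit a GEV to the annual maxima (scipy's c = minus the usual shape).
c, loc, scale = genextreme.fit(annual_maxima)

# T-year return level: the quantile exceeded on average once every T years.
levels = {T: genextreme.ppf(1.0 - 1.0 / T, c, loc=loc, scale=scale)
          for T in (10, 50, 100)}
for T, z in levels.items():
    print(f"{T:4d}-year return level: {z:.2f} m")
```

The r-largest variant generalises this by using the r biggest values per year instead of only the maximum, which stabilises the fit for short records.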

  5. Small-diameter success stories III.

    Treesearch

    Jean Livingston

    2008-01-01

    More than 73 million acres of our national forests and millions more in public and private forestlands are in need of some form of restoration. Our forests are declining in health because of major changes over the years in forest structure and composition. However, restoration of these overstocked stands is extremely expensive. If new, economical, and value-added uses...

  6. Global Weirding? - Using Very Large Ensembles and Extreme Value Theory to assess Changes in Extreme Weather Events Today

    NASA Astrophysics Data System (ADS)

    Otto, F. E. L.; Mitchell, D.; Sippel, S.; Black, M. T.; Dittus, A. J.; Harrington, L. J.; Mohd Saleh, N. H.

    2014-12-01

    A shift in the distribution of socially relevant climate variables, such as daily minimum winter temperatures and daily precipitation extremes, has been attributed to anthropogenic climate change for various mid-latitude regions. However, while there are many process-based arguments also suggesting a change in the shape of these distributions, attribution studies demonstrating this have not yet been undertaken. Here we use a very large initial-condition ensemble of ~40,000 members simulating the European winter 2013/2014 using the distributed computing infrastructure of the weather@home project. Two separate scenarios are used: (1) current climate conditions, and (2) a counterfactual scenario of a "world that might have been" without anthropogenic forcing. Focusing specifically on extreme events, we assess how the estimated parameters of the Generalized Extreme Value (GEV) distribution vary depending on variable type, sampling frequency (daily, monthly, …) and geographical region. We find that the location parameter changes for most variables but, depending on the region and variables, we also find significant changes in scale and shape parameters. The very large ensemble furthermore allows us to assess whether such findings in the fitted GEV distributions are consistent with an empirical analysis of the model data, and whether the most extreme data still follow a known underlying distribution that in a small sample might otherwise be dismissed as an outlier. The ~40,000-member ensemble is simulated using 12 different SST patterns (1 'observed', and 11 best guesses of SSTs with no anthropogenic warming). The range in SSTs, along with the corresponding changes in the NAO and high-latitude blocking, informs on the dynamics governing some of these extreme events. 
While strong teleconnection patterns are not found in this particular experiment, the high number of simulated extreme events allows for a more thorough analysis of the dynamics than has been performed before. Combining extreme value theory with very large ensemble simulations therefore allows us to understand the dynamics of changes in extreme events, which is not possible using extreme value theory alone, and also shows in which cases statistics combined with smaller ensembles give results as valid as those from very large initial-condition ensembles.

  7. Design and Manufacturing of Extremely Low Mass Flight Systems

    NASA Technical Reports Server (NTRS)

    Johnson, Michael R.

    2002-01-01

    Extremely small flight systems pose some unusual design and manufacturing challenges. The small components that make up the system generally must be built to extremely tight tolerances to maintain the functionality of the assembled item. Additionally, the total mass of the system is extremely sensitive to what would be considered small perturbations in a larger flight system. The MUSES C mission, designed, built, and operated by Japan, carries a small rover provided by NASA that falls into this small flight system category. This NASA-provided rover is used as a case study of an extremely small flight system design. The issues that were encountered with the rover portion of the MUSES C program are discussed, and conclusions about the recommended mass margins at different stages of a small flight system project are presented.

  8. Ground level measurements of air conductivities under Florida thunderstorms

    NASA Technical Reports Server (NTRS)

    Blakeslee, Richard J.; Krider, E. P.

    1992-01-01

    Values of the positive and negative polar conductivities under summer thunderstorms in Florida are highly variable and exhibit a significant electrode effect, but the total conductivity usually remains close to values found in fair weather, 0.4 to 1.8 x 10^-14 S/m. Given these values, a method proposed by Krider and Musser (1982) for estimating the total conductivity from changes in the slope of the electric field recovery following a lightning discharge will be extremely sensitive to small time variations in the local Maxwell current density and must be modified to include these effects.

  9. Extreme events in total ozone over the northern mid-latitudes: A case study based on long-term data sets from 5 ground-based stations

    NASA Astrophysics Data System (ADS)

    Rieder, Harald E.; Jancso, Leonhardt M.; Staehelin, Johannes; Maeder, Jörg A.; Ribatet, Mathieu; Peter, Thomas; Davison, Anthony C.

    2010-05-01

    In this study we analyze the frequency distribution of extreme events in low and high total ozone (termed ELOs and EHOs) for 5 long-term stations in the northern mid-latitudes in Europe (Belsk, Poland; Hradec Kralove, Czech Republic; Hohenpeissenberg and Potsdam, Germany; and Uccle, Belgium). Further, the influence of these extreme events on annual and seasonal mean values and trends is analysed. The applied method follows the new "ozone extremes concept", which is based on tools from extreme value theory [Coles, 2001; Ribatet, 2007] and was recently developed by Rieder et al. [2010a, b]. Mathematically, the decisive feature of the extremes concept is the Generalized Pareto Distribution (GPD). In this analysis the long-term trends first had to be removed, unlike in the treatment of Rieder et al. [2010a, b], in which the Arosa time series was analysed, covering many decades of measurements in the anthropogenically undisturbed stratosphere. In contrast to previous studies focusing only on so-called ozone mini-holes and mini-highs, the "ozone extremes concept" provides a statistical description of the tails of total ozone distributions (i.e. extreme low and high values). It is shown that this concept is not only an appropriate method to describe the frequency and distribution of extreme events; it also provides new information on time series properties and internal variability. Furthermore, it allows detection of fingerprints of physical (e.g. El Niño, NAO) and chemical (e.g. polar vortex ozone loss) features in the Earth's atmosphere as well as major volcanic eruptions (e.g. El Chichón, Mt. Pinatubo). It is shown that mean values and trends in total ozone are strongly influenced by extreme events. Trend calculations (for the period 1970-1990) are performed for the entire as well as the extremes-removed time series. 
The results after excluding extremes show that annual trends are most strongly reduced at Hradec Kralove (by about a factor of 3), followed by Potsdam (a factor of 2.5), and Hohenpeissenberg and Belsk (both by about a factor of 2). In general the reduction of the trend is strongest during winter and spring. At all stations the influence of ELOs on observed trends is larger than that of EHOs. Especially from the 1990s on, ELOs dominate the picture, as only a relatively small fraction of EHOs can be observed in the records (due to the strong influence of the Mt. Pinatubo eruption and polar vortex ozone loss contributions). Additionally, it is shown that the number of observed mini-holes can be estimated highly accurately by the GPD model. Overall, the results of this study show that extreme events play a major role in total ozone and the "ozone extremes concept" provides deeper insight into the influence of chemical and physical features on column ozone. References: Coles, S.: An Introduction to Statistical Modeling of Extreme Values, Springer Series in Statistics, ISBN:1852334592, Springer, Berlin, 2001. Ribatet, M.: POT: Modelling peaks over a threshold, R News, 7, 34-36, 2007. Rieder, H.E., Staehelin, J., Maeder, J.A., Ribatet, M., Stübi, R., Weihs, P., Holawe, F., Peter, T., and Davison, A.C. (2010): Extreme events in total ozone over Arosa - Part I: Application of extreme value theory, to be submitted to ACPD. Rieder, H.E., Staehelin, J., Maeder, J.A., Ribatet, M., Stübi, R., Weihs, P., Holawe, F., Peter, T., and Davison, A.C. (2010): Extreme events in total ozone over Arosa - Part II: Fingerprints of atmospheric dynamics and chemistry and effects on mean values and long-term changes, to be submitted to ACPD.

  10. The Coincident Coherence of Extreme Doppler Velocity Events with p-mode Patches in the Solar Photosphere.

    NASA Astrophysics Data System (ADS)

    McClure, Rachel Lee

    2018-06-01

    Observations of the solar photosphere show many spatially compact Doppler velocity events with short life spans and extreme values. In the IMaX spectropolarimetric inversion data from the first flight of the SUNRISE balloon in 2009, these striking flashes in the intergranule lanes, and complementary outstanding values in the centers of granules, have line-of-sight Doppler velocity values in excess of 4 sigma from the mean. We conclude that values outside 4 sigma result from the superposition of the granulation flows and the p-modes. To determine how granulation and p-modes contribute to these outstanding Doppler events, I separate the two components using the Fast Fourier Transform. I produce the power spectrum of the spatial wave frequencies and their corresponding frequencies in time for each image, and create a k-omega filter to separate the two components. Using the filtered data, I test the hypothesis that extreme events occur because of strict superposition between the p-mode Doppler velocities and the granular velocities. I compare event counts from the observational data to those produced by random superposition of the two flow components and find that the observational event counts are consistent with the model event counts in the limit of small-number statistics. Poisson count probabilities of the observed event numbers are consistent with the expected model count probability distributions.
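A toy version of this FFT-based separation can be sketched with a simple low-pass mask in temporal frequency. The real analysis uses a k-omega (wavenumber-frequency) cut tailored to the solar p-mode ridges; the synthetic velocity field and the 0.05 cut frequency below are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy Doppler-velocity slice v(t, x): a slow granulation-like component
# plus a fast oscillatory p-mode-like component. Real IMaX data and the
# actual k-omega filter used for SUNRISE are not reproduced here.
nt, nx = 128, 128
t = np.arange(nt)[:, None]
x = np.arange(nx)[None, :]
granulation = np.sin(2 * np.pi * x / 64) * np.cos(2 * np.pi * t / 100)
pmodes = 0.5 * np.sin(2 * np.pi * (x / 8) - 2 * np.pi * t / 10)
v = granulation + pmodes + 0.1 * rng.normal(size=(nt, nx))

# 2D FFT to (omega, k) space and a mask in temporal frequency: components
# slower than the (assumed) cut are kept as "granulation".
V = np.fft.fft2(v)
omega = np.fft.fftfreq(nt)[:, None]
mask = np.abs(omega) < 0.05            # assumed cut frequency
slow = np.fft.ifft2(V * mask).real     # granulation-like part
fast = v - slow                        # p-mode-like part

print(f"variance: total={v.var():.3f}, slow={slow.var():.3f}, fast={fast.var():.3f}")
```

With the two filtered components in hand, one can superpose them at random relative phases and count how often the sum exceeds the 4-sigma threshold, as the abstract describes.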

  11. Spatial variation of statistical properties of extreme water levels along the eastern Baltic Sea

    NASA Astrophysics Data System (ADS)

    Pindsoo, Katri; Soomere, Tarmo; Rocha, Eugénio

    2016-04-01

    Most existing projections of future extreme water levels rely on the use of classic generalised extreme value distributions. The choice of a particular distribution is often made based on the value of the shape parameter of the Generalised Extreme Value distribution. If the absolute value of this parameter is small, the Gumbel distribution is most appropriate, while in the opposite case the Weibull or Fréchet distribution could be used. We demonstrate that the alongshore variation in the statistical properties of numerically simulated high water levels along the eastern coast of the Baltic Sea is so large that the use of a single distribution for projections of extreme water levels is highly questionable. The analysis is based on two simulated data sets produced at the Swedish Meteorological and Hydrological Institute. The output of the Rossby Centre Ocean model is sampled with a resolution of 6 h and the output of the circulation model NEMO with a resolution of 1 h. As the maxima of water levels in subsequent years may be correlated in the Baltic Sea, we also employ maxima for stormy seasons. We provide a detailed analysis of the spatial variation of the parameters of the family of extreme value distributions along an approximately 600 km long coastal section from the north-western shore of Latvia in the Baltic Proper to the eastern Gulf of Finland. The parameters are evaluated using the maximum likelihood method and the method of moments. The analysis also covers the entire Gulf of Riga. The core parameter of this family of distributions, the shape parameter of the Generalised Extreme Value distribution, exhibits extensive variation in the study area. Its values, evaluated using the Hydrognomon software and the maximum likelihood method, vary from about -0.1 near the north-western coast of Latvia in the Baltic Proper up to about 0.05 in the eastern Gulf of Finland. This parameter is very close to zero near Tallinn in the western Gulf of Finland. 
Thus, it is natural that the Gumbel distribution gives adequate projections of extreme water levels for the vicinity of Tallinn. More importantly, this feature indicates that the use of a single distribution for the projections of extreme water levels and their return periods for the entire Baltic Sea coast is inappropriate. The physical reason is the interplay of the complex shape of large subbasins (such as the Gulf of Riga and Gulf of Finland) of the sea and highly anisotropic wind regime. The 'impact' of this anisotropy on the statistics of water level is amplified by the overall anisotropy of the distributions of the frequency of occurrence of high and low water levels. The most important conjecture is that long-term behaviour of water level extremes in different coastal sections of the Baltic Sea may be fundamentally different.
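The selection logic described above (shape parameter near zero implies Gumbel, otherwise a Weibull- or Fréchet-type tail) can be sketched as follows. The sample is drawn from a GEV with a known shape rather than from the NEMO or Rossby Centre model output, and note that scipy's c equals minus the conventional shape parameter, so the sign interpretation is flipped relative to the text.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(7)

# Fit a GEV to (synthetic) seasonal maxima and inspect the shape parameter.
# scipy convention: c ~ 0 -> Gumbel, c > 0 -> Weibull-type (bounded upper
# tail), c < 0 -> Frechet-type (heavy upper tail).
maxima = genextreme.rvs(0.3, loc=100, scale=12, size=5000, random_state=rng)
c_hat, loc_hat, scale_hat = genextreme.fit(maxima)

if abs(c_hat) < 0.05:
    family = "Gumbel"
elif c_hat > 0:
    family = "Weibull-type (bounded upper tail)"
else:
    family = "Frechet-type (heavy upper tail)"
print(f"fitted shape c={c_hat:.3f} -> {family}")
```

The study's point is precisely that c_hat varies so much along the coast that no single such choice serves the whole Baltic shoreline.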

  12. Significance of noisy signals in periodograms

    NASA Astrophysics Data System (ADS)

    Süveges, Maria

    2015-08-01

    The detection of tiny periodic signals in noisy and irregularly sampled time series is a challenging task. Once a small peak is found in the periodogram, the next step is to see how probable it is that pure noise produced a peak that extreme - that is to say, to compute its False Alarm Probability (FAP). This useful measure quantifies the statistical plausibility of the found signal among the noise. However, its derivation from statistical principles is very hard due to the specificities of astronomical periodograms, such as oversampling and the ensuing strong correlation among periodogram values at different frequencies. I will present a method to compute the FAP based on extreme-value statistics (Süveges 2014), and compare it to two other methods, proposed by Baluev (2008) and by Paltani (2004) and Schwarzenberg-Czerny (2012), on signals with various shapes and at different signal-to-noise ratios.
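A common brute-force baseline for such methods is a Monte Carlo FAP: permute the data on the same irregular time grid, recompute the periodogram maximum, and count how often noise alone beats the observed peak. The sketch below uses a plain classical periodogram on synthetic data; it is not the extreme-value estimator of Süveges (2014) or the analytic bound of Baluev (2008), and the signal frequency and amplitude are invented.

```python
import numpy as np

rng = np.random.default_rng(3)

# Irregularly sampled noisy sinusoid (all values are illustrative).
t = np.sort(rng.uniform(0, 100, size=200))
y = 0.8 * np.sin(2 * np.pi * 0.17 * t) + rng.normal(0, 1, size=t.size)
freqs = np.linspace(0.01, 0.5, 500)

# Classical periodogram, precomputing the phase matrix once.
phases = np.exp(-2j * np.pi * freqs[:, None] * t[None, :])

def max_power(values):
    """Maximum normalized periodogram power over the frequency grid."""
    v = values - values.mean()
    return np.max(np.abs(phases @ v) ** 2) / (values.size * values.var())

observed = max_power(y)

# Empirical FAP: fraction of pure-noise realizations (permutations of y)
# whose highest peak is at least as extreme as the observed one.
trials = 500
hits = sum(max_power(rng.permutation(y)) >= observed for _ in range(trials))
fap = hits / trials
print(f"peak power={observed:.2f}, empirical FAP={fap:.3f}")
```

Extreme-value methods replace this expensive loop with a fitted distribution for the periodogram maximum, which is why the correlation structure among frequencies matters so much.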

  13. Formation flying for electric sails in displaced orbits. Part I: Geometrical analysis

    NASA Astrophysics Data System (ADS)

    Wang, Wei; Mengali, Giovanni; Quarta, Alessandro A.; Yuan, Jianping

    2017-09-01

    We present a geometrical methodology for analyzing the formation flying of electric solar wind sail based spacecraft that operate in heliocentric, elliptic, displaced orbits. The spacecraft orbit is maintained by adjusting its propulsive acceleration modulus, whose value is estimated using a thrust model that takes into account the variation of propulsive performance with sail attitude. The properties of the relative motion of the spacecraft are studied in detail and a geometrical solution is obtained in terms of relative displaced orbital elements, assumed to be small quantities. In particular, for the small-eccentricity case (i.e. for a near-circular displaced orbit), the bounds characterized by the extreme values of relative distances are analytically calculated, thus providing a useful mathematical tool for the preliminary design of the spacecraft formation structure.

  14. Combination of radar and daily precipitation data to estimate meaningful sub-daily point precipitation extremes

    NASA Astrophysics Data System (ADS)

    Bárdossy, András; Pegram, Geoffrey

    2017-01-01

    The use of radar measurements for the space-time estimation of precipitation has for many decades been a central topic in hydro-meteorology. In this paper we are interested specifically in daily and sub-daily extreme values of precipitation at gauged or ungauged locations, which are important for design. The purpose of the paper is to develop a methodology to combine daily precipitation observations and radar measurements to estimate sub-daily extremes at point locations. Radar data corrected using precipitation-reflectivity relationships lead to biased estimations of extremes. Different possibilities of correcting systematic errors using the daily observations are investigated. Observed gauged daily amounts are interpolated to unsampled points and subsequently disaggregated using the sub-daily values obtained by the radar. Different corrections based on the spatial variability and the sub-daily entropy of scaled rainfall distributions are used to provide unbiased corrections of short-duration extremes. Additionally, a statistical procedure not based on a day-by-day matching correction is tested. In this last procedure, as we are only interested in rare extremes, low to medium values of rainfall depth were neglected, leaving a small number L of ranked daily maxima in each set per year, whose sum typically comprises about 50% of each annual rainfall total. The sum of these L daily maxima is first interpolated using a Kriging procedure. Subsequently this sum is disaggregated to daily values using a nearest-neighbour procedure. The daily sums are then disaggregated using the relative values of the L biggest radar-based days. Of course, the timings of radar and gauge maxima can be different, so the method presented here uses radar for disaggregating daily gauge totals down to 15 min intervals in order to extract the maxima of sub-hourly through to daily rainfall. 
The methodologies were tested in South Africa, where an S-band radar operated relatively continuously at Bethlehem from 1998 to 2003, whose scan at 1.5 km above ground [CAPPI] overlapped a dense (10 km spacing) set of 45 pluviometers recording in the same 6-year period. This valuable set of data was obtained from each of 37 selected radar pixels [1 km square in plan] which contained a pluviometer not masked out by the radar footprint. The pluviometer data were also aggregated to daily totals for the same purpose. The extremes obtained using the disaggregation methods were compared to the observed extremes in a cross-validation procedure. The unusual and novel goal was not to reproduce the precipitation matching in space and time, but to obtain frequency distributions of the point extremes, which we found to be stable.

  15. Using the Student's "t"-Test with Extremely Small Sample Sizes

    ERIC Educational Resources Information Center

    de Winter, J. C. F.

    2013-01-01

    Researchers occasionally have to work with an extremely small sample size, defined herein as "N" less than or equal to 5. Some methodologists have cautioned against using the "t"-test when the sample size is extremely small, whereas others have suggested that using the "t"-test is feasible in such a case. The present…
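A minimal sketch of the scenario in question: a two-sample t-test with N = 3 per group, the "extremely small" regime discussed above. The group values are invented for illustration only.

```python
import numpy as np
from scipy.stats import ttest_ind

# Two groups with N = 3 each (invented numbers).
group_a = np.array([4.1, 5.0, 4.6])
group_b = np.array([6.2, 5.8, 6.9])

# Independent two-sample t-test; with N = 3 per group the test has only
# 2*3 - 2 = 4 degrees of freedom, so power is very limited.
t_stat, p_value = ttest_ind(group_a, group_b)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}, df = {len(group_a) + len(group_b) - 2}")
```

The debate summarised in the abstract is whether such a test is trustworthy at all at these sample sizes, not how to run it, so this block only illustrates the mechanics.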

  16. Using chromium stable isotope ratios to quantify Cr(VI) reduction: Lack of sorption effects

    USGS Publications Warehouse

    Ellis, A.S.; Johnson, T.M.; Bullen, T.D.

    2004-01-01

    Chromium stable isotope values can be effectively used to monitor reduction of Cr(VI) in natural waters. We investigate the effects of sorption during transport of Cr(VI), which may also shift Cr isotope values, complicating efforts to quantify reduction. This study shows that Cr stable isotope fractionation caused by sorption is negligible. Equilibrium fractionation of Cr stable isotopes between dissolved Cr(VI) and Cr(VI) adsorbed onto γ-Al2O3 and goethite is less than 0.04‰ (53Cr/52Cr) under environmentally relevant pH conditions. Batch experiments at pH 4.0 and pH 6.0 were conducted in series to sequentially magnify small isotope fractionations. A simple transport model suggests that adsorption may cause amplification of a small isotope fractionation along extreme fringes of a plume, leading to shifts in 53Cr/52Cr values. We therefore suggest that isotope values at extreme fringes of Cr plumes be critically evaluated for sorption effects. A kinetic effect was observed in experiments with goethite at pH 4, where lighter isotopes apparently diffuse into goethite clumps at a faster rate before eventually reaching equilibrium. This observed kinetic effect may be important in a natural system that has not attained equilibrium and is in need of further study. Cr isotope fractionation caused by speciation of Cr(VI) between HCrO4- and CrO42- was also examined, and we conclude that it is not measurable. In the absence of isotope fractionation caused by equilibrium speciation and sorption, most of the variation in δ53Cr values may be attributed to reduction, and reliable estimates of Cr reduction can be made.

  17. Extreme climatic events drive mammal irruptions: regression analysis of 100-year trends in desert rainfall and temperature

    PubMed Central

    Greenville, Aaron C; Wardle, Glenda M; Dickman, Chris R

    2012-01-01

    Extreme climatic events, such as flooding rains, extended decadal droughts and heat waves have been identified increasingly as important regulators of natural populations. Climate models predict that global warming will drive changes in rainfall and increase the frequency and severity of extreme events. Consequently, to anticipate how organisms will respond we need to document how changes in extremes of temperature and rainfall compare to trends in the mean values of these variables and over what spatial scales the patterns are consistent. Using the longest historical weather records available for central Australia – 100 years – and quantile regression methods, we investigate if extreme climate events have changed at similar rates to median events, if annual rainfall has increased in variability, and if the frequency of large rainfall events has increased over this period. Specifically, we compared local (individual weather stations) and regional (Simpson Desert) spatial scales, and quantified trends in median (50th quantile) and extreme weather values (5th, 10th, 90th, and 95th quantiles). We found that median and extreme annual minimum and maximum temperatures have increased at both spatial scales over the past century. Rainfall changes have been inconsistent across the Simpson Desert; individual weather stations showed increases in annual rainfall, increased frequency of large rainfall events or more prolonged droughts, depending on the location. In contrast to our prediction, we found no evidence that intra-annual rainfall had become more variable over time. Using long-term live-trapping records (22 years) of desert small mammals as a case study, we demonstrate that irruptive events are driven by extreme rainfalls (>95th quantile) and that increases in the magnitude and frequency of extreme rainfall events are likely to drive changes in the populations of these species through direct and indirect changes in predation pressure and wildfires. PMID:23170202
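Quantile regression, used above to compare trends in median versus extreme rainfall, minimises the pinball loss and can be written as a linear program. The sketch below estimates 50th- and 95th-quantile trends on a synthetic 100-year rainfall series, not the Simpson Desert records; the trend slope and noise model are assumptions.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(5)

# Synthetic 100-year annual rainfall record with an upward trend
# (values and trend are illustrative, not the historical data).
years = np.arange(100, dtype=float)
rain = 200 + 0.8 * years + rng.gamma(2.0, 40.0, size=years.size)

def quantile_regression(x, y, tau):
    """Fit y ~ b0 + b1*x at quantile tau via the standard LP formulation."""
    n = x.size
    X = np.column_stack([np.ones(n), x])
    # Variables: [b0, b1, u_plus (n), u_minus (n)];
    # minimise tau*sum(u_plus) + (1 - tau)*sum(u_minus)
    # subject to y - X @ b = u_plus - u_minus, u_plus, u_minus >= 0.
    c = np.concatenate([np.zeros(2), tau * np.ones(n), (1 - tau) * np.ones(n)])
    A_eq = np.hstack([X, np.eye(n), -np.eye(n)])
    bounds = [(None, None)] * 2 + [(0, None)] * (2 * n)
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=bounds, method="highs")
    return res.x[:2]

b0_med, b1_med = quantile_regression(years, rain, 0.50)
b0_ext, b1_ext = quantile_regression(years, rain, 0.95)
print(f"median trend: {b1_med:.2f}/yr, 95th-quantile trend: {b1_ext:.2f}/yr")
```

Comparing the fitted slopes across quantiles is exactly the kind of check the study performs: if extremes were changing faster than the median, the 95th-quantile slope would pull away from the 50th.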

  18. Uncertainty in determining extreme precipitation thresholds

    NASA Astrophysics Data System (ADS)

    Liu, Bingjun; Chen, Junfan; Chen, Xiaohong; Lian, Yanqing; Wu, Lili

    2013-10-01

Extreme precipitation events are rare and occur mostly on a relatively small, local scale, which makes it difficult to set thresholds for extreme precipitation in a large basin. Based on long-term daily precipitation data from 62 observation stations in the Pearl River Basin, this study assessed the applicability of non-parametric, parametric, and detrended fluctuation analysis (DFA) methods for determining extreme precipitation thresholds (EPTs), and the certainty of the EPTs from each method. Analyses from this study show that the non-parametric absolute-critical-value method is easy to use but unable to reflect differences in the spatial distribution of rainfall. The non-parametric percentile method can account for the spatial distribution of precipitation, but its threshold values are sensitive to the size of the rainfall data series and depend on the choice of percentile, which makes it difficult to determine reasonable threshold values for a large basin. The parametric method can provide the aptest description of extreme precipitation by fitting extreme precipitation distributions with probability distribution functions; however, the choice of probability distribution function, the goodness-of-fit tests, and the size of the rainfall data series can greatly affect the fitting accuracy. In contrast to the non-parametric and parametric methods, which cannot provide EPTs with certainty, the DFA method, although computationally involved, has proven to be the most appropriate method, able to provide a unique set of EPTs for a large basin with uneven spatio-temporal precipitation distribution.
The consistency of the spatial distribution of the DFA-based thresholds with the annual average precipitation, the coefficient of variation (CV), and the coefficient of skewness (CS) of the daily precipitation further indicates that EPTs determined by the DFA method are more reasonable and applicable for the Pearl River Basin.
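The percentile method's sensitivity to record length and percentile choice, noted in the abstract, is easy to demonstrate. A hedged sketch on synthetic daily rainfall (no Pearl River data involved; the 5-year vs 30-year comparison is purely illustrative):

```python
import random

def percentile(data, p):
    """Empirical p-th percentile by linear interpolation between order statistics."""
    s = sorted(data)
    k = (len(s) - 1) * p / 100.0
    lo, hi = int(k), min(int(k) + 1, len(s) - 1)
    return s[lo] + (k - lo) * (s[hi] - s[lo])

random.seed(7)
# 30 years of synthetic daily rainfall depths (mm), roughly exponential.
daily = [random.expovariate(1 / 8.0) for _ in range(365 * 30)]

for p in (90, 95, 99):
    short = percentile(daily[:365 * 5], p)   # threshold from a 5-year record
    full = percentile(daily, p)              # threshold from the full 30-year record
    print(p, round(short, 1), round(full, 1))
```

The printed thresholds shift with both the chosen percentile and the record length, which is exactly the ambiguity the abstract raises for percentile-based EPTs.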

  19. Extremely preterm infants small for gestational age are at risk for motor impairment at 3 years corrected age.

    PubMed

    Kato, Takeshi; Mandai, Tsurue; Iwatani, Sota; Koda, Tsubasa; Nagasaka, Miwako; Fujita, Kaori; Kurokawa, Daisuke; Yamana, Keiji; Nishida, Kosuke; Taniguchi-Ikeda, Mariko; Tanimura, Kenji; Deguchi, Masashi; Yamada, Hideto; Iijima, Kazumoto; Morioka, Ichiro

    2016-02-01

Few studies have targeted psychomotor development and associated perinatal risk factors in Japanese very low birth weight (VLBW) infants who are severely small for gestational age (SGA). A single-center study was conducted in 104 Japanese VLBW infants who were born preterm, due to maternal, umbilical cord, or placental abnormalities, between 2000 and 2007. Psychomotor development, expressed as a developmental quotient (DQ), was assessed using the Kyoto Scale of Psychological Development at 3 years corrected age. Severe SGA was defined as a birth weight or length more than 2 standard deviations below the mean for the same gestational age. VLBW infants were divided into 2 subgroups based on gestational age at birth: ⩾28 weeks (n=64) and <28 weeks (n=40). DQs of infants with severe SGA were compared with those of infants who were appropriate for gestational age (AGA). Factors associated with developmental disabilities in VLBW infants with severe SGA (n=23) were determined. In the group born at ⩾28 weeks gestation, infants with severe SGA had normal DQ values that did not differ significantly from those of AGA infants. However, in the group born at <28 weeks gestation, severe SGA infants had significantly lower postural-motor DQ values than AGA infants. Gestational age <28 weeks was an independent factor for low postural-motor DQ, regardless of the cause of severe SGA or pregnancy termination. Extremely preterm newborns with severe SGA are at risk of motor developmental disability at age 3 years. Copyright © 2015 The Japanese Society of Child Neurology. Published by Elsevier B.V. All rights reserved.

  20. Entropy-based derivation of generalized distributions for hydrometeorological frequency analysis

    NASA Astrophysics Data System (ADS)

    Chen, Lu; Singh, Vijay P.

    2018-02-01

Frequency analysis of hydrometeorological and hydrological extremes is needed for the design of hydraulic and civil infrastructure facilities as well as for water resources management. A multitude of distributions have been employed for frequency analysis of these extremes. However, no single distribution has been accepted as a global standard. Employing entropy theory, this study derived five generalized distributions for frequency analysis, using different kinds of information encoded as constraints. These distributions were the generalized gamma (GG), the generalized beta distribution of the second kind (GB2), and the Halphen type A (Hal-A), Halphen type B (Hal-B), and Halphen type inverse B (Hal-IB) distributions; the GG and GB2 distributions were previously derived by Papalexiou and Koutsoyiannis (2012), while the Halphen family is derived using entropy theory for the first time in this paper. Entropy theory allowed the parameters of the distributions to be estimated in terms of the constraints used for their derivation. The distributions were tested using extreme daily and hourly rainfall data. Results show that the root mean square error (RMSE) values were very small, indicating that the five generalized distributions fitted the extreme rainfall data well. Among them, according to the Akaike information criterion (AIC) values, the GB2 and the Halphen family generally gave a better fit. These generalized distributions are therefore among the best choices for frequency analysis. The entropy-based derivation opens a new way for frequency analysis of hydrometeorological extremes.
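The AIC-based comparison of candidate distributions works as sketched below. For simplicity the sketch compares two closed-form maximum-likelihood fits (exponential vs. normal) on synthetic right-skewed "extremes", rather than the GB2 or Halphen fits of the paper, which require numerical optimisation; the principle AIC = 2k − 2 ln L is the same:

```python
import math
import random

def aic(loglik, k):
    """Akaike information criterion: 2k - 2 ln L; lower is better."""
    return 2 * k - 2 * loglik

random.seed(3)
# Synthetic skewed sample standing in for extreme rainfall depths.
data = [random.expovariate(1 / 50.0) for _ in range(500)]
n = len(data)

# Exponential MLE: scale = sample mean (1 parameter).
mean = sum(data) / n
ll_exp = sum(-math.log(mean) - x / mean for x in data)

# Normal MLE: sample mean and (biased) variance (2 parameters).
var = sum((x - mean) ** 2 for x in data) / n
ll_norm = sum(-0.5 * math.log(2 * math.pi * var) - (x - mean) ** 2 / (2 * var)
              for x in data)

print(aic(ll_exp, 1), aic(ll_norm, 2))  # the exponential fit should win here
```

The same scoring applies to any fitted family: compute the maximised log-likelihood, penalise by the parameter count, and prefer the smallest AIC.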

  1. Associations Between Measures of Balance and Lower-Extremity Muscle Strength/Power in Healthy Individuals Across the Lifespan: A Systematic Review and Meta-Analysis.

    PubMed

    Muehlbauer, Thomas; Gollhofer, Albert; Granacher, Urs

    2015-12-01

It has frequently been reported that balance and lower-extremity muscle strength/power are associated with sports-related and everyday activities. Knowledge about the relationships between balance, strength, and power is important for the identification of at-risk individuals, because deficits in these neuromuscular components are associated with an increased risk of sustaining injuries and falls. In addition, this knowledge is highly relevant for the development of specifically tailored health- and skill-related exercise programs. The objectives of this systematic literature review and meta-analysis were to characterize and, if possible, quantify associations between variables of balance and lower-extremity muscle strength/power in healthy individuals across the lifespan. A computerized systematic literature search was performed in the electronic databases PubMed, Web of Science, and SPORTDiscus up to March 2015 to capture all relevant articles. A systematic approach was used to evaluate the 996 articles identified for initial review. Studies were included only if they investigated healthy individuals aged ≥6 years and tested at least one measure of static steady-state balance (e.g., center of pressure [CoP] displacement during one-legged stance), dynamic steady-state balance (e.g., gait speed), proactive balance (e.g., distance in the functional-reach test), or reactive balance (e.g., CoP displacement during perturbed one-legged stance), and one measure of maximal strength (e.g., maximum voluntary contraction), explosive force (e.g., rate of force development), or muscle power (e.g., jump height). In total, 37 studies met the inclusion criteria for review.
The included studies were coded for the following criteria: age (i.e., children: 6-12 years, adolescents: 13-18 years, young adults: 19-44 years, middle-aged adults: 45-64 years, old adults: ≥65 years), sex (i.e., female, male), and test modality/outcome (i.e., test for the assessment of balance, strength, and power). Studies with athletes, patients, and/or people with diseases were excluded. Pearson's correlation coefficients were extracted, transformed (i.e., Fisher's z-transformed r_z values), aggregated (i.e., weighted mean r_z value), back-transformed to r values, classified according to their magnitude (i.e., small: r ≤ 0.69, medium: r ≤ 0.89, large: r ≥ 0.90), and, if possible, statistically compared. Heterogeneity between studies was assessed using I² and Chi-squared (χ²) statistics. Three studies examined associations between balance and lower-extremity muscle strength/power in children, one study in adolescents, nine studies in young adults, three studies in middle-aged adults, and 23 studies in old adults. Overall, small-sized associations were found between variables of balance and lower-extremity muscle strength/power, irrespective of the age group considered. In addition, small-sized but significantly larger correlation coefficients were found between measures of dynamic steady-state balance and maximal strength in children (r = 0.57) compared with young (r = 0.09, z = 3.30, p = 0.001) and old adults (r = 0.35, z = 2.94, p = 0.002), as well as in old compared with young adults (z = 1.95, p = 0.03). Even though the reported results provided further insight into the associations between measures of balance and lower-extremity muscle strength/power, they did not allow us to deduce cause-and-effect relations. Further, the investigated associations could be biased by other variables such as joint flexibility, muscle mass, and/or auditory/visual acuity.
Our systematic review and meta-analysis showed predominantly small-sized correlations between measures of balance and lower-extremity muscle strength/power in children, adolescents, and young, middle-aged, and old adults. This indicates that these neuromuscular components are largely independent of each other and should therefore be tested and trained complementarily across the lifespan. Significantly larger, but still small-sized, associations were found between measures of dynamic steady-state balance and maximal strength in children compared with young and old adults, as well as in old compared with young adults. These findings imply that age/maturation may have an impact on the association between selected components of balance and lower-extremity muscle strength.
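The aggregation pipeline described in the methods (Fisher z-transform each r, weight by n − 3, average, back-transform) can be sketched in a few lines. The (r, n) study values below are invented for illustration, not taken from the review:

```python
import math

def pooled_r(studies):
    """Fixed-effect pooled correlation via Fisher's z (weights n_i - 3)."""
    num = sum((n - 3) * math.atanh(r) for r, n in studies)
    den = sum(n - 3 for _, n in studies)
    return math.tanh(num / den)

# Hypothetical studies: (correlation, sample size) pairs.
studies = [(0.57, 40), (0.09, 120), (0.35, 80)]
print(round(pooled_r(studies), 3))
```

Averaging in z-space rather than r-space is what keeps the pooled estimate unbiased near the ±1 boundaries, which is why the review transforms before aggregating.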

  2. Individual Traits, Personal Values, and Conflict Resolution in an Isolated, Confined, Extreme Environment.

    PubMed

    Corneliussen, Jesper G; Leon, Gloria R; Kjærgaard, Anders; Fink, Birgit A; Venables, Noah C

    2017-06-01

The study of personality traits, personal values, and the emergence of conflicts within groups performing in an isolated, confined, and extreme (ICE) environment may provide insights helpful for the composition and support of space crews on long-duration missions. Ten Danish military personnel, deployed to stations in Greenland on a 26-mo staggered rotation, were studied pre/post deployment and over the 2-yr period of the investigation. Subjects completed the NEO PI-R, Triarchic Psychopathy Measure, and Portrait Values Questionnaire, and participated in structured interviews. During deployment, questionnaires were completed biweekly and a cognitive function test was completed once a month. Personality findings indicated a generally well-adjusted group, above average in positive personality traits [Conscientiousness T-score = 59.4 (11.41); Agreeableness T-score = 54.4 (9.36)] and boldness. The personal values of benevolence and self-direction were rated highly. The decision of when to "pick sides" and intervene during disagreements between group members was viewed as an important component of conflict resolution. There were no changes in positive/negative affect or cognitive function over the annual light/dark cycle. The personal values of the group members appear highly compatible with living in a small-group ICE environment for an extended period. Disagreements between group members affect the functioning of the entire group, particularly with regard to decisions on whether to support one of the individuals or let the argument run its course. Extended training in strategies for conflict resolution is needed in planning for future long-duration missions, to avoid fault lines forming within the group. Corneliussen JG, Leon GR, Kjærgaard A, Fink BA, Venables NC. Individual traits, personal values, and conflict resolution in an isolated, confined, extreme environment. Aerosp Med Hum Perform. 2017; 88(6):535-543.

  3. Extreme dissolved oxygen variability in urbanised tropical wetlands: The need for detailed monitoring to protect nursery ground values

    NASA Astrophysics Data System (ADS)

    Dubuc, Alexia; Waltham, Nathan; Malerba, Martino; Sheaves, Marcus

    2017-11-01

Little is known about the levels of dissolved oxygen that fish are exposed to daily in the typical urbanised tropical wetlands found along the Great Barrier Reef coastline. This study investigates diel dissolved oxygen (DO) dynamics in one such urbanised wetland in tropical North Queensland, Australia. High-frequency data loggers (DO, temperature, depth) were deployed for several days over the summer months in different tidal pools and channels that fish use as temporary or permanent refuges. DO was extremely variable over a 24-h cycle and across the small-scale wetland. The high spatial and temporal DO variability measured was affected by time of day and tidal factors, namely water depth, tidal range, and tidal direction (flood vs ebb). For the duration of the logging period, DO was mainly above the adopted threshold for hypoxia (50% saturation); however, for around 11% of the time, and on almost every logging day, DO values fell below the threshold, including a severe hypoxic event (<5% saturation) that continued for several hours. Fish still use this wetland intensively, so they must be able to cope with low-DO periods. Despite the ability of fish to tolerate extreme conditions, continuing urban expansion is likely to lead to further water quality degradation and thus potential loss of nursery ground value. There is a substantial discontinuity between the recommended DO values in the Australian and New Zealand Guidelines for Fresh and Marine Water Quality and the values observed in this wetland, highlighting the limited value of these guidelines for management purposes. Local and regional high-frequency data monitoring programs, in conjunction with local exposure-risk studies, are needed to underpin the development of management that will ensure the sustainability of coastal wetlands.

  4. Advanced Artificial Dielectric Materials for Millimeter Wavelength Applications.

    DTIC Science & Technology

    1984-10-26

…extreme, it reaches a finite value, α′ = −3/8π, corresponding to the magnetic behavior of a superconducting sphere [20]. …dispersion of small magnetic particles, one must proceed carefully in accounting for the various demagnetizing effects, internal and external fields, etc. … Artificial Dielectrics, Induced Magnetic Permeability, Properties of Heterogeneous Media, Fine Powder Preparation

  5. Nanomaterials Commercialization Center

    DTIC Science & Technology

    2013-02-01

…turbine manufacturer). In the wind energy area, customers clearly stated that the major short-term technical need for toughening is in the area of… interactions: • The wind energy composites market for turbine blades is an extremely high-growth, high-potential opportunity. Potential value of nano… Wire Takeup System (MTS), with a winding pitch modified to meet the needs of the small-diameter wire (~100 μm) produced in this reel-to-reel line

  6. Does dance-based therapy increase gait speed in older adults with chronic lower extremity pain: a feasibility study.

    PubMed

    Krampe, Jean; Wagner, Joanne M; Hawthorne, Kelly; Sanazaro, Deborah; Wong-Anuchit, Choochart; Budhathoki, Chakra; Lorenz, Rebecca A; Raaf, Soren

    2014-01-01

A decreased gait speed in older adults can lead to dependency when individuals are no longer able to participate in activities or do things for themselves. Thirty-seven senior apartment residents (31 female; mean age = 80.6 years; SD = 8.9) with lower extremity pain/stiffness participated in a feasibility and preliminary efficacy study of 12 weeks (24 sessions) of Healthy-Steps dance therapy compared to a wait-list control group. Small improvements in gait speed (effect size [ES] = 0.33) were noted for participants completing 19-24 dance sessions. The improvement in gait speed measured by a 10 Meter Walk Test (0.0517 m/s) exceeded 0.05 m/s, a value deemed meaningful in community-dwelling older adults. These feasibility study findings support the need for additional research using dance-based therapy for older adults with lower extremity pain. Copyright © 2014 Elsevier Inc. All rights reserved.

  7. Small black holes and near-extremal CFTs

    DOE PAGES

    Benjamin, Nathan; Dyer, Ethan; Fitzpatrick, A. Liam; ...

    2016-08-02

Pure theories of AdS3 quantum gravity are conjectured to be dual to CFTs with sparse spectra of light primary operators. The sparsest possible spectrum consistent with modular invariance includes only black hole states above the vacuum. Witten conjectured the existence of a family of extremal CFTs, which realize this spectrum for all admissible values of the central charge. We consider the quantum corrections to the classical spectrum, and propose a specific modification of Witten's conjecture which takes into account the existence of "small" black hole states. These have zero classical horizon area, with a calculable entropy attributed solely to loop effects. Lastly, our conjecture passes various consistency checks, especially when generalized to include theories with supersymmetry. In theories with N = 2 supersymmetry, this "near-extremal CFT" proposal precisely evades the no-go results of Gaberdiel et al.

  8. Combining Radar and Daily Precipitation Data to Estimate Meaningful Sub-daily Precipitation Extremes

    NASA Astrophysics Data System (ADS)

    Pegram, G. G. S.; Bardossy, A.

    2016-12-01

Short-duration extreme rainfalls are important for design. The purpose of this presentation is not to improve the day-by-day estimation of precipitation, but to obtain reasonable statistics for the sub-daily extremes at gauge locations. We are interested specifically in daily and sub-daily extreme values of precipitation at gauge locations. We do not employ the common procedure of using the time series of a control station to determine missing data values at a target; we are interested in individual rare events, not sequences. The idea is to use radar to disaggregate daily totals into sub-daily amounts. In South Africa, an S-band radar operated relatively continuously at Bethlehem from 1998 to 2003; its scan at 1.5 km above ground [CAPPI] overlapped a dense (10 km spacing) set of 45 pluviometers recording over the same 6-year period. Using this valuable set of data, and because we are only interested in rare extremes, small to medium values of rainfall depth were neglected, leaving 12 days of ranked daily maxima in each set per year, whose sum typically comprised about 50% of each annual rainfall total. The method presented here uses radar to disaggregate daily gauge totals into sub-daily intervals down to 15 minutes, in order to extract the maxima of sub-hourly through to daily rainfall at each of 37 selected radar pixels [1 km square in plan], each of which contained one of the 45 pluviometers not masked out by the radar footprint. The pluviometer data were aggregated to daily totals, to act as if they were daily read gauges; their only other task was to help in the cross-validation exercise. The extrema were obtained as quantiles by ordering the 12 daily maxima of each interval per year. The unusual and novel goal was not to reproduce the precipitation matched in space and time, but to obtain frequency distributions of the gauge and radar extremes by matching their ranks, which we found to be stable and meaningful in cross-validation tests.
We provide and compare a range of different methodologies to enable reasonable estimation of subdaily extremes using radar and daily precipitation observations.
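The rank-matching step described above, pairing the k-th largest gauge maximum with the k-th largest radar-derived maximum, is essentially quantile matching. A minimal sketch with hypothetical values standing in for one year of ranked daily maxima:

```python
def match_by_rank(gauge_maxima, radar_maxima):
    """Pair the k-th largest gauge value with the k-th largest radar value."""
    return list(zip(sorted(gauge_maxima, reverse=True),
                    sorted(radar_maxima, reverse=True)))

# Hypothetical 12 daily maxima (mm) at one pixel/gauge pair -- illustration only.
gauge = [41, 87, 33, 120, 55, 62, 30, 95, 48, 71, 38, 105]
radar = [50, 92, 29, 131, 60, 58, 27, 99, 45, 80, 35, 110]
for g, r in match_by_rank(gauge, radar)[:3]:
    print(g, r)  # the three largest rank-matched pairs
```

Matching ranks rather than timestamps is what makes the derived frequency distributions robust to small timing and placement errors between radar and gauge.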

  9. Why anthropic reasoning cannot predict Lambda.

    PubMed

    Starkman, Glenn D; Trotta, Roberto

    2006-11-17

    We revisit anthropic arguments purporting to explain the measured value of the cosmological constant. We argue that different ways of assigning probabilities to candidate universes lead to totally different anthropic predictions. As an explicit example, we show that weighting different universes by the total number of possible observations leads to an extremely small probability for observing a value of Lambda equal to or greater than what we now measure. We conclude that anthropic reasoning within the framework of probability as frequency is ill-defined and that in the absence of a fundamental motivation for selecting one weighting scheme over another the anthropic principle cannot be used to explain the value of Lambda, nor, likely, any other physical parameters.

  10. Stress-induced phase sensitivity of small diameter polarization maintaining solid-core photonic crystal fibre

    NASA Astrophysics Data System (ADS)

    Zhang, Zhihao; Zhang, Chunxi; Xu, Xiaobin

    2017-09-01

Small-diameter (cladding and coating diameters of 100 and 135 μm) polarization-maintaining photonic crystal fibres (SDPM-PCFs) possess many unique properties and are extremely suitable for applications in fibre optic gyroscopes. In this study, we have investigated and measured the stress characteristics of an SDPM-PCF using the finite-element method and a Mach-Zehnder interferometer, respectively. Our results reveal a radial and axial sensitivity of 0.315 ppm/(N/m) and 25.2 ppm per 1 × 10⁵ N/m², respectively, for the SDPM-PCF. These values are 40% smaller than the corresponding parameters of conventional small-diameter (cladding and coating diameters of 80 and 135 μm) panda fibres.

  11. Measurements of characteristic parameters of extremely small cogged wheels with low module by means of low-coherence interferometry

    NASA Astrophysics Data System (ADS)

    Pakula, Anna; Tomczewski, Slawomir; Skalski, Andrzej; Biało, Dionizy; Salbut, Leszek

    2010-05-01

This paper presents a novel application of low-coherence interferometry (LCI) to the measurement of characteristic parameters, such as circular pitch, foot diameter, and head diameter, of extremely small cogged wheels (wheel diameter below 3 mm and module m = 0.15) produced from metal and ceramics. The most interesting issues concerning small-diameter cogged wheels arise during their production: the characteristic parameters of a wheel depend strongly on the manufacturing process, and when inspecting small-diameter wheels, the shrinkage during casting varies with slight changes in the fabrication process. The paper describes an LCI Twyman-Green interferometric setup with a pigtailed high-power light-emitting diode for cogged wheel measurement. Owing to its relatively large field of view, the whole wheel can be examined in one measurement, without the need for numerical stitching. A special binarization algorithm was developed and successfully applied for measuring the characteristic parameters of small cogged wheels. Finally, the head and foot diameters of two cogged wheels measured with the proposed LCI setup are presented and compared with results obtained with a commercial optical profiler. The injection moulds used to fabricate the measured cogged wheels were also examined, and the shrinkage of the cogged wheels is calculated from the obtained results. The proposed method is suitable for comprehensive measurements of small-diameter cogged wheels with low module, especially as there are no measurement standards for such objects.

  12. Bioethics and Public Health Collaborate to Reveal Impacts of Climate Change on Caribbean Life

    NASA Astrophysics Data System (ADS)

    Macpherson, C.; Akpinar-Elci, M.

    2011-12-01

Interdisciplinary dialog and collaboration aimed at protecting health against climate change are impeded by the small number of scientists and health professionals skilled in interdisciplinary work, and by the view held by many that "climate change won't affect me personally". These challenges may be surmounted by discussions about the lived experience of climate change and how it threatens things we value. Dialog between bioethics and public health generated an innovative collaboration using the focus group method. The main limitation of focus groups is the small number of participants; however, the data obtained are generalizable to wider groups and are used regularly in business to enhance marketing strategies. Caribbean academicians from varied disciplines discussed how climate change affects them and life in the Caribbean. Caribbean states are particularly vulnerable to climate change because their large coastal areas are directly exposed to rising sea levels and their development relies heavily on foreign aid. The Caribbean comprises about half of the 39 members of the Alliance of Small Island States (AOSIS), and small island states comprise about 5% of the global population [1]. Participants described socioeconomic and environmental changes in the Caribbean that they attribute to climate change. These include extreme weather, unusual rain and drought, drying rivers, beach erosion, declining fish catches, and others. The session exposed impacts on individuals, businesses, agriculture, and disaster preparedness. These data help to reframe climate change as a personal reality rather than a vague future concern. They are relevant to the design, implementation, and sustainability of climate policies in the Caribbean and perhaps other small island states. The method and interdisciplinary approach can be used in other settings to elicit dialog about experiences and values across sectors, and to inform policies.
Those who have experienced extreme weather are more concerned about climate change than others [2] and no expertise is needed to discuss such experiences or related values. These are accessible concepts in all disciplines and across socioeconomic levels. Research to further identify and describe values challenged by climate change is needed and can be communicated across disciplines and to the public. The resultant dialog will facilitate interdisciplinary collaboration, public and political debate, and possibly generate behavior change. References 1. Alliance of Small Island States (AOSIS). Accessed July 6, 2011. http://aosis.info/members-and-observers/ 2. Spence A., Poortinga W., Butler C., Pidgeon N.F. Perceptions of climate change and willingness to save energy related to flood experience. Nature Climate Change. March 2011. Accessed July 6, 2011. http://www.nature.com/nclimate/journal/vaop/ncurrent/full/nclimate1059.html

  13. How morphometric characteristics affect flow accumulation values

    NASA Astrophysics Data System (ADS)

    Farek, Vladimir

    2014-05-01

Remote sensing methods (such as airborne LIDAR surveys, land-use mapping, etc.) are becoming ever more available and accurate, while in-situ surveying remains expensive. Especially in small, anthropogenically uninfluenced catchments with a poor or non-existent surveying network, remote sensing methods can be extremely useful. Overland flow accumulation (FA) values are important indicators of heightened exposure to flash floods or soil erosion. This value gives the number of cells of the Digital Elevation Model (DEM) grid that drain to each point of the catchment. This contribution deals with the relations between basic geomorphological and morphometric characteristics (such as the hypsometric integral, the Melton index of a subcatchment, etc.) and FA values. These relations are studied in the rocky sandstone landscapes of the České Švýcarsko National Park, with its particular occurrence of broken relief. All calculations are based on the high-resolution LIDAR DEM named Genesis, created by TU Dresden. The main computational platform is GRASS GIS. The goal of the paper is to provide a quick method, or indicators, for estimating which small subcatchments are threatened by heightened flash flood or soil erosion risk, without the need for sophisticated rainfall-runoff models. Catchments can easily be split into small subcatchments (or an existing subdivision used), their basic characteristics computed, and, with knowledge of the links between these characteristics and FA values, the subcatchments potentially threatened by flash floods or soil erosion identified.
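FA as defined in the abstract (the number of DEM cells draining to each point) can be illustrated with a minimal D8 routine on a toy grid. This is a sketch of the concept only, not the GRASS GIS implementation, and the tiny DEM is invented:

```python
def d8_flow_accumulation(dem):
    """D8 flow accumulation on a rectangular grid: each cell drains to its
    lowest of 8 neighbours; FA(cell) = number of upstream cells passing through."""
    rows, cols = len(dem), len(dem[0])
    acc = [[0] * cols for _ in range(rows)]
    # Visit cells from highest to lowest elevation, so every cell's upstream
    # count is final before it is passed downslope.
    order = sorted(((dem[r][c], r, c) for r in range(rows) for c in range(cols)),
                   reverse=True)
    for z, r, c in order:
        neighbours = [(dem[rr][cc], rr, cc)
                      for rr in range(max(0, r - 1), min(rows, r + 2))
                      for cc in range(max(0, c - 1), min(cols, c + 2))
                      if (rr, cc) != (r, c)]
        zn, rn, cn = min(neighbours)
        if zn < z:  # pass this cell's upstream count (plus itself) downslope
            acc[rn][cn] += acc[r][c] + 1
        # otherwise the cell is a pit/outlet and retains the flow
    return acc

# Toy DEM: a valley draining toward the bottom-right corner.
dem = [[9, 8, 7],
       [8, 6, 5],
       [7, 5, 1]]
for row in d8_flow_accumulation(dem):
    print(row)  # the outlet cell accumulates all 8 upstream cells
```

High FA cells trace the drainage lines; comparing their counts against catchment morphometry is the kind of screening the contribution proposes.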

  14. Min and Max Exponential Extreme Interval Values and Statistics

    ERIC Educational Resources Information Center

    Jance, Marsha; Thomopoulos, Nick

    2009-01-01

The extreme interval values and statistics (expected value, median, mode, standard deviation, and coefficient of variation) for the smallest (min) and largest (max) values of exponentially distributed variables with parameter λ = 1 are examined for different observation (sample) sizes. An extreme interval value g_a is defined as a…
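For the exponential case with rate 1, the min and max have simple closed-form expected values: the minimum of n independent Exp(1) variables is itself Exp(n), so E[min] = 1/n, while E[max] equals the harmonic number H_n by the memoryless "record gaps" argument. A quick exact check in stdlib Python:

```python
from fractions import Fraction

def expected_min(n):
    """E[min of n iid Exp(1)]: the minimum is Exp(n), so the mean is 1/n."""
    return Fraction(1, n)

def expected_max(n):
    """E[max of n iid Exp(1)] = H_n = 1 + 1/2 + ... + 1/n."""
    return sum(Fraction(1, k) for k in range(1, n + 1))

print(expected_min(5), expected_max(5))  # 1/5 and 137/60
```

The same order-statistic reasoning underlies the tabulated statistics for other sample sizes.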

  15. Chironomid midges (Diptera, chironomidae) show extremely small genome sizes.

    PubMed

    Cornette, Richard; Gusev, Oleg; Nakahara, Yuichi; Shimura, Sachiko; Kikawada, Takahiro; Okuda, Takashi

    2015-06-01

Chironomid midges (Diptera; Chironomidae) are found in environments ranging from the high Arctic to the Antarctic, including temperate and tropical regions. In many freshwater habitats, members of this family are among the most abundant invertebrates. In the present study, the genome sizes of 25 chironomid species were determined by flow cytometry, and the resulting C-values ranged from 0.07 to 0.20 pg DNA (i.e. from about 68 to 195 Mbp). These genome sizes were uniformly very small and included, to our knowledge, the smallest genome sizes recorded to date among insects. A small proportion of transposable elements and short introns were suggested to contribute to the reduction of genome size in chironomids. We discuss the possible developmental and physiological advantages of having a small genome size and the putative implications for the ecological success of the family Chironomidae.
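The pg-to-Mbp conversion implied by the abstract's figures uses the standard factor of roughly 978 Mbp per picogram of DNA:

```python
def pg_to_mbp(pg, mbp_per_pg=978):
    """Convert genome size from picograms of DNA to megabase pairs,
    using the standard approximation 1 pg ~ 978 Mbp."""
    return pg * mbp_per_pg

print(round(pg_to_mbp(0.07)), round(pg_to_mbp(0.20)))  # about 68 and 196 Mbp
```

This reproduces the abstract's "about 68 to 195 Mbp" range from the reported C-values of 0.07-0.20 pg.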

  16. On problems of analyzing aerodynamic properties of blunted rotary bodies with small random surface distortions under supersonic and hypersonic flows

    NASA Astrophysics Data System (ADS)

    Degtyar, V. G.; Kalashnikov, S. T.; Mokin, Yu. A.

    2017-10-01

The paper considers problems in analyzing the aerodynamic properties (ADPs) of re-entry vehicles (RVs) treated as blunted rotary bodies with small random surface distortions. The interrelations between the mathematical simulation of surface distortions, the selection of tools for predicting the ADPs of shaped bodies, the evaluation of different types of ADP variations, and their adaptation for dynamic problems are analyzed. The possibilities of deterministic and probabilistic approaches to the evaluation of ADP variations are considered, and the practical value of the probabilistic approach is demonstrated. Examples of extremal deterministic evaluations of ADP variations for a sphere and a sharp cone are given.

  17. Emergence and space-time structure of lump solution to the (2+1)-dimensional generalized KP equation

    NASA Astrophysics Data System (ADS)

    Tan, Wei; Dai, Houping; Dai, Zhengde; Zhong, Wenyong

    2017-11-01

    A periodic breather-wave solution is obtained using the homoclinic test approach and Hirota's bilinear method with a small perturbation parameter u0 for the (2+1)-dimensional generalized Kadomtsev-Petviashvili equation. From the periodic breather-wave, a lump solution emerges in the limit. Finally, three different forms of the space-time structure of the lump solution are investigated and discussed using extreme value theory.

  18. Diagnosing ion-beam targets, data acquisition, reactor conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mendel, Jr., C. W.

    1982-01-01

    The final lecture will discuss diagnostics of the target. These are very difficult because of the short times, small spatial extent, and extreme values of temperature and pressure. Diagnostics for temperature, density profile, and neutron production will be discussed. A few minutes will be devoted to data acquisition needs. The lecture will end with a discussion of current areas where improvements are needed and future diagnostics that will be required for reactor conditions.

  19. Linear mode stability of the Kerr-Newman black hole and its quasinormal modes.

    PubMed

    Dias, Óscar J C; Godazgar, Mahdi; Santos, Jorge E

    2015-04-17

    We provide strong evidence that, up to 99.999% of extremality, Kerr-Newman black holes (KNBHs) are linear mode stable within Einstein-Maxwell theory. We derive and solve, numerically, a coupled system of two partial differential equations for two gauge invariant fields that describe the most general linear perturbations of a KNBH. We determine the quasinormal mode (QNM) spectrum of the KNBH as a function of its three parameters and find no unstable modes. In addition, we find that the lowest radial overtone QNMs that are connected continuously to the gravitational ℓ=m=2 Schwarzschild QNM dominate the spectrum for all values of the parameter space (m is the azimuthal number of the wave function and ℓ measures the number of nodes along the polar direction). Furthermore, the (lowest radial overtone) QNMs with ℓ=m approach Reω=mΩH(ext) and Imω=0 at extremality; this is a universal property for any field of arbitrary spin |s|≤2 propagating on a KNBH background (ω is the wave frequency and ΩH(ext) the black hole angular velocity at extremality). We compare our results with available perturbative results in the small charge or small rotation regimes and find good agreement.

  20. The spatial distribution of threats to plant species with extremely small populations

    NASA Astrophysics Data System (ADS)

    Wang, Chunjing; Zhang, Jing; Wan, Jizhong; Qu, Hong; Mu, Xianyun; Zhang, Zhixiang

    2017-03-01

    Many biological conservationists take actions to conserve plant species with extremely small populations (PSESP) in China; however, there have been few studies on the spatial distribution of threats to PSESP. Hence, we selected distribution data of PSESP and made a map of the spatial distribution of threats to PSESP in China. First, we used the weight assignment method to evaluate the threat risk to PSESP at both country and county scales. Second, we used a geographic information system to map the spatial distribution of threats to PSESP, and explored the threat factors based on linear regression analysis. Finally, we suggested some effective conservation options. We found that the PSESP with high values of protection, such as the plants with high scientific research values and ornamental plants, were threatened by over-exploitation and utilization, habitat fragmentation, and a small sized wild population in broad-leaved forests and bush fallows. We also identified some risk hotspots for PSESP in China. Regions with low elevation should be given priority for ex- and in-situ conservation. Moreover, climate change should be considered for conservation of PSESP. To avoid intensive over-exploitation or utilization and habitat fragmentation, in-situ conservation should be practiced in regions with high temperatures and low temperature seasonality, particularly in the high risk hotspots for PSESP that we proposed. Ex-situ conservation should be applied in these same regions, and over-exploitation and utilization of natural resources should be prevented. It is our goal to apply the concept of PSESP to the global scale in the future.

  1. Triggering collective oscillations by three-flavor effects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dasgupta, Basudeb; Raffelt, Georg G.; Tamborra, Irene

    2010-04-01

    Collective flavor transformations in supernovae, caused by neutrino-neutrino interactions, are essentially a two-flavor phenomenon driven by the atmospheric mass difference and the small mixing angle θ13. In the two-flavor approximation, the initial evolution depends logarithmically on θ13 and the system remains trapped in an unstable fixed point for θ13 = 0. However, any effect breaking exact νμ-ντ equivalence triggers the conversion. Such three-flavor perturbations include radiative corrections to weak interactions, small differences between the νμ and ντ fluxes, or nonstandard interactions. Therefore, extremely small values of θ13 are in practice equivalent, the fate of the system depending only on the neutrino spectra and their mass ordering.

  2. A regional peculiarity of the low-latitude lower ionosphere

    NASA Astrophysics Data System (ADS)

    Givishvili, G. V.; Afinogenov, Iu. A.

    1985-02-01

    Experiments performed with the Al method at frequencies of 2.0 and 2.8 MHz on the ship Akademik Kurchatov during March-June 1976 in the Indian Ocean (28 deg N to 18 deg S, 40-79 deg E) revealed an area (Persian Gulf, 28-24 deg N) with a highly unusual diurnal variation of the ionospheric absorption of radio waves. This peculiarity consisted in extremely small prenoon values of absorption; the difference between the prenoon values and the higher postnoon absorption values at the two frequencies used was considerably higher than the measurement error (+ or - 3 dB). It is suggested that this peculiarity was connected with anomalously high rates of recombination processes in the morning hours in the lower ionosphere in this region.

  3. Assessing the features of extreme smog in China and the differentiated treatment strategy

    NASA Astrophysics Data System (ADS)

    Deng, Lu; Zhang, Zhengjun

    2018-01-01

    Extreme smog can have potentially harmful effects on human health, the economy and daily life. However, average (mean) values do not provide strategically useful information for the hazard analysis and control of extreme smog. This article investigates China's smog extremes by applying extreme value analysis to hourly PM2.5 data from 2014 to 2016 obtained from monitoring stations across China. By fitting a generalized extreme value (GEV) distribution to exceedances over a station-specific extreme smog level at each monitoring location, all study stations are grouped into eight different categories based on the estimated mean and shape parameter values of the fitted GEV distributions. The extreme features characterized by the mean of the fitted extreme value distribution, the maximum frequency and the tail index of extreme smog at each location are analysed. These features can provide useful information for central/local government to apply differentiated treatment to cities in different categories, and to set similar prevention goals and control strategies for cities belonging to the same category across a range of areas. Furthermore, hazardous hours, breaking probability and the 1-year return level of each station are demonstrated by category, based on which future control and reduction targets of extreme smog are proposed for the cities of Beijing, Tianjin and Hebei as an example.
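The GEV-fitting step described above can be sketched in a few lines. This is a minimal illustration on synthetic data (the paper's station records are not reproduced), assuming `scipy.stats.genextreme` as the fitting tool; note that scipy's shape parameter `c` corresponds to the negated shape parameter ξ in the usual GEV convention.

```python
import numpy as np
from scipy.stats import genextreme

# Synthetic stand-in for hourly PM2.5 readings: 30 "years" of hourly data.
rng = np.random.default_rng(42)
hourly = rng.gumbel(loc=80.0, scale=25.0, size=(30, 365 * 24))

# Block maxima: the largest reading in each year.
block_maxima = hourly.max(axis=1)

# Fit a GEV distribution to the block maxima.
c, loc, scale = genextreme.fit(block_maxima)

# 10-year return level: the level exceeded on average once per 10 blocks.
return_level_10 = genextreme.ppf(1 - 1 / 10, c, loc=loc, scale=scale)
```

The fitted shape parameter (the tail index in the abstract's terminology) is what distinguishes heavy-tailed from bounded-tail stations, and return levels of this kind underpin the 1-year return levels reported by category.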

  4. Paretian Poisson Processes

    NASA Astrophysics Data System (ADS)

    Eliazar, Iddo; Klafter, Joseph

    2008-05-01

    Many random populations can be modeled as a countable set of points scattered randomly on the positive half-line. The points may represent magnitudes of earthquakes and tornados, masses of stars, market values of public companies, etc. In this article we explore a specific class of such random populations, which we coin 'Paretian Poisson processes'. This class is elemental in statistical physics—connecting together, in a deep and fundamental way, diverse issues including: the Poisson distribution of the Law of Small Numbers; Paretian tail statistics; the Fréchet distribution of Extreme Value Theory; the one-sided Lévy distribution of the Central Limit Theorem; scale-invariance, renormalization and fractality; and resilience to random perturbations.

  5. Evaluation of extreme ionospheric total electron content gradient associated with plasma bubbles for GNSS Ground-Based Augmentation System

    NASA Astrophysics Data System (ADS)

    Saito, S.; Yoshihara, T.

    2017-08-01

    Associated with plasma bubbles, extreme spatial gradients in ionospheric total electron content (TEC) were observed on 8 April 2008 at Ishigaki (24.3°N, 124.2°E, +19.6° magnetic latitude), Japan. The largest gradient was 3.38 TECU km-1 (total electron content unit, 1 TECU = 1016 el m-2), which is equivalent to an ionospheric delay gradient of 540 mm km-1 at the GPS L1 frequency (1.57542 GHz). This value is confirmed by using multiple estimating methods. The observed value exceeds the maximum ionospheric gradient that has ever been observed (412 mm km-1 or 2.59 TECU km-1) to be associated with a severe magnetic storm. It also exceeds the assumed maximum value (500 mm km-1 or 3.08 TECU km-1) which was used to validate the draft international standard for Global Navigation Satellite System (GNSS) Ground-Based Augmentation Systems (GBAS) to support Category II/III approaches and landings. The steepest part of this extreme gradient had a scale size of 5.3 km, and the front-normal velocities were estimated to be 71 m s-1 with a wavefront-normal direction of east-northeastward. The total width of the transition region from outside to inside the plasma bubble was estimated to be 35.3 km. The gradient of relatively small spatial scale size may fall between an aircraft and a GBAS ground subsystem and may be undetectable by both aircraft and ground.
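The TECU-to-delay conversion quoted in the abstract follows from the standard first-order ionospheric term 40.3/f². The following arithmetic check (not code from the paper) reproduces the quoted equivalence of 3.38 TECU km⁻¹ to roughly 540 mm km⁻¹ at the GPS L1 frequency:

```python
GPS_L1_HZ = 1.57542e9  # GPS L1 carrier frequency, Hz

def tec_gradient_to_delay_mm_per_km(grad_tecu_per_km, freq_hz=GPS_L1_HZ):
    # 1 TECU = 1e16 el/m^2; first-order ionospheric delay is 40.3 * TEC / f^2 metres.
    delay_m_per_tecu = 40.3 * 1e16 / freq_hz**2
    return grad_tecu_per_km * delay_m_per_tecu * 1e3  # convert m/km to mm/km

print(tec_gradient_to_delay_mm_per_km(3.38))  # ~549 mm/km, close to the quoted 540
```

The small residual difference from the quoted 540 mm km⁻¹ presumably reflects rounding in the abstract.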

  6. Comparison of the Effects of Walking with and without Nordic Pole on Upper Extremity and Lower Extremity Muscle Activation.

    PubMed

    Shim, Je-Myung; Kwon, Hae-Yeon; Kim, Ha-Roo; Kim, Bo-In; Jung, Ju-Hyeon

    2013-12-01

    [Purpose] The aim of this study was to assess the effect of Nordic pole walking on the electromyographic activities of upper extremity and lower extremity muscles. [Subjects and Methods] The subjects were randomly divided into two groups as follows: a without-Nordic-pole walking group (n=13) and a with-Nordic-pole walking group (n=13). The EMG data were collected while the subjects walked on a treadmill for 30 minutes, measuring from one heel strike to the next. [Results] Both the average values and maximum values of the muscle activity of the upper extremity increased in both the group that used Nordic poles and the group that did not, and the values showed statistically significant differences. There was an increase in the average value for muscle activity of the latissimus dorsi, but the difference was not statistically significant, although there was a statistically significant increase in its maximum value. The average and maximum values for muscle activity of the lower extremity did not show large differences in either group, and the values did not show any statistically significant differences. [Conclusion] The use of Nordic poles increased muscle activity of the upper extremity compared with regular walking but did not affect the lower extremity.

  7. Comparison of the Effects of Walking with and without Nordic Pole on Upper Extremity and Lower Extremity Muscle Activation

    PubMed Central

    Shim, Je-myung; Kwon, Hae-yeon; Kim, Ha-roo; Kim, Bo-in; Jung, Ju-hyeon

    2014-01-01

    [Purpose] The aim of this study was to assess the effect of Nordic pole walking on the electromyographic activities of upper extremity and lower extremity muscles. [Subjects and Methods] The subjects were randomly divided into two groups as follows: a without-Nordic-pole walking group (n=13) and a with-Nordic-pole walking group (n=13). The EMG data were collected while the subjects walked on a treadmill for 30 minutes, measuring from one heel strike to the next. [Results] Both the average values and maximum values of the muscle activity of the upper extremity increased in both the group that used Nordic poles and the group that did not, and the values showed statistically significant differences. There was an increase in the average value for muscle activity of the latissimus dorsi, but the difference was not statistically significant, although there was a statistically significant increase in its maximum value. The average and maximum values for muscle activity of the lower extremity did not show large differences in either group, and the values did not show any statistically significant differences. [Conclusion] The use of Nordic poles increased muscle activity of the upper extremity compared with regular walking but did not affect the lower extremity. PMID:24409018

  8. Can quantile mapping improve precipitation extremes from regional climate models?

    NASA Astrophysics Data System (ADS)

    Tani, Satyanarayana; Gobiet, Andreas

    2015-04-01

    The ability of quantile mapping to accurately bias correct precipitation extremes is investigated in this study. We developed new methods by extending standard quantile mapping (QMα) to improve the quality of bias-corrected extreme precipitation events as simulated by regional climate model (RCM) output. The new QM version (QMβ) was developed by combining parametric and nonparametric bias correction methods. The new nonparametric method is tested with and without a controlling shape parameter (QMβ1 and QMβ0, respectively). Bias corrections are applied to hindcast simulations for a small ensemble of RCMs at six different locations over Europe. We examined the quality of the extremes through split-sample and cross-validation approaches for these three bias correction methods; the split-sample approach mimics the application to future climate scenarios. A cross-validation framework with particular focus on new extremes was developed. Error characteristics, q-q plots and Mean Absolute Error (MAEx) skill scores are used for evaluation. We demonstrate the unstable behaviour of the correction function at higher quantiles with QMα, whereas the correction functions for QMβ0 and QMβ1 are smoother, with QMβ1 providing the most reasonable correction values. The q-q plots demonstrate that all bias correction methods are capable of producing new extremes, but QMβ1 reproduces new extremes with lower biases in all seasons than QMα and QMβ0. Our results clearly demonstrate the inherent limitations of empirical bias correction methods employed for extremes, particularly new extremes, and our findings reveal that the new bias correction method (QMβ1) produces more reliable climate scenarios for new extremes. These findings present a methodology that can better capture future extreme precipitation events, which is necessary to improve regional climate change impact studies.
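The baseline technique the abstract builds on can be illustrated with a minimal empirical quantile-mapping sketch on synthetic data. This corresponds in spirit to the standard nonparametric QMα; the paper's QMβ variants, which add parametric components and a controlling shape parameter for the tail, are not reproduced here. All names and the data below are illustrative assumptions.

```python
import numpy as np

def quantile_map(model_ref, obs_ref, values):
    """Map `values` from the model's distribution onto the observed one:
    look up each value's empirical quantile in the model reference, then
    read off the observed distribution at that same quantile."""
    ranks = np.searchsorted(np.sort(model_ref), values) / len(model_ref)
    ranks = np.clip(ranks, 0.0, 1.0)
    return np.quantile(obs_ref, ranks)

rng = np.random.default_rng(1)
obs = rng.gamma(shape=0.8, scale=6.0, size=20_000)  # "observed" precipitation
model = obs * 1.4 + 1.0                             # biased "RCM" output
corrected = quantile_map(model, obs, model)         # bias-corrected model output
```

The instability the authors report arises exactly where this sketch is weakest: at quantiles beyond the calibration sample, where the empirical correction function must extrapolate.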

  9. Generalised Extreme Value Distributions Provide a Natural Hypothesis for the Shape of Seed Mass Distributions

    PubMed Central

    2015-01-01

    Among co-occurring species, values for functionally important plant traits span orders of magnitude, are uni-modal, and generally positively skewed. Such data are usually log-transformed “for normality” but no convincing mechanistic explanation for a log-normal expectation exists. Here we propose a hypothesis for the distribution of seed masses based on generalised extreme value distributions (GEVs), a class of probability distributions used in climatology to characterise the impact of event magnitudes and frequencies; events that impose strong directional selection on biological traits. In tests involving datasets from 34 locations across the globe, GEVs described log10 seed mass distributions as well or better than conventional normalising statistics in 79% of cases, and revealed a systematic tendency for an overabundance of small seed sizes associated with low latitudes. GEVs characterise disturbance events experienced in a location to which individual species’ life histories could respond, providing a natural, biological explanation for trait expression that is lacking from all previous hypotheses attempting to describe trait distributions in multispecies assemblages. We suggest that GEVs could provide a mechanistic explanation for plant trait distributions and potentially link biology and climatology under a single paradigm. PMID:25830773

  10. Low Thermal Conductivity of Bulk Amorphous Si1- x Ge x Containing Nano-Sized Crystalline Particles Synthesized by Ball-Milling Process

    NASA Astrophysics Data System (ADS)

    Muthusamy, Omprakash; Nishino, Shunsuke; Ghodke, Swapnil; Inukai, Manabu; Sobota, Robert; Adachi, Masahiro; Kiyama, Makato; Yamamoto, Yoshiyuki; Takeuchi, Tsunehiro; Santhanakrishnan, Harish; Ikeda, Hiroya; Hayakawa, Yasuhiro

    2018-06-01

    Amorphous Si0.65Ge0.35 powder containing a small amount of nano-sized crystalline particles was synthesized by means of the mechanical alloying process. Hot pressing for 24 h under the pressure of 400 MPa at 823 K, which is below the crystallization temperature, allowed us to obtain bulk amorphous Si-Ge alloy containing a small amount of nanocrystals. The thermal conductivity of the prepared bulk amorphous Si-Ge alloy was extremely low, showing a magnitude of less than 1.35 Wm-1 K-1 over the entire temperature range from 300 K to 700 K. The sound velocity of longitudinal and transverse waves for the bulk amorphous Si0.65Ge0.35 were measured, and the resulting values were 5841 m/s and 2840 m/s, respectively. The estimated mean free path of phonons was kept at the very small value of ˜ 4.2 nm, which was mainly due to the strong scattering limit of phonons in association with the amorphous structure.

  11. Extreme values in the Chinese and American stock markets based on detrended fluctuation analysis

    NASA Astrophysics Data System (ADS)

    Cao, Guangxi; Zhang, Minjia

    2015-10-01

    This paper focuses on the comparative analysis of extreme values in the Chinese and American stock markets based on the detrended fluctuation analysis (DFA) algorithm, using the daily data of the Shanghai composite index and the Dow Jones Industrial Average. The empirical results indicate that the multifractal detrended fluctuation analysis (MF-DFA) method is more objective than the traditional percentile method. The range of extreme values of the Dow Jones Industrial Average is smaller than that of the Shanghai composite index, and the extreme values of the Dow Jones Industrial Average are more clustered in time. The extreme values of both the Chinese and American stock markets are concentrated in 2008, consistent with the financial crisis of that year. Moreover, we investigate whether extreme events affect the cross-correlation between the Chinese and American stock markets using the multifractal detrended cross-correlation analysis algorithm. The results show that extreme events have nothing to do with the cross-correlation between the Chinese and American stock markets.
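The monofractal core of the DFA algorithm referenced above can be sketched compactly; MF-DFA, as used in the paper, generalizes it by varying a moment order q, which is not reproduced here. This is a bare-bones DFA-1 (linear detrending) on synthetic data: for uncorrelated white noise the scaling exponent α should come out near 0.5.

```python
import numpy as np

def dfa1(x, scales):
    """DFA with linear detrending: return the fluctuation function F(s)."""
    y = np.cumsum(x - x.mean())  # integrate the series into a "profile"
    fluct = []
    for s in scales:
        n = len(y) // s
        segs = y[: n * s].reshape(n, s)  # split profile into segments of length s
        t = np.arange(s)
        # Residuals after removing a least-squares linear trend per segment.
        res = [seg - np.polyval(np.polyfit(t, seg, 1), t) for seg in segs]
        fluct.append(np.sqrt(np.mean(np.square(res))))
    return np.array(fluct)

rng = np.random.default_rng(7)
scales = np.array([8, 16, 32, 64, 128, 256])
F = dfa1(rng.standard_normal(8192), scales)
# Scaling exponent alpha: slope of log F(s) versus log s.
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
```

Persistent (long-range correlated) series give α > 0.5, anti-persistent ones α < 0.5, which is the kind of scaling information the paper exploits when classifying extremes.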

  12. Evidence for extreme Ti-50 enrichments in primitive meteorites

    NASA Technical Reports Server (NTRS)

    Fahey, A.; Mckeegan, K. D.; Zinner, E.; Goswami, J. N.

    1985-01-01

    The results of the first high mass resolution ion microprobe study of Ti isotopic compositions in individual refractory grains from primitive carbonaceous meteorites are reported. One hibonite from the Murray carbonaceous chondrite has a 10 percent excess of Ti-50, 25 times higher than the maximum value previously reported for bulk samples of refractory inclusions from carbonaceous chondrites. The variation of the Ti compositions between different hibonite grains, and among pyroxenes from a single Allende refractory inclusion, indicates isotopic inhomogeneities over small scale lengths in the solar nebula and emphasizes the importance of the analysis of small individual phases. This heterogeneity makes it unlikely that the isotopic anomalies were carried into the solar system in the gas phase.

  13. Convergent chaos

    NASA Astrophysics Data System (ADS)

    Pradas, Marc; Pumir, Alain; Huber, Greg; Wilkinson, Michael

    2017-07-01

    Chaos is widely understood as being a consequence of sensitive dependence upon initial conditions. This is the result of an instability in phase space, which separates trajectories exponentially. Here, we demonstrate that this criterion should be refined. Despite their overall intrinsic instability, trajectories may be very strongly convergent in phase space over extremely long periods, as revealed by our investigation of a simple chaotic system (a realistic model for small bodies in a turbulent flow). We establish that this strong convergence is a multi-facetted phenomenon, in which the clustering is intense, widespread and balanced by lacunarity of other regions. Power laws, indicative of scale-free features, characterize the distribution of particles in the system. We use large-deviation and extreme-value statistics to explain the effect. Our results show that the interpretation of the ‘butterfly effect’ needs to be carefully qualified. We argue that the combination of mixing and clustering processes makes our specific model relevant to understanding the evolution of simple organisms. Lastly, this notion of convergent chaos, which implies the existence of conditions for which uncertainties are unexpectedly small, may also be relevant to the valuation of insurance and futures contracts.

  14. A rational decision rule with extreme events.

    PubMed

    Basili, Marcello

    2006-12-01

    Risks induced by extreme events are characterized by small or ambiguous probabilities, catastrophic losses, or windfall gains. Through a new functional that mimics the restricted Bayes-Hurwicz criterion within the Choquet expected utility approach, it is possible to represent the decision-maker's behavior when facing both risky (large and reliable probability) and extreme (small or ambiguous probability) events. A new formalization of the precautionary principle (PP) is presented, and a new functional, which encompasses both extreme outcomes and the expectation of all possible results for every act, is proposed.

  15. Comparison of design and torque measurements of various manual wrenches.

    PubMed

    Neugebauer, Jörg; Petermöller, Simone; Scheer, Martin; Happe, Arndt; Faber, Franz-Josef; Zoeller, Joachim E

    2015-01-01

    Accurate torque application and determination of the applied torque during surgical and prosthetic treatment is important to reduce complications. A study was performed to determine and compare the accuracy of manual wrenches, which are available in different designs with a large range of preset torques. Thirteen different wrench systems with a variety of preset torques ranging from 10 to 75 Ncm were evaluated. Three different designs were available, with a spring-in-coil or toggle design as an active mechanism or a beam as a passive mechanism, to select the preset torque. To provide a clinically relevant analysis, a total of 1,170 torque measurements in the range of 10 to 45 Ncm were made in vitro using an electronic torque measurement device. The absolute deviations in Ncm and percent deviations across all wrenches were small, with a mean of -0.24 ± 2.15 Ncm and -0.84% ± 11.72% as a shortfall relative to the preset value. The greatest overage was 8.2 Ncm (82.5%), and the greatest shortfall was 8.47 Ncm (46%). However, extreme values were rare, with 95th-percentile values of -1.5% (lower value) and -0.16% (upper value). A comparison with respect to wrench design revealed significantly higher deviations for coil and toggle-style wrenches than for beam wrenches. Beam wrenches were associated with a lower risk of rare extreme values thanks to their passive mechanism of achieving the selected preset torque, which minimizes the risk of harming screw connections.

  16. Valuing happiness is associated with bipolar disorder.

    PubMed

    Ford, Brett Q; Mauss, Iris B; Gruber, June

    2015-04-01

    Although people who experience happiness tend to have better psychological health, people who value happiness to an extreme tend to have worse psychological health, including more depression. We propose that the extreme valuing of happiness may be a general risk factor for mood disturbances, both depressive and manic. To test this hypothesis, we examined the relationship between the extreme valuing of happiness and risk for, diagnosis of, and illness course for bipolar disorder (BD). Supporting our hypothesis, the extreme valuing of happiness was associated with a measure of increased risk for developing BD (Studies 1 and 2), increased likelihood of past diagnosis of BD (Studies 2 and 3), and worse prospective illness course in BD (Study 3), even when controlling for current mood symptoms (Studies 1-3). These findings indicate that the extreme valuing of happiness is associated with and even predicts BD. Taken together with previous evidence, these findings suggest that the extreme valuing of happiness is a general risk factor for mood disturbances. More broadly, what emotions people strive to feel may play a critical role in psychological health. (c) 2015 APA, all rights reserved).

  17. Valuing happiness is associated with bipolar disorder

    PubMed Central

    Ford, Brett Q.; Mauss, Iris B.; Gruber, June

    2015-01-01

    While people who experience happiness tend to have better psychological health, people who value happiness to an extreme tend to have worse psychological health, including more depression. We propose that the extreme valuing of happiness may be a general risk factor for mood disturbances, both depressive and manic. To test this hypothesis, we examined the relationship between the extreme valuing of happiness and risk for, diagnosis of, and illness course for Bipolar Disorder (BD). Supporting our hypothesis, the extreme valuing of happiness was associated with a measure of increased risk for developing BD (Studies 1–2), increased likelihood of past diagnosis of BD (Studies 2–3), and worse prospective illness course in BD (Study 3), even when controlling for current mood symptoms (Studies 1–3). These findings indicate that the extreme valuing of happiness is associated with and even predicts BD. Taken together with previous evidence, these findings suggest that the extreme valuing of happiness is a general risk factor for mood disturbances. More broadly, what emotions people strive to feel may play a critical role in psychological health. PMID:25603134

  18. Operational Risk Measurement of Chinese Commercial Banks Based on Extreme Value Theory

    NASA Astrophysics Data System (ADS)

    Song, Jiashan; Li, Yong; Ji, Feng; Peng, Cheng

    Financial institutions and supervisory institutions agree on the need to strengthen the measurement and management of operational risks. This paper builds a model of operational risk losses based on the Peak Over Threshold model, emphasizing a weighted least squares approach that improves Hill's estimation method. The small-sample situation is discussed, and the sample threshold is fixed more objectively based on media-published data on the operational risk losses of primary banks from 1994 to 2007.
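The Hill estimator that the paper refines can be stated in a few lines. This sketch shows only the plain estimator on synthetic Pareto-tailed "losses"; the paper's weighted-least-squares refinement and its objective threshold selection are not reproduced, and all names below are illustrative.

```python
import numpy as np

def hill(data, k):
    """Hill estimate of the tail index inverse (1/alpha) from the k largest
    order statistics: the mean log-spacing above the (k+1)-th largest value."""
    x = np.sort(np.asarray(data))[::-1]  # descending order
    return float(np.mean(np.log(x[:k]) - np.log(x[k])))

rng = np.random.default_rng(3)
# Pareto(x_m=1, alpha=2) samples as a stand-in for heavy-tailed losses.
losses = 1.0 + rng.pareto(2.0, size=100_000)
gamma_hat = hill(losses, k=2_000)  # should be close to 1/alpha = 0.5
```

The estimator's sensitivity to the choice of k is precisely why the paper's objective threshold-fixing procedure matters: too small a k is noisy, too large a k biases the estimate with non-tail data.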

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lis, Jakub

    In this paper, we investigate the Q-ball Ansatz in the baby Skyrme model. First, the appearance of peakons, i.e. solutions with extremely large absolute values of the second derivative at maxima, is analyzed. It is argued that such solutions are intrinsic to the baby Skyrme model and do not depend on the detailed form of a potential used in calculations. Next, we concentrate on compact nonspinning Q-balls. We show the failure of a small parameter expansion in this case. Finally, we explore the existence and parameter dependence of Q-ball solutions.

  20. Extreme-value dependence: An application to exchange rate markets

    NASA Astrophysics Data System (ADS)

    Fernandez, Viviana

    2007-04-01

    Extreme value theory (EVT) focuses on modeling the tail behavior of a loss distribution using only extreme values rather than the whole data set. For a sample of 10 countries with dirty/free float regimes, we investigate whether paired currencies exhibit a pattern of asymptotic dependence. That is, whether an extremely large appreciation or depreciation in the nominal exchange rate of one country might transmit to another. In general, after controlling for volatility clustering and inertia in returns, we do not find evidence of extreme-value dependence between paired exchange rates. However, for asymptotic-independent paired returns, we find that tail dependency of exchange rates is stronger under large appreciations than under large depreciations.

  1. Exchangeability, extreme returns and Value-at-Risk forecasts

    NASA Astrophysics Data System (ADS)

    Huang, Chun-Kai; North, Delia; Zewotir, Temesgen

    2017-07-01

    In this paper, we propose a new approach to extreme value modelling for the forecasting of Value-at-Risk (VaR). In particular, the block maxima and the peaks-over-threshold methods are generalised to exchangeable random sequences. This caters for the dependencies, such as serial autocorrelation, of financial returns observed empirically. In addition, this approach allows for parameter variations within each VaR estimation window. Empirical prior distributions of the extreme value parameters are attained by using resampling procedures. We compare the results of our VaR forecasts to that of the unconditional extreme value theory (EVT) approach and the conditional GARCH-EVT model for robust conclusions.
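A minimal peaks-over-threshold VaR estimate of the kind generalised above can be sketched as follows, assuming `scipy.stats.genpareto` for the tail fit. The paper's exchangeable-sequence treatment and resampled prior distributions are not reproduced; this is the plain unconditional POT baseline on synthetic heavy-tailed "losses", and it assumes a fitted shape parameter away from zero (the ξ → 0 limit needs a separate formula).

```python
import numpy as np
from scipy.stats import genpareto

def pot_var(losses, u, p=0.99):
    """VaR at level p from a GPD fit to exceedances over threshold u:
    VaR_p = u + (beta/xi) * (((n/n_u) * (1 - p))**(-xi) - 1)."""
    exc = losses[losses > u] - u
    xi, _, beta = genpareto.fit(exc, floc=0.0)  # fix location at 0 for excesses
    n, n_u = len(losses), len(exc)
    return u + (beta / xi) * (((n / n_u) * (1.0 - p)) ** (-xi) - 1.0)

rng = np.random.default_rng(5)
losses = np.abs(rng.standard_t(df=3, size=50_000))  # heavy-tailed "returns"
u = np.quantile(losses, 0.95)                       # 95th-percentile threshold
var_99 = pot_var(losses, u, p=0.99)
```

In the paper's setting this fit would be re-estimated within each rolling VaR window, with the exchangeability assumption governing how the resampled windows are formed.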

  2. Heidelberg Retina Tomography Analysis in Optic Disks with Anatomic Particularities

    PubMed Central

    Alexandrescu, C; Pascu, R; Ilinca, R; Popescu, V; Ciuluvica, R; Voinea, L; Celea, C

    2010-01-01

    Due to its objectivity, reproducibility and predictive value, confirmed by many large-scale statistical clinical studies, Heidelberg Retina Tomography has become one of the most widely used computerized image-analysis methods for the optic disc in glaucoma. It has been signaled, though, that the diagnostic value of Moorfields Regression Analysis and the Glaucoma Probability Score decreases when analyzing optic discs of extreme sizes: the number of false positive results increases in cases of megalopapillae, and the number of false negative results increases in cases of small optic discs. The present paper is a review of the aspects one should take into account when analyzing an HRT result for an optic disc with anatomic particularities. PMID:21254731

  3. Assessment of Coastal and Urban Flooding Hazards Applying Extreme Value Analysis and Multivariate Statistical Techniques: A Case Study in Elwood, Australia

    NASA Astrophysics Data System (ADS)

    Guimarães Nobre, Gabriela; Arnbjerg-Nielsen, Karsten; Rosbjerg, Dan; Madsen, Henrik

    2016-04-01

    Traditionally, flood risk assessment studies have been carried out from a univariate frequency analysis perspective. However, statistical dependence between hydrological variables, such as extreme rainfall and extreme sea surge, is plausible, since both variables are to some extent driven by common meteorological conditions. Aiming to overcome this limitation, multivariate statistical techniques have the potential to combine different sources of flooding in the investigation. The aim of this study was to apply a range of statistical methodologies for analyzing combined extreme hydrological variables that can lead to coastal and urban flooding. The study area is the Elwood Catchment, a highly urbanized catchment located in the city of Port Phillip, Melbourne, Australia. The first part of the investigation dealt with the marginal extreme value distributions. Two approaches to extracting extreme value series were applied (Annual Maximum and Partial Duration Series), and different probability distribution functions were fit to the observed samples. Results obtained using the Generalized Pareto distribution demonstrate the ability of the Pareto family to model the extreme events. Advancing into multivariate extreme value analysis, an investigation regarding the asymptotic properties of extremal dependence was first carried out. As a weak positive asymptotic dependence between the bivariate extreme pairs was found, the conditional method proposed by Heffernan and Tawn (2004) was chosen. This approach is suitable for modelling bivariate extreme values that are relatively unlikely to occur together. The results show that the probability of an extreme sea surge occurring during a one-hour-intensity extreme precipitation event (or vice versa) can be twice as great as would be estimated when assuming independent events. Therefore, presuming independence between these two variables would result in severe underestimation of the flooding risk in the study area.

  4. Arterial blood pressure response to heavy resistance exercise.

    PubMed

    MacDougall, J D; Tuxen, D; Sale, D G; Moroz, J R; Sutton, J R

    1985-03-01

    The purpose of this study was to record the blood pressure response to heavy weight-lifting exercise in five experienced body builders. Blood pressure was directly recorded by means of a capacitance transducer connected to a catheter in the brachial artery. Intrathoracic pressure with the Valsalva maneuver was recorded as mouth pressure by having the subject maintain an open glottis while expiring against a column of Hg during the lifts. Exercises included single-arm curls, overhead presses, and both double- and single-leg presses performed to failure at 80, 90, 95, and 100% of maximum. Systolic and diastolic blood pressures rose rapidly to extremely high values during the concentric contraction phase for each lift and declined with the eccentric contraction. The greatest peak pressures occurred during the double-leg press where the mean value for the group was 320/250 mmHg, with pressures in one subject exceeding 480/350 mmHg. Peak pressures with the single-arm curl exercise reached a mean group value of 255/190 mmHg when repetitions were continued to failure. Mouth pressures of 30-50 Torr during a single maximum lift, or as subjects approached failure with a submaximal weight, indicate that a portion of the observed increase in blood pressure was caused by a Valsalva maneuver. It was concluded that when healthy young subjects perform weight-lifting exercises the mechanical compression of blood vessels combines with a potent pressor response and a Valsalva response to produce extreme elevations in blood pressure. Pressures are extreme even when exercise is performed with a relatively small muscle mass.

  5. Residues of chromium, nickel, cadmium and lead in Rook Corvus frugilegus eggshells from urban and rural areas of Poland.

    PubMed

    Orłowski, Grzegorz; Kasprzykowski, Zbigniew; Dobicki, Wojciech; Pokorny, Przemysław; Wuczyński, Andrzej; Polechoński, Ryszard; Mazgajski, Tomasz D

    2014-08-15

    We examined the concentrations of chromium (Cr), nickel (Ni), cadmium (Cd) and lead (Pb) in Rook Corvus frugilegus eggshells from 43 rookeries situated in rural and urban areas of western (=intensive agriculture) and eastern (=extensive agriculture) Poland. We found small ranges in the overall level of Cr (the difference between the extreme values was 1.8-fold; range of concentrations=5.21-9.40 Cr ppm), Ni (3.5-fold; 1.15-4.07 Ni ppm), and Cd (2.6-fold; 0.34-0.91 Cd ppm), whereas concentrations of Pb varied markedly, i.e. 6.7-fold between extreme values (1.71-11.53 Pb ppm). Eggshell levels of these four elements did not differ between rural rookeries from western and eastern Poland, but eggshells from rookeries in large/industrial cities had significantly higher concentrations of Cr, Ni and Pb than those from small towns and villages. Our study suggests that female Rooks exhibited an apparent variation in the intensity of trace metal bioaccumulation in their eggshells, that rapid site-dependent bioaccumulation of Cu, Cr, Ni and Pb occurs as a result of the pollution gradient (rural

  6. Extreme events in total ozone over Arosa - Part 2: Fingerprints of atmospheric dynamics and chemistry and effects on mean values and long-term changes

    NASA Astrophysics Data System (ADS)

    Rieder, H. E.; Staehelin, J.; Maeder, J. A.; Peter, T.; Ribatet, M.; Davison, A. C.; Stübi, R.; Weihs, P.; Holawe, F.

    2010-10-01

    In this study the frequency of days with extreme low (termed ELOs) and extreme high (termed EHOs) total ozone values and their influence on mean values and trends are analyzed for the world's longest total ozone record (Arosa, Switzerland). The results show (i) an increase in ELOs and (ii) a decrease in EHOs during the last decades and (iii) that the overall trend during the 1970s and 1980s in total ozone is strongly dominated by changes in these extreme events. After removing the extremes, the time series shows a strongly reduced trend (reduction by a factor of 2.5 for trend in annual mean). Excursions in the frequency of extreme events reveal "fingerprints" of dynamical factors such as ENSO or NAO, and chemical factors, such as cold Arctic vortex ozone losses, as well as major volcanic eruptions of the 20th century (Gunung Agung, El Chichón, Mt. Pinatubo). Furthermore, atmospheric loading of ozone depleting substances leads to a continuous modification of column ozone in the Northern Hemisphere also with respect to extreme values (partly again in connection with polar vortex contributions). Application of extreme value theory allows the identification of many more such "fingerprints" than conventional time series analysis of annual and seasonal mean values. The analysis shows in particular the strong influence of dynamics, revealing that even moderate ENSO and NAO events have a discernible effect on total ozone. Overall the approach to extremal modelling provides new information on time series properties, variability, trends and the influence of dynamics and chemistry, complementing earlier analyses focusing only on monthly (or annual) mean values.

  7. Extreme events in total ozone over Arosa - Part 2: Fingerprints of atmospheric dynamics and chemistry and effects on mean values and long-term changes

    NASA Astrophysics Data System (ADS)

    Rieder, H. E.; Staehelin, J.; Maeder, J. A.; Peter, T.; Ribatet, M.; Davison, A. C.; Stübi, R.; Weihs, P.; Holawe, F.

    2010-05-01

    In this study the frequency of days with extreme low (termed ELOs) and extreme high (termed EHOs) total ozone values and their influence on mean values and trends are analyzed for the world's longest total ozone record (Arosa, Switzerland). The results show (a) an increase in ELOs and (b) a decrease in EHOs during the last decades and (c) that the overall trend during the 1970s and 1980s in total ozone is strongly dominated by changes in these extreme events. After removing the extremes, the time series shows a strongly reduced trend (reduction by a factor of 2.5 for trend in annual mean). Excursions in the frequency of extreme events reveal "fingerprints" of dynamical factors such as ENSO or NAO, and chemical factors, such as cold Arctic vortex ozone losses, as well as major volcanic eruptions of the 20th century (Gunung Agung, El Chichón, Mt. Pinatubo). Furthermore, atmospheric loading of ozone depleting substances leads to a continuous modification of column ozone in the Northern Hemisphere also with respect to extreme values (partly again in connection with polar vortex contributions). Application of extreme value theory allows the identification of many more such "fingerprints" than conventional time series analysis of annual and seasonal mean values. The analysis shows in particular the strong influence of dynamics, revealing that even moderate ENSO and NAO events have a discernible effect on total ozone. Overall the approach to extremal modelling provides new information on time series properties, variability, trends and the influence of dynamics and chemistry, complementing earlier analyses focusing only on monthly (or annual) mean values.

  8. Genetic diversity and genomic resources available for the small millet crops to accelerate a New Green Revolution.

    PubMed

    Goron, Travis L; Raizada, Manish N

    2015-01-01

    Small millets are nutrient-rich food sources traditionally grown and consumed by subsistence farmers in Asia and Africa. They include finger millet (Eleusine coracana), foxtail millet (Setaria italica), kodo millet (Paspalum scrobiculatum), proso millet (Panicum miliaceum), barnyard millet (Echinochloa spp.), and little millet (Panicum sumatrense). Local farmers value the small millets for their nutritional and health benefits, tolerance to extreme stress including drought, and ability to grow under low nutrient input conditions, ideal in an era of climate change and steadily depleting natural resources. Little scientific attention has been paid to these crops, hence they have been termed "orphan cereals." Despite this challenge, an advantageous quality of the small millets is that they continue to be grown in remote regions of the world which has preserved their biodiversity, providing breeders with unique alleles for crop improvement. The purpose of this review, first, is to highlight the diverse traits of each small millet species that are valued by farmers and consumers which hold potential for selection, improvement or mechanistic study. For each species, the germplasm, genetic and genomic resources available will then be described as potential tools to exploit this biodiversity. The review will conclude with noting current trends and gaps in the literature and make recommendations on how to better preserve and utilize diversity within these species to accelerate a New Green Revolution for subsistence farmers in Asia and Africa.

  9. Simulated trends of extreme climate indices for the Carpathian basin using outputs of different regional climate models

    NASA Astrophysics Data System (ADS)

    Pongracz, R.; Bartholy, J.; Szabo, P.; Pieczka, I.; Torma, C. S.

    2009-04-01

    Regional climatological effects of global warming may be recognized not only in shifts of mean temperature and precipitation, but also in changes in the frequency or intensity of different climate extremes. Several climate extreme indices are analyzed and compared for the Carpathian basin (located in Central/Eastern Europe) following the guidelines suggested by the joint WMO-CCl/CLIVAR Working Group on climate change detection. Our statistical trend analysis includes the evaluation of several extreme temperature and precipitation indices, e.g., the numbers of severe cold days, winter days, frost days, cold days, warm days, summer days, hot days, extremely hot days, cold nights, warm nights, the intra-annual extreme temperature range, the heat wave duration, the growing season length, the number of wet days (using several threshold values defining extremes), the maximum number of consecutive dry days, the highest 1-day precipitation amount, the greatest 5-day rainfall total, the annual fraction due to extreme precipitation events, etc. In order to evaluate the future trends (2071-2100) in the Carpathian basin, daily values of meteorological variables are obtained from the outputs of various regional climate model (RCM) experiments carried out in the framework of the completed EU project PRUDENCE (Prediction of Regional scenarios and Uncertainties for Defining EuropeaN Climate change risks and Effects). The horizontal resolution of the applied RCMs is 50 km. Both scenarios A2 and B2 are used to compare past and future trends of the extreme climate indices for the Carpathian basin. Furthermore, fine-resolution climate experiments of two additional RCMs adapted and run at the Department of Meteorology, Eotvos Lorand University are used to extend the trend analysis of climate extremes for the Carpathian basin. (1) Model PRECIS (run at 25 km horizontal resolution) was developed at the UK Met Office, Hadley Centre, and it uses the boundary conditions from the HadCM3 GCM. 
(2) Model RegCM3 (run at 10 km horizontal resolution) was developed by Giorgi et al. and it is available from the ICTP (International Centre for Theoretical Physics). Analysis of the simulated daily temperature datasets suggests that the detected regional warming is expected to continue in the 21st century. Cold temperature extremes are projected to decrease while warm extremes tend to increase significantly. Expected changes of annual precipitation indices are small, but generally consistent with the detected trends of the 20th century. Based on the simulations, extreme precipitation events are expected to become more intense and more frequent in winter, while a general decrease of extreme precipitation indices is expected in summer.

  10. Cannabis, motivation, and life satisfaction in an internet sample

    PubMed Central

    Barnwell, Sara Smucker; Earleywine, Mitch; Wilcox, Rand

    2006-01-01

    Although little evidence supports cannabis-induced amotivational syndrome, sources continue to assert that the drug saps motivation [1], which may guide current prohibitions. Few studies report low motivation in chronic users; another reveals that they have higher subjective wellbeing. To assess differences in motivation and subjective wellbeing, we used a large sample (N = 487) and strict definitions of cannabis use (7 days/week) and abstinence (never). Standard statistical techniques showed no differences. Robust statistical methods controlling for heteroscedasticity, non-normality and extreme values found no differences in motivation but a small difference in subjective wellbeing. Medical users of cannabis reporting health problems tended to account for a significant portion of subjective wellbeing differences, suggesting that illness decreased wellbeing. All p-values were above p = .05. Thus, daily use of cannabis does not impair motivation. Its impact on subjective wellbeing is small and may actually reflect lower wellbeing due to medical symptoms rather than actual consumption of the plant. PMID:16722561

  11. Induced electric currents in the Alaska oil pipeline measured by gradient, fluxgate, and SQUID magnetometers

    NASA Technical Reports Server (NTRS)

    Campbell, W. H.; Zimmerman, J. E.

    1979-01-01

    The field gradient method for observing the electric currents in the Alaska pipeline provided consistent values for both the fluxgate and SQUID methods of observation. These currents were linearly related to the regularly measured electric and magnetic field changes. Determinations of pipeline current were consistent with values obtained by a direct-connection, current-shunt technique at a pipeline site about 9.6 km away. The gradient method has the distinct advantages of portability and buried-pipe capability. Field gradients due to the pipe magnetization, geological features, or ionospheric source currents do not seem to contribute a measurable error to such pipe current determination. The SQUID gradiometer is inherently sensitive enough to detect very small currents in a linear conductor at 10 meters, or conversely, to detect small currents of one ampere or more at relatively great distances. It is fairly straightforward to achieve imbalance of less than one part in ten thousand, and with extreme care, one part in one million or better.
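    The sensitivity figures quoted above can be checked against the textbook field of a long straight conductor, B = μ0 I/(2πr). A minimal sketch (the helper function name is ours, not from the paper):

```python
import math

MU0 = 4e-7 * math.pi  # vacuum permeability, T*m/A

def current_from_field(b_tesla: float, r_meters: float) -> float:
    """Invert the long-straight-conductor field B = mu0*I/(2*pi*r) for I."""
    return 2.0 * math.pi * r_meters * b_tesla / MU0

# A 1 A pipe current observed 10 m away corresponds to a field of
# mu0/(2*pi*10) = 2e-8 T, i.e. 20 nT -- well within SQUID sensitivity.
b = MU0 * 1.0 / (2.0 * math.pi * 10.0)
print(current_from_field(b, 10.0))  # approximately 1.0 A
```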

  12. Recalibration of the Shear Stress Transport Model to Improve Calculation of Shock Separated Flows

    NASA Technical Reports Server (NTRS)

    Georgiadis, Nicholas J.; Yoder, Dennis A.

    2013-01-01

    The Menter Shear Stress Transport (SST) k-ω turbulence model is one of the most widely used two-equation Reynolds-averaged Navier-Stokes turbulence models for aerodynamic analyses. The model extends Menter's baseline (BSL) model to include a limiter that prevents the calculated turbulent shear stress from exceeding a prescribed fraction of the turbulent kinetic energy via a proportionality constant, a1, set to 0.31. Compared to other turbulence models, the SST model yields superior predictions of mild adverse pressure gradient flows, including those with small separations. In shock-boundary layer interaction regions, the SST model produces separations that are too large, while the BSL model is on the other extreme, predicting separations that are too small. In this paper, changing a1 to a value near 0.355 is shown to significantly improve predictions of shock-separated flows. Several cases are examined computationally, and experimental data are also considered to justify raising the value of a1 used for shock-separated flows.
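    For background, the shear-stress limiter described above takes, in Menter's commonly published formulation (reproduced here as context, not taken from this paper), the eddy-viscosity form

```latex
\mu_t = \frac{\rho\, a_1 k}{\max\!\left(a_1 \omega,\; S F_2\right)}, \qquad a_1 = 0.31,
```

    where S is the strain-rate magnitude, ω the specific dissipation rate, and F2 a blending function that activates the limiter inside boundary layers; the recalibration studied here raises a1 to about 0.355 for shock-separated flows.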

  13. Nonstationary Extreme Value Analysis in a Changing Climate: A Software Package

    NASA Astrophysics Data System (ADS)

    Cheng, L.; AghaKouchak, A.; Gilleland, E.

    2013-12-01

    Numerous studies show that climatic extremes increased substantially in the second half of the 20th century. For this reason, analysis of extremes under a nonstationary assumption has received a great deal of attention. This paper presents a software package developed for estimation of return levels, return periods, and risks of climatic extremes in a changing climate. This MATLAB software package offers tools for analysis of climate extremes under both stationary and nonstationary assumptions. The Nonstationary Extreme Value Analysis package (hereafter, NEVA) provides an efficient and generalized framework for analyzing extremes using Bayesian inference. NEVA estimates the extreme value parameters with a Differential Evolution Markov Chain (DE-MC), which combines the genetic algorithm Differential Evolution (DE) for global optimization over the real parameter space with the Markov Chain Monte Carlo (MCMC) approach, and has the advantages of simplicity, speed of calculation, and convergence over conventional MCMC. NEVA also provides confidence intervals and uncertainty bounds of estimated return levels based on the sampled parameters. NEVA integrates extreme value design concepts, data analysis tools, optimization, and visualization, explicitly designed to facilitate the analysis of extremes in the geosciences. The generalized input and output files of this software package make it attractive for users across different fields. Both stationary and nonstationary components of the package are validated for a number of case studies using empirical return levels. The results show that NEVA reliably describes extremes and their return levels.
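    The empirical return levels used for validation above are conventionally obtained from plotting positions. A minimal sketch using the Weibull plotting position (the function name and sample values are ours, purely for illustration):

```python
import numpy as np

def empirical_return_periods(annual_maxima):
    """Empirical return periods via Weibull plotting positions.

    Sorting n annual maxima ascending, the i-th smallest value gets
    empirical non-exceedance probability p_i = i/(n+1), hence return
    period T_i = 1/(1 - p_i) years.
    """
    x = np.sort(np.asarray(annual_maxima, dtype=float))
    n = len(x)
    ranks = np.arange(1, n + 1)
    p = ranks / (n + 1.0)
    return x, 1.0 / (1.0 - p)

levels, periods = empirical_return_periods([3.1, 2.4, 4.0, 2.9, 3.6])
print(levels)   # sorted maxima
print(periods)  # the largest of 5 values gets T = 6 years
```

    Comparing these empirical (level, period) pairs against the model's posterior return-level curves is the validation step the abstract refers to.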

  14. Applied extreme-value statistics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kinnison, R.R.

    1983-05-01

    The statistical theory of extreme values is a well established part of theoretical statistics. Unfortunately, it is seldom part of applied statistics and is infrequently a part of statistical curricula except in advanced studies programs. This has resulted in the impression that it is difficult to understand and not of practical value. In recent environmental and pollution literature, several short articles have appeared with the purpose of documenting all that is necessary for the practical application of extreme value theory to field problems (for example, Roberts, 1979). These articles are so concise that only a statistician can recognize all the subtleties and assumptions necessary for the correct use of the material presented. The intent of this text is to expand upon several recent articles, and to provide the necessary statistical background so that the non-statistician scientist can recognize an extreme value problem when it occurs in his work, be confident in handling simple extreme value problems himself, and know when the problem is statistically beyond his capabilities and requires consultation.

  15. Bivariate extreme value distributions

    NASA Technical Reports Server (NTRS)

    Elshamy, M.

    1992-01-01

    In certain engineering applications, such as those occurring in the analyses of ascent structural loads for the Space Transportation System (STS), some of the load variables have a lower bound of zero. Thus, the need for practical models of bivariate extreme value probability distribution functions with lower limits was identified. We discuss the Gumbel models and present practical forms of bivariate extreme probability distributions of Weibull and Frechet types with two parameters. Bivariate extreme value probability distribution functions can be expressed in terms of the marginal extremal distributions and a 'dependence' function subject to certain analytical conditions. Properties of such bivariate extreme distributions, sums and differences of paired extremals, as well as the corresponding forms of conditional distributions, are discussed. Practical estimation techniques are also given.
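    The dependence-function representation mentioned above has a widely used concrete instance, the Gumbel logistic family. In its standard textbook form (not necessarily the exact parametrization used in this report),

```latex
F(x,y) = \exp\!\left\{-\Big[\big(-\ln F_X(x)\big)^{m} + \big(-\ln F_Y(y)\big)^{m}\Big]^{1/m}\right\},
\qquad m \ge 1,
```

    where F_X and F_Y are the marginal extreme value distributions; m = 1 recovers independence and m → ∞ complete dependence.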

  16. An application of small-gap equations in sealing devices

    NASA Technical Reports Server (NTRS)

    Vionnet, Carlos A.; Heinrich, Juan C.

    1993-01-01

    The study of a thin, incompressible Newtonian fluid layer trapped between two almost parallel, sliding surfaces has been actively pursued in the last decades. This subject includes lubrication applications such as slider bearings or the sealing of non-pressurized fluids with rubber rotary shaft seals. In the present work we analyze numerically the flow of lubricant fluid through a micro-gap of sealing devices. The first stage of this study is carried out assuming that a 'small-gap' parameter delta attains an extreme value in the Navier-Stokes equations. The precise meaning of small-gap is achieved by the particular limit delta = 0 which, within the bounds of the hypotheses, predicts transport of lubricant through the sealed area by centrifugal instabilities. Numerical results obtained with the penalty function approximation in the finite element method are presented. In particular, the influence of inflow and outflow boundary conditions, and their impact in the simulated flow, are discussed.

  17. Effects of forebody geometry on subsonic boundary-layer stability

    NASA Technical Reports Server (NTRS)

    Dodbele, Simha S.

    1990-01-01

    As part of an effort to develop computational techniques for design of natural laminar flow fuselages, a computational study was made of the effect of forebody geometry on laminar boundary layer stability on axisymmetric body shapes. The effects of nose radius on the stability of the incompressible laminar boundary layer was computationally investigated using linear stability theory for body length Reynolds numbers representative of small and medium-sized airplanes. The steepness of the pressure gradient and the value of the minimum pressure (both functions of fineness ratio) govern the stability of laminar flow possible on an axisymmetric body at a given Reynolds number. It was found that to keep the laminar boundary layer stable for extended lengths, it is important to have a small nose radius. However, nose shapes with extremely small nose radii produce large pressure peaks at off-design angles of attack and can produce vortices which would adversely affect transition.

  18. Aerodynamic study of a stall regulated horizontal-axis wind turbine

    NASA Astrophysics Data System (ADS)

    Constantinescu, S. G.; Crunteanu, D. E.; Niculescu, M. L.

    2013-10-01

    Wind energy is deemed one of the most durable energy options for the future because wind resources are immense. Furthermore, it is predicted that small wind turbines will play a vital role in the urban environment. Unfortunately, the complexity and price of pitch regulated small horizontal-axis wind turbines are among the main obstacles to their widespread use in populated zones. Moreover, the energy efficiency of small stall regulated wind turbines has to be high even at low and medium wind velocities because cities are usually not windy places. During the operation of stall regulated wind turbines, due to the extremely broad range of wind velocities, the angle of attack can reach high values and some regions of the blade will exhibit stall and post-stall behavior. This paper deals with stall and post-stall regimes because they can induce significant vibrations, fatigue, and even wind turbine failure.

  19. An application of small-gap equations in sealing devices

    NASA Astrophysics Data System (ADS)

    Vionnet, Carlos A.; Heinrich, Juan C.

    1993-11-01

    The study of a thin, incompressible Newtonian fluid layer trapped between two almost parallel, sliding surfaces has been actively pursued in the last decades. This subject includes lubrication applications such as slider bearings or the sealing of non-pressurized fluids with rubber rotary shaft seals. In the present work we analyze numerically the flow of lubricant fluid through a micro-gap of sealing devices. The first stage of this study is carried out assuming that a 'small-gap' parameter delta attains an extreme value in the Navier-Stokes equations. The precise meaning of small-gap is achieved by the particular limit delta = 0 which, within the bounds of the hypotheses, predicts transport of lubricant through the sealed area by centrifugal instabilities. Numerical results obtained with the penalty function approximation in the finite element method are presented. In particular, the influence of inflow and outflow boundary conditions, and their impact in the simulated flow, are discussed.

  20. Small Sample Reactivity Measurements in the RRR/SEG Facility: Reanalysis using TRIPOLI-4

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hummel, Andrew; Palmiotti, Guiseppe

    2016-08-01

    This work involved reanalyzing the RRR/SEG integral experiments performed at the Rossendorf facility in Germany throughout the 1970s and 80s. These small sample reactivity worth measurements were carried out using the pile oscillator technique for many different fission products, structural materials, and standards. The coupled fast-thermal system was designed such that the measurements would provide insight into elemental data, specifically the competing effects between neutron capture and scatter. Comparing the measured to calculated reactivity values can then provide adjustment criteria to ultimately improve nuclear data for fast reactor designs. Due to the extremely small reactivity effects measured (typically less than 1 pcm) and the specific heterogeneity of the core, the tool chosen for this analysis was TRIPOLI-4. This code allows for high fidelity 3-dimensional geometric modeling, and the most recent, unreleased version is capable of exact perturbation theory.

  1. Small deformations of extreme five dimensional Myers-Perry black hole initial data

    NASA Astrophysics Data System (ADS)

    Alaee, Aghil; Kunduri, Hari K.

    2015-02-01

    We demonstrate the existence of a one-parameter family of initial data for the vacuum Einstein equations in five dimensions representing small deformations of the extreme Myers-Perry black hole. This initial data set has U(1)² symmetry and preserves the angular momenta and horizon geometry of the extreme solution. Our proof is based upon an earlier result of Dain and Gabach-Clement concerning the existence of U(1)-invariant initial data sets which preserve the geometry of extreme Kerr (at least for short times). In addition, we construct a general class of transverse, traceless, symmetric rank-2 tensors in these geometries.

  2. Future Projection of Summer Extreme Precipitation from High Resolution Multi-RCMs over East Asia

    NASA Astrophysics Data System (ADS)

    Kim, Gayoung; Park, Changyong; Cha, Dong-Hyun; Lee, Dong-Kyou; Suh, Myoung-Seok; Ahn, Joong-Bae; Min, Seung-Ki; Hong, Song-You; Kang, Hyun-Suk

    2017-04-01

    Recently, the frequency and intensity of natural hazards have been increasing due to human-induced climate change. Because most damage from natural hazards over East Asia has been related to extreme precipitation events, it is important to estimate future changes in extreme precipitation characteristics caused by climate change. We investigate future changes in extreme values of summer precipitation simulated by five regional climate models participating in the CORDEX-East Asia project (i.e., HadGEM3-RA, RegCM4, MM5, WRF, and GRIMs) over East Asia. The 100-year return value calculated from the generalized extreme value (GEV) parameters is analysed as an indicator of extreme intensity. In the future climate, the mean values as well as the extreme values of daily precipitation tend to increase over land regions. The increase of the 100-year return value can be significantly associated with changes in the location (intensity) and scale (variability) parameters of the GEV distribution for extreme precipitation. It is expected that the results of this study can be used as a fruitful reference when making disaster management policy. Acknowledgements: The research was supported by the Ministry of Public Safety and Security of the Korean government under grant MPSS-NH-2013-63 and by the National Research Foundation of Korea grant funded by the Ministry of Science, ICT and Future Planning of Korea (NRF-2016M3C4A7952637).
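    The 100-year return value computation described above (fit a GEV to annual maxima, then take the quantile exceeded with probability 1/100 per year) can be sketched as follows; the synthetic annual maxima are an illustrative assumption, not the CORDEX-East Asia output:

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)
# Synthetic annual maxima of daily summer precipitation (mm); illustrative only.
annual_max = rng.gumbel(loc=60.0, scale=15.0, size=50)

# Fit the GEV; note SciPy's shape parameter c is the negative of the
# climatological shape parameter xi.
c, loc, scale = genextreme.fit(annual_max)

# The T-year return value is the quantile exceeded with probability 1/T per year.
T = 100
return_value = genextreme.ppf(1.0 - 1.0 / T, c, loc=loc, scale=scale)
print(float(return_value))
```

    Repeating the fit for a future window and comparing the two return values gives the projected change the abstract reports.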

  3. The influence of wheelchair propulsion technique on upper extremity muscle demand: a simulation study.

    PubMed

    Rankin, Jeffery W; Kwarciak, Andrew M; Richter, W Mark; Neptune, Richard R

    2012-11-01

    The majority of manual wheelchair users will experience upper extremity injuries or pain, in part due to the high force requirements, repetitive motion and extreme joint postures associated with wheelchair propulsion. Recent studies have identified cadence, contact angle and peak force as important factors for reducing upper extremity demand during propulsion. However, studies often make comparisons between populations (e.g., able-bodied vs. paraplegic) or do not investigate specific measures of upper extremity demand. The purpose of this study was to use a musculoskeletal model and forward dynamics simulations of wheelchair propulsion to investigate how altering cadence, peak force and contact angle influence individual muscle demand. Forward dynamics simulations of wheelchair propulsion were generated to emulate group-averaged experimental data during four conditions: 1) self-selected propulsion technique, and while 2) minimizing cadence, 3) maximizing contact angle, and 4) minimizing peak force using biofeedback. Simulations were used to determine individual muscle mechanical power and stress as measures of muscle demand. Minimizing peak force and cadence had the lowest muscle power requirements. However, minimizing peak force increased cadence and recovery power, while minimizing cadence increased average muscle stress. Maximizing contact angle increased muscle stress and had the highest muscle power requirements. Minimizing cadence appears to have the most potential for reducing muscle demand and fatigue, which could decrease upper extremity injuries and pain. However, altering any of these variables to extreme values appears to be less effective; instead small to moderate changes may better reduce overall muscle demand. Copyright © 2012 Elsevier Ltd. All rights reserved.

  4. Extremely cold events and sudden air temperature drops during winter season in the Czech Republic

    NASA Astrophysics Data System (ADS)

    Crhová, Lenka; Valeriánová, Anna; Holtanová, Eva; Müller, Miloslav; Kašpar, Marek; Stříž, Martin

    2014-05-01

    Today, great attention is paid to the analysis of extreme weather events and the frequency of their occurrence under a changing climate. In most cases, these studies are focused on extremely warm events in the summer season. However, extremely low air temperatures during winter can have serious impacts on many sectors as well (e.g., power engineering, transportation, industry, agriculture, human health). Therefore, in the present contribution we focus on extremely and abnormally cold air temperature events in the winter season in the Czech Republic. Besides the seasonal extremes of minimum air temperature determined from station data, standardized data with the annual cycle removed are used as well. The distribution of extremely cold events over the season and the temporal evolution of their frequency of occurrence during the period 1961-2010 are analyzed. Furthermore, the connection of cold events with extreme sudden temperature drops is studied. The extreme air temperature events and the events of extreme sudden temperature drop are assessed using the Weather Extremity Index, which evaluates the extremity (based on return periods) and spatial extent of the meteorological extreme event of interest. The parameters of the generalized extreme value distribution are used to estimate return periods of daily temperature values. The work has been supported by grant P209/11/1990 funded by the Czech Science Foundation.

  5. Spatiotemporal variability of extreme temperature frequency and amplitude in China

    NASA Astrophysics Data System (ADS)

    Zhang, Yuanjie; Gao, Zhiqiu; Pan, Zaitao; Li, Dan; Huang, Xinhui

    2017-03-01

    Temperature extremes in China are examined based on daily maximum and minimum temperatures from station observations and multiple global climate models. The magnitude and frequency of extremes are expressed in terms of return values and return periods, respectively, estimated from the fitted Generalized Extreme Value (GEV) distribution of annual extreme temperatures. The observations suggest that changes in temperature extremes considerably exceed changes in the respective climatological means during the past five decades, with a greater amplitude of increase in cold extremes than in warm extremes. The frequency of warm (cold) extremes increases (decreases) over most areas, at an increasingly faster rate as the extremity level rises. Changes in warm extremes depend more on the varying shape of the GEV distribution than on the location shift, whereas changes in cold extremes are more closely associated with the location shift. The models simulate the overall pattern of temperature extremes in China during 1961-1981 reasonably well, but they show a smaller asymmetry between changes in warm and cold extremes, primarily due to their underestimation of increases in cold extremes, especially over southern China. Projections under a high emission scenario show that the multi-model median change in warm and cold extremes by 2040 relative to 1971 will be 2.6 °C and 2.8 °C, respectively, with the strongest changes in cold extremes shifting southward. By 2040, warm extremes at the 1971 20-year return values would occur about every three years, while the 1971 cold extremes would occur once in > 500 years.

  6. [Hierarchical regionalization for spatial epidemiology: a case study of thyroid cancer incidence in Yiwu, Zhejiang].

    PubMed

    Teng, Shizhu; Jia, Qiaojuan; Huang, Yijian; Chen, Liangcao; Fei, Xufeng; Wu, Jiaping

    2015-10-01

    Sporadic cases occurring in small geographic units can produce extreme incidence values because of the small population bases, which distorts the analysis of actual incidence. This study introduced a method of hierarchical clustering and partitioning regionalization, which integrates areas with small populations into larger areas with sufficient population by using a Geographic Information System (GIS), based on the principles of spatial continuity and geographical similarity (homogeneity test). The method was applied in spatial epidemiology using a data set of thyroid cancer incidence in Yiwu, Zhejiang province, between 2010 and 2013. Thyroid cancer incidence data were more reliable and stable in the newly regionalized areas. Hotspot analysis (Getis-Ord) of the incidence in the new areas indicated obvious case clustering in the central area of Yiwu. This method can effectively solve the problem of small population bases in small geographic units in spatial epidemiological analysis of thyroid cancer incidence and can be applied to other diseases and other areas.

  7. Size distribution and growth rate of crystal nuclei near critical undercooling in small volumes

    NASA Astrophysics Data System (ADS)

    Kožíšek, Z.; Demo, P.

    2017-11-01

    Kinetic equations are numerically solved within the standard nucleation model to determine the size distribution of nuclei in small volumes near critical undercooling. The critical undercooling, at which the first nuclei are detected within the system, depends on the droplet volume. The size distribution of nuclei reaches its stationary value after some time delay and decreases with nucleus size. Only a certain maximum nucleus size is reached in small volumes near critical undercooling. As a model system, we selected the recently studied nucleation in a Ni droplet [J. Bokeloh et al., Phys. Rev. Lett. 107 (2011) 145701] because experimental and simulation data are available. However, using these data for sample masses from 23 μg up to 63 mg (corresponding to experiments) leads to a size distribution for which no critical nuclei are formed in the Ni droplet (the number of critical nuclei is < 1). If one takes into account the size dependence of the interfacial energy, the size distribution of nuclei increases to reasonable values. In lower volumes (V ≤ 10^-9 m^3) the nucleus size reaches a maximum, which quickly increases with undercooling. Supercritical clusters continue their growth only if the number of critical nuclei is sufficiently high.

  8. Identifying and Clarifying Organizational Values.

    ERIC Educational Resources Information Center

    Seevers, Brenda S.

    2000-01-01

    Of the 14 organizational values ranked by a majority of 146 New Mexico Cooperative Extension educators as extremely valued, 9 were extremely evident in organizational policies and procedures. A values audit such as this forms an important initial step in strategic planning. (SK)

  9. Applications of Extreme Value Theory in Public Health.

    PubMed

    Thomas, Maud; Lemaitre, Magali; Wilson, Mark L; Viboud, Cécile; Yordanov, Youri; Wackernagel, Hans; Carrat, Fabrice

    2016-01-01

    We present how Extreme Value Theory (EVT) can be used in public health to predict future extreme events. We applied EVT to weekly rates of Pneumonia and Influenza (P&I) deaths over 1979-2011. We further explored the daily number of emergency department visits in a network of 37 hospitals over 2004-2014. Maxima of grouped consecutive observations were fitted to a generalized extreme value distribution, which was then used to estimate the probability of extreme values in specified time periods. An annual P&I death rate of 12 per 100,000 (the highest maximum observed) should be exceeded once over the next 30 years, and each year there is a 3% risk that the P&I death rate will exceed this value. Over the past 10 years, the observed maximum increase in the daily number of visits from the same weekday between two consecutive weeks was 1,133. We estimated the probability of a daily increase exceeding 1,000 in any given month at 0.37%. The EVT method can be applied to various topics in epidemiology, thus contributing to public health planning for extreme events.
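    As a back-of-the-envelope check of the figures quoted above (assuming, for simplicity, that exceedances in different years are independent), the 3% annual risk, the roughly 33-year return period, and the chance of at least one exceedance within 30 years are all views of a single tail probability:

    ```python
    import math

    p_annual = 0.03                         # annual probability of exceeding the level
    return_period = 1.0 / p_annual          # average spacing between exceedances, ~33 yr
    p_30yr = 1.0 - (1.0 - p_annual) ** 30   # at least one exceedance in 30 years

    print(f"return period: {return_period:.1f} years")
    print(f"P(exceed at least once in 30 years): {p_30yr:.3f}")
    ```

    Note that a 33-year return period does not mean a guaranteed hit in 30 years; under independence the 30-year exceedance probability is only about 0.6.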

  10. Examining personal values in extreme environment contexts: Revisiting the question of generalizability

    NASA Astrophysics Data System (ADS)

    Smith, N.; Sandal, G. M.; Leon, G. R.; Kjærgaard, A.

    2017-08-01

    Land-based extreme environments (e.g. polar expeditions, Antarctic research stations, confinement chambers) have often been used as analog settings for spaceflight. These settings share similarities with the conditions experienced during space missions, including confinement, isolation, and limited possibilities for evacuation. To determine the utility of analog settings for understanding human spaceflight, researchers have examined the extent to which the individual characteristics (e.g., personality) of people operating in extreme environments can be generalized across contexts (Sandal, 2000) [1]. Building on previous work, and utilising new and pre-existing data, the present study examined the extent to which personal value motives could be generalized across extreme environments. Four populations were assessed: mountaineers (N = 59), military personnel (N = 25), Antarctic over-winterers (N = 21), and Mars simulation participants (N = 12). All participants completed the Portrait Values Questionnaire (PVQ; Schwartz [2]), capturing information on 10 personal values. Rank scores suggest that all groups identified Self-direction, Stimulation, Universalism, and Benevolence as important values and acknowledged Power and Tradition as low priorities. Results from difference testing suggest the extreme environment groups were most comparable on Self-direction, Stimulation, Benevolence, Tradition, and Security. There were significant between-group differences on five of the ten values. Overall, the findings pinpointed specific values that may be important for functioning in challenging environments. However, the differences that emerged on certain values highlight the importance of considering the specific population when comparing results across extreme settings. We recommend that further research examine the impact of personal value motives on indicators of adjustment, group working, and performance. Information from such studies could then be used to aid selection and training processes for personnel operating in extreme settings, and in space.

  11. An R2 statistic for fixed effects in the linear mixed model.

    PubMed

    Edwards, Lloyd J; Muller, Keith E; Wolfinger, Russell D; Qaqish, Bahjat F; Schabenberger, Oliver

    2008-12-20

    Statisticians most often use the linear mixed model to analyze Gaussian longitudinal data. The value and familiarity of the R² statistic in the linear univariate model naturally creates great interest in extending it to the linear mixed model. We define and describe how to compute a model R² statistic for the linear mixed model by using only a single model. The proposed R² statistic measures multivariate association between the repeated outcomes and the fixed effects in the linear mixed model. The R² statistic arises as a one-to-one function of an appropriate F statistic for testing all fixed effects (except typically the intercept) in a full model. The statistic compares the full model with a null model with all fixed effects deleted (except typically the intercept) while retaining exactly the same covariance structure. Furthermore, the R² statistic leads immediately to a natural definition of a partial R² statistic. A mixed model in which ethnicity gives a very small p-value as a longitudinal predictor of blood pressure (BP) compellingly illustrates the value of the statistic. In sharp contrast to the extreme p-value, a very small R², a measure of statistical and scientific importance, indicates that ethnicity has an almost negligible association with the repeated BP outcomes for the study.
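    In the univariate linear model, R^2 and the overall F statistic determine one another, and the mixed-model statistic described above is built as the analogous one-to-one map. A sketch of the univariate form of that map (the degrees of freedom below are illustrative, not taken from the BP study):

    ```python
    def r2_from_f(f_stat, nu1, nu2):
        """Invert F = (R^2 / nu1) / ((1 - R^2) / nu2) for R^2."""
        return (nu1 * f_stat) / (nu1 * f_stat + nu2)

    def f_from_r2(r2, nu1, nu2):
        return (r2 / nu1) / ((1.0 - r2) / nu2)

    # A huge F (hence a tiny p-value) can coexist with a small R^2 when the
    # denominator degrees of freedom are large, mirroring the ethnicity/BP
    # example in the abstract.
    f = f_from_r2(0.02, nu1=1, nu2=2000)
    print(f)                       # large F despite R^2 = 0.02
    print(r2_from_f(f, 1, 2000))   # round trip recovers 0.02
    ```

    This is why the abstract can speak of an "extreme p-value" alongside an "almost negligible association": the F statistic grows with sample size while R^2 does not.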

  12. Weak values and weak coupling maximizing the output of weak measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Di Lorenzo, Antonio, E-mail: dilorenzo.antonio@gmail.com

    2014-06-15

    In a weak measurement, the average output ⟨o⟩ of a probe that measures an observable Â of a quantum system undergoing both a preparation in a state ρ_i and a postselection in a state E_f is, to a good approximation, a function of the weak value A_w = Tr[E_f Â ρ_i]/Tr[E_f ρ_i], a complex number. For a fixed coupling λ, when the overlap Tr[E_f ρ_i] is very small, A_w diverges, but ⟨o⟩ stays finite, often tending to zero for symmetry reasons. This paper answers the questions: what is the weak value that maximizes the output for a fixed coupling? What is the coupling that maximizes the output for a fixed weak value? We derive equations for the optimal values of A_w and λ, and provide the solutions. The results are independent of the dimensionality of the system, and they apply to a probe having a Hilbert space of arbitrary dimension. Using the Schrödinger-Robertson uncertainty relation, we demonstrate that, in an important case, the amplification ⟨o⟩ cannot exceed the initial uncertainty σ_o in the observable ô; we provide an upper limit for the more general case, and a strategy to obtain ⟨o⟩ ≫ σ_o. Highlights: • We have provided a general framework to find the extremal values of a weak measurement. • We have derived the location of the extremal values in terms of preparation and postselection. • We have devised a maximization strategy going beyond the limit of the Schrödinger-Robertson relation.
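    For pure pre- and postselected states the weak-value formula quoted above reduces to A_w = ⟨ψ_f|Â|ψ_i⟩/⟨ψ_f|ψ_i⟩. A minimal numpy sketch for a qubit (the states and the observable σ_z are chosen purely for illustration) shows A_w blowing up past the eigenvalue range as the overlap shrinks:

    ```python
    import numpy as np

    sz = np.array([[1, 0], [0, -1]], dtype=complex)  # observable A-hat = sigma_z

    def weak_value(psi_i, psi_f, A):
        """A_w = Tr[E_f A rho_i] / Tr[E_f rho_i] for pure pre/postselection."""
        rho_i = np.outer(psi_i, psi_i.conj())
        E_f = np.outer(psi_f, psi_f.conj())
        return np.trace(E_f @ A @ rho_i) / np.trace(E_f @ rho_i)

    psi_i = np.array([1, 1], dtype=complex) / np.sqrt(2)

    # Postselection nearly orthogonal to psi_i: the overlap is small, so A_w
    # lands far outside the eigenvalue range [-1, +1] of sigma_z.
    phi = 0.7  # radians; phi = pi/4 would make the overlap exactly zero
    psi_f = np.array([np.cos(phi), -np.sin(phi)], dtype=complex)

    A_w = weak_value(psi_i, psi_f, sz)
    print(A_w)  # real part ~ 11.7, well beyond [-1, 1]
    ```

    The divergence as phi approaches pi/4 is exactly the regime the abstract discusses, where A_w grows without bound while the physical probe output stays finite.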

  13. Risk assessment of precipitation extremes in northern Xinjiang, China

    NASA Astrophysics Data System (ADS)

    Yang, Jun; Pei, Ying; Zhang, Yanwei; Ge, Quansheng

    2018-05-01

    This study used daily precipitation records gathered at 37 meteorological stations in northern Xinjiang, China, from 1961 to 2010. We used extreme value theory models, the generalized extreme value (GEV) and generalized Pareto distribution (GPD), to fit precipitation extremes with different return periods, estimate the risks of precipitation extremes, and diagnose aridity-humidity environmental variation and the corresponding spatial patterns in northern Xinjiang. Spatiotemporal patterns of daily maximum precipitation showed that the aridity-humidity conditions of northern Xinjiang are well represented by the return periods of the precipitation data. Indices of daily maximum precipitation were effective in the prediction of floods in the study area. By analyzing future projections of daily maximum precipitation (2-, 5-, 10-, 30-, 50-, and 100-year return periods), we conclude that the flood risk will gradually increase in northern Xinjiang. In an example analysis of extreme precipitation models, the GEV statistical model was superior in reproducing extreme precipitation, while the GPD model results better reflected annual precipitation. For the 2- and 5-year return levels at most of the estimated sites, GPD results were slightly greater than GEV results. The study found that extreme precipitation exceeding a certain limit value will cause a flood disaster; predicting future extreme precipitation may therefore aid flood disaster warnings. A suitable policy for effective water resource management is thus urgently required.
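    The GPD side of such an analysis is a standard peaks-over-threshold fit. The sketch below uses synthetic exponential wet-day amounts rather than the Xinjiang records, and the 99th-percentile threshold is an illustrative choice:

    ```python
    import numpy as np
    from scipy.stats import genpareto

    rng = np.random.default_rng(42)

    # Synthetic daily precipitation (mm) for 50 years; exponential amounts
    # stand in for the real station record.
    years = 50
    precip = rng.exponential(scale=5.0, size=years * 365)

    # Peaks-over-threshold: keep exceedances above a high quantile.
    threshold = np.quantile(precip, 0.99)
    excesses = precip[precip > threshold] - threshold

    # Fit the generalized Pareto distribution to the excesses (location fixed at 0).
    shape, loc, scale = genpareto.fit(excesses, floc=0)

    # T-year return level: exceeded on average once every T years, where
    # lam is the mean number of threshold exceedances per year.
    lam = excesses.size / years
    def return_level(T):
        return threshold + genpareto.ppf(1.0 - 1.0 / (lam * T), shape, loc=0, scale=scale)

    for T in (2, 5, 10, 50, 100):
        print(f"{T:3d}-yr daily precipitation: {return_level(T):6.1f} mm")
    ```

    The return levels grow with the return period, which is the monotone risk curve the study's flood-warning argument rests on.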

  14. Estimation of breeding values using selected pedigree records.

    PubMed

    Morton, Richard; Howarth, Jordan M

    2005-06-01

    Fish bred in tanks or ponds cannot be easily tagged individually. The parentage of any individual may be determined by DNA fingerprinting, but this is sufficiently expensive that large numbers cannot be fingerprinted. The measurement of the objective trait can be made on a much larger sample relatively cheaply. This article deals with experimental designs for selecting individuals to be fingerprinted and for the estimation of the individual and family breeding values. The general setup provides estimates both for genetic effects regarded as fixed or random and for fixed effects due to known regressors. The family effects can be well estimated even when very small numbers are fingerprinted, provided that they are the individuals with the most extreme phenotypes.

  15. Optimal control of orientation and entanglement for two dipole-dipole coupled quantum planar rotors.

    PubMed

    Yu, Hongling; Ho, Tak-San; Rabitz, Herschel

    2018-05-09

    Optimal control simulations are performed for orientation and entanglement of two dipole-dipole coupled identical quantum rotors. The rotors at various fixed separations lie on a model non-interacting plane with an applied control field. It is shown that optimal control of orientation or entanglement represents two contrasting control scenarios. In particular, the maximally oriented state (MOS) of the two rotors has a zero entanglement entropy and is readily attainable at all rotor separations. In contrast, the maximally entangled state (MES) has a zero orientation expectation value and is most conveniently attainable at small separations, where the dipole-dipole coupling is strong. It is demonstrated that the peak orientation expectation value attained by the MOS at large separations exhibits a long-time revival pattern due to the small energy splittings arising from the extremely weak dipole-dipole coupling between the degenerate product states of the two free rotors. Moreover, it is found that the peak entanglement entropy value attained by the MES remains largely unchanged as the two rotors are transported to large separations after turning off the control field. Finally, optimal control simulations of transition dynamics between the MOS and the MES reveal the intricate interplay between orientation and entanglement.

  16. The application of the statistical theory of extreme values to gust-load problems

    NASA Technical Reports Server (NTRS)

    Press, Harry

    1950-01-01

    An analysis is presented which indicates that the statistical theory of extreme values is applicable to the problem of predicting the frequency of encountering the larger gust loads and gust velocities, both under specific test conditions and in commercial transport operations. The extreme-value theory provides an analytic form for the distributions of maximum values of gust load and velocity. Methods of fitting the distribution are given, along with a method of estimating the reliability of the predictions. The theory of extreme values is applied to available load data from commercial transport operations. The results indicate that the estimates of the frequency of encountering the larger loads are more consistent with the data and more reliable than those obtained in previous analyses.

  17. Extreme value statistics analysis of fracture strengths of a sintered silicon nitride failing from pores

    NASA Technical Reports Server (NTRS)

    Chao, Luen-Yuan; Shetty, Dinesh K.

    1992-01-01

    Statistical analysis and correlation between the pore-size distribution and the fracture strength distribution using the theory of extreme-value statistics is presented for a sintered silicon nitride. The pore-size distribution on a polished surface of this material was characterized using an automatic optical image analyzer. The distribution measured on the two-dimensional plane surface was transformed to a population (volume) distribution using the Schwartz-Saltykov diameter method. The population pore-size distribution and the distribution of the pore size at the fracture origin were correlated by extreme-value statistics. The fracture strength distribution was then predicted from the extreme-value pore-size distribution, using a linear elastic fracture mechanics model of an annular crack around a pore and the fracture toughness of the ceramic. The predicted strength distribution was in good agreement with strength measurements in bending. In particular, the extreme-value statistics analysis explained the nonlinear trend in the linearized Weibull plot of measured strengths without postulating a lower-bound strength.

  18. On alternative q-Weibull and q-extreme value distributions: Properties and applications

    NASA Astrophysics Data System (ADS)

    Zhang, Fode; Ng, Hon Keung Tony; Shi, Yimin

    2018-01-01

    Tsallis statistics and Tsallis distributions have been attracting a significant amount of research work in recent years. Importantly, Tsallis statistics and q-distributions have been applied in different disciplines. Yet, a relationship between the existing q-Weibull and q-extreme value distributions that parallels the well-established relationship between the conventional Weibull and extreme value distributions through a logarithmic transformation had not been established. In this paper, we propose an alternative q-Weibull distribution that leads to a q-extreme value distribution via the q-logarithm transformation. Some important properties of the proposed q-Weibull and q-extreme value distributions are studied. Maximum likelihood and least squares estimation methods are used to estimate the parameters of the q-Weibull distribution, and their performances are investigated through a Monte Carlo simulation study. The methodologies and the usefulness of the proposed distributions are illustrated by fitting the 2014 traffic fatality data from the National Highway Traffic Safety Administration.

  19. The Characteristics of Extreme Erosion Events in a Small Mountainous Watershed

    PubMed Central

    Fang, Nu-Fang; Shi, Zhi-Hua; Yue, Ben-Jiang; Wang, Ling

    2013-01-01

    A large amount of soil loss is caused by a small number of extreme events that are mainly responsible for the time compression of geomorphic processes. The aim of this study was to analyze suspended sediment transport during extreme erosion events in a mountainous watershed. Field measurements were conducted in Wangjiaqiao, a small agricultural watershed (16.7 km2) in the Three Gorges Area (TGA) of China. Continuous records were used to analyze suspended sediment transport regimes and assess the sediment loads of 205 rainfall–runoff events during a period of 16 hydrological years (1989–2004). Extreme events were defined as the largest events, ranked in order of their absolute magnitude (representing the 95th percentile). Ten extreme erosion events from 205 erosion events, representing 83.8% of the total suspended sediment load, were selected for study. The results of canonical discriminant analysis indicated that extreme erosion events are characterized by high maximum flood-suspended sediment concentrations, high runoff coefficients, and high flood peak discharge, which could possibly be explained by the transport of deposited sediment within the stream bed during previous events or bank collapses. PMID:24146898
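    The 95th-percentile event definition used above amounts to ranking events by magnitude and keeping the top 5%. A sketch with hypothetical event loads (the lognormal draw is an assumption standing in for the Wangjiaqiao record, not the actual data):

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Hypothetical suspended sediment loads for 205 rainfall-runoff events;
    # a heavy-tailed draw mimics the time compression of geomorphic processes.
    loads = rng.lognormal(mean=3.0, sigma=1.5, size=205)

    # Extreme events: the largest events representing the 95th percentile,
    # i.e. the top 5% by magnitude.
    k = round(0.05 * len(loads))      # 10 events out of 205
    extreme = np.sort(loads)[-k:]

    share = extreme.sum() / loads.sum()
    print(f"{k} extreme events carry {share:.1%} of the total load")
    ```

    With a sufficiently heavy-tailed distribution, a handful of events dominates the total load, which is the pattern (10 events, 83.8% of the load) reported in the abstract.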

  20. Extreme values and the level-crossing problem: An application to the Feller process

    NASA Astrophysics Data System (ADS)

    Masoliver, Jaume

    2014-04-01

    We review the question of the extreme values attained by a random process. We relate it to level crossings to one boundary (first-passage problems) as well as to two boundaries (escape problems). The extremes studied are the maximum, the minimum, the maximum absolute value, and the range or span. We specialize in diffusion processes and present detailed results for the Wiener and Feller processes.
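    For the driftless Wiener process, the distribution of the maximum follows from the reflection principle, one of the first-passage arguments reviewed above. A minimal sketch (the diffusion-coefficient convention D = 1/2, giving standard Brownian motion, is an illustrative choice):

    ```python
    import math

    def p_max_below(m, t, D=0.5):
        """P(max over [0, t] of W_s <= m) for a driftless Wiener process.

        By the reflection principle, P(M_t > m) = 2 P(W_t > m), which gives
        P(M_t <= m) = erf(m / sqrt(4 D t)).
        """
        return math.erf(m / math.sqrt(4.0 * D * t))

    # Standard Brownian motion: the running maximum over [0, 1] stays below
    # 1 with probability erf(1/sqrt(2)), roughly 0.683.
    print(p_max_below(1.0, 1.0))
    ```

    The same reflection argument yields the first-passage time density to a single boundary; the two-boundary (escape) problems mentioned in the abstract require instead an eigenfunction or image-series expansion.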

  1. Optimal sensitivity for molecular recognition MAC-mode AFM

    PubMed

    Schindler; Badt; Hinterdorfer; Kienberger; Raab; Wielert-Badt; Pastushenko

    2000-02-01

    Molecular recognition force microscopy (MRFM) using the magnetic AC mode (MAC mode) atomic force microscope (AFM) was recently investigated to locate and probe recognition sites. A flexible crosslinker carrying a ligand is bound to the tip for the molecular recognition of receptors on the surface of a sample. In this report, the driving frequency is calculated which optimizes the sensitivity (S). The sensitivity of MRFM is defined as the relative change of the magnetically excited cantilever deflection amplitude arising from a crosslinker/antibody/antigen connection that is characterized by a very small force constant. The sensitivity is calculated in a damped oscillator model with a given value of the quality factor Q, which, together with the load, defines the frequency response (an unloaded oscillator shows resonance only for Q > 0.707). If Q < 1, the greatest value of S corresponds to zero driving frequency omega (measured in units of the eigenfrequency). Therefore, for Q < 1, MAC mode has no advantage over DC mode. Two additional extremes are found at omegaL = (1 - 1/Q)^(1/2) and omegaR = (1 + 1/Q)^(1/2), with corresponding sensitivities S(L) = Q^2/(2Q - 1) and S(R) = Q^2/(2Q + 1). The L-extreme exists only for Q > 1, and then S(L) > S(R), i.e. the L-extreme is the main one. For Q > 1, S(L) > 1, and for Q > 2.41, S(R) > 1. These are the critical Q-values above which selecting a driving frequency equal to omegaL or omegaR gives MAC mode an advantage over DC mode. The satisfactory quality of the oscillator model is demonstrated by comparing some results with those calculated within the classical description of cantilevers.
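    The closed-form extremes quoted in this abstract can be evaluated directly. A small sketch (the function name and the sample Q values are illustrative):

    ```python
    import math

    def extremes(Q):
        """Driving frequencies (in units of the eigenfrequency) and
        sensitivities of the two extremes quoted in the abstract:
        omega_{L,R} = sqrt(1 -/+ 1/Q), S_L = Q^2/(2Q-1), S_R = Q^2/(2Q+1)."""
        omega_R = math.sqrt(1.0 + 1.0 / Q)
        S_R = Q * Q / (2.0 * Q + 1.0)
        if Q > 1.0:  # the L-extreme exists only for Q > 1
            omega_L = math.sqrt(1.0 - 1.0 / Q)
            S_L = Q * Q / (2.0 * Q - 1.0)
            return omega_L, S_L, omega_R, S_R
        return None, None, omega_R, S_R

    for Q in (0.8, 1.5, 3.0):
        print(Q, extremes(Q))
    ```

    Evaluating at Q = 3 confirms the ordering stated in the abstract: S_L = 1.8 exceeds both 1 and S_R, and S_R itself exceeds 1 only once Q passes roughly 2.41.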

  2. Extreme Temperature Performance of Automotive-Grade Small Signal Bipolar Junction Transistors

    NASA Technical Reports Server (NTRS)

    Boomer, Kristen; Damron, Benny; Gray, Josh; Hammoud, Ahmad

    2018-01-01

    Electronics designed for space exploration missions must display efficient and reliable operation under extreme temperature conditions. For example, lunar outposts, Mars rovers and landers, James Webb Space Telescope, Europa orbiter, and deep space probes represent examples of missions where extreme temperatures and thermal cycling are encountered. Switching transistors, small signal as well as power level devices, are widely used in electronic controllers, data instrumentation, and power management and distribution systems. Little is known, however, about their performance in extreme temperature environments beyond their specified operating range; in particular under cryogenic conditions. This report summarizes preliminary results obtained on the evaluation of commercial-off-the-shelf (COTS) automotive-grade NPN small signal transistors over a wide temperature range and thermal cycling. The investigations were carried out to establish a baseline on functionality of these transistors and to determine suitability for use outside their recommended temperature limits.

  3. Absolute measurement of the extreme UV solar flux

    NASA Technical Reports Server (NTRS)

    Carlson, R. W.; Ogawa, H. S.; Judge, D. L.; Phillips, E.

    1984-01-01

    A windowless rare-gas ionization chamber has been developed to measure the absolute value of the solar extreme UV flux in the 50-575 Å region. Successful results were obtained on a solar-pointing sounding rocket. The ionization chamber, operated in total absorption, is an inherently stable absolute detector of ionizing UV radiation and was designed to be independent of effects from secondary ionization and gas effusion. The net error of the measurement is ±7.3 percent, which is primarily due to residual outgassing in the instrument; other errors, such as multiple ionization, photoelectron collection, and extrapolation to zero atmospheric optical depth, are small in comparison. For the day of the flight, Aug. 10, 1982, the solar irradiance (50-575 Å), normalized to unit solar distance, was found to be (5.71 ± 0.42) × 10^10 photons cm^-2 s^-1.

  4. A review of statistical methods to analyze extreme precipitation and temperature events in the Mediterranean region

    NASA Astrophysics Data System (ADS)

    Lazoglou, Georgia; Anagnostopoulou, Christina; Tolika, Konstantia; Kolyva-Machera, Fotini

    2018-04-01

    The increasing trend in the intensity and frequency of temperature and precipitation extremes during the past decades has substantial environmental and socioeconomic impacts. The objective of the present study is therefore to compare several statistical methods from extreme value theory (EVT) in order to identify which is the most appropriate for analyzing the behavior of extreme precipitation and of high and low temperature events in the Mediterranean region. The extremes were selected using both the block maxima and the peaks over threshold (POT) techniques, and consequently both the generalized extreme value (GEV) and generalized Pareto distributions (GPDs) were used to fit them. The results were compared in order to select the most appropriate distribution for characterizing the extremes. Moreover, this study evaluates the maximum likelihood estimation, L-moments, and Bayesian methods, based on both graphical and statistical goodness-of-fit tests. It was revealed that the GPD can accurately characterize both precipitation and temperature extreme events. Additionally, the GEV distribution with the Bayesian method proves appropriate, especially for the greatest values of the extremes. Another important objective of this investigation was the estimation of precipitation and temperature return levels for three return periods (50, 100, and 150 years), classifying the data into groups with similar characteristics. Finally, the return level values were estimated with both GEV and GPD and with the three different estimation methods, revealing that the selected method can affect the return level values for both precipitation and temperature.

  5. Stress transfer mechanisms at the submicron level for graphene/polymer systems.

    PubMed

    Anagnostopoulos, George; Androulidakis, Charalampos; Koukaras, Emmanuel N; Tsoukleri, Georgia; Polyzos, Ioannis; Parthenios, John; Papagelis, Konstantinos; Galiotis, Costas

    2015-02-25

    The stress transfer mechanism from a polymer substrate to a nanoinclusion, such as a graphene flake, is of extreme interest for the production of effective nanocomposites. Previous work conducted mainly at the micron scale has shown that the intrinsic mechanism of stress transfer is shear at the interface. However, since the interfacial shear takes its maximum value at the very edge of the nanoinclusion, it is important to assess the effect of edge integrity upon axial stress transfer at the submicron scale. Here, we conduct detailed Raman line mapping near the edges of a monolayer graphene flake that is simply supported on an epoxy-based photoresist (SU8)/poly(methyl methacrylate) matrix, at steps as small as 100 nm. We show for the first time that the distribution of axial strain (stress) along the flake deviates somewhat from the classical shear-lag prediction within a region of ∼2 μm from the edge. This behavior is mainly attributed to the presence of residual stresses, unintentional doping, and/or edge effects (deviation from the equilibrium values of bond lengths and angles, as well as different edge chiralities). By considering a simple balance of shear-to-normal stresses at the interface, we are able to directly convert the strain (stress) gradient to values of interfacial shear stress at all the applied tensile levels without assuming classical shear-lag behavior. For large flakes a maximum interfacial shear stress of 0.4 MPa is obtained prior to flake slipping.

  6. A Fiducial Approach to Extremes and Multiple Comparisons

    ERIC Educational Resources Information Center

    Wandler, Damian V.

    2010-01-01

    Generalized fiducial inference is a powerful tool for many difficult problems. Based on an extension of R. A. Fisher's work, we used generalized fiducial inference for two extreme value problems and a multiple comparison procedure. The first extreme value problem is dealing with the generalized Pareto distribution. The generalized Pareto…

  7. Genetic diversity and genomic resources available for the small millet crops to accelerate a New Green Revolution

    PubMed Central

    Goron, Travis L.; Raizada, Manish N.

    2015-01-01

    Small millets are nutrient-rich food sources traditionally grown and consumed by subsistence farmers in Asia and Africa. They include finger millet (Eleusine coracana), foxtail millet (Setaria italica), kodo millet (Paspalum scrobiculatum), proso millet (Panicum miliaceum), barnyard millet (Echinochloa spp.), and little millet (Panicum sumatrense). Local farmers value the small millets for their nutritional and health benefits, tolerance to extreme stress including drought, and ability to grow under low nutrient input conditions, ideal in an era of climate change and steadily depleting natural resources. Little scientific attention has been paid to these crops, hence they have been termed “orphan cereals.” Despite this challenge, an advantageous quality of the small millets is that they continue to be grown in remote regions of the world which has preserved their biodiversity, providing breeders with unique alleles for crop improvement. The purpose of this review, first, is to highlight the diverse traits of each small millet species that are valued by farmers and consumers which hold potential for selection, improvement or mechanistic study. For each species, the germplasm, genetic and genomic resources available will then be described as potential tools to exploit this biodiversity. The review will conclude with noting current trends and gaps in the literature and make recommendations on how to better preserve and utilize diversity within these species to accelerate a New Green Revolution for subsistence farmers in Asia and Africa. PMID:25852710

  8. Assessment of extremely low frequency magnetic field exposure from GSM mobile phones.

    PubMed

    Calderón, Carolina; Addison, Darren; Mee, Terry; Findlay, Richard; Maslanyj, Myron; Conil, Emmanuelle; Kromhout, Hans; Lee, Ae-kyoung; Sim, Malcolm R; Taki, Masao; Varsier, Nadège; Wiart, Joe; Cardis, Elisabeth

    2014-04-01

    Although radio frequency (RF) electromagnetic fields emitted by mobile phones have received much attention, relatively little is known about the extremely low frequency (ELF) magnetic fields emitted by phones. This paper summarises ELF magnetic flux density measurements on global system for mobile communications (GSM) mobile phones, conducted as part of the MOBI-KIDS epidemiological study. The main challenge is to identify a small number of generic phone models that can be used to classify the ELF exposure for the different phones reported in the study. Two-dimensional magnetic flux density measurements were performed on 47 GSM mobile phones at a distance of 25 mm. Maximum resultant magnetic flux density values at 217 Hz had a geometric mean of 221 (+198/-104) nT. Taking into account harmonic data, measurements suggest that mobile phones could make a substantial contribution to ELF exposure in the general population. The maximum values and easily available variables were poorly correlated. However, three groups could be defined on the basis of field pattern indicating that manufacturers and shapes of mobile phones may be the important parameters linked to the spatial characteristics of the magnetic field, and the categorization of ELF magnetic field exposure for GSM phones in the MOBI-KIDS study may be achievable on the basis of a small number of representative phones. Such categorization would result in a twofold exposure gradient between high and low exposure based on type of phone used, although there was overlap in the grouping. © 2013 Wiley Periodicals, Inc.

  9. The large sample size fallacy.

    PubMed

    Lantz, Björn

    2013-06-01

    Significance in the statistical sense has little to do with significance in the common practical sense. Statistical significance is a necessary but not a sufficient condition for practical significance. Hence, results that are extremely statistically significant may be highly nonsignificant in practice. The degree of practical significance is generally determined by the size of the observed effect, not the p-value. The results of studies based on large samples are often characterized by extreme statistical significance despite small or even trivial effect sizes. Interpreting such results as significant in practice without further analysis is referred to as the large sample size fallacy in this article. The aim of this article is to explore the relevance of the large sample size fallacy in contemporary nursing research. Relatively few nursing articles display explicit measures of observed effect sizes or include a qualitative discussion of observed effect sizes. Statistical significance is often treated as an end in itself. Effect sizes should generally be calculated and presented along with p-values for statistically significant results, and observed effect sizes should be discussed qualitatively through direct and explicit comparisons with the effects in related literature. © 2012 Nordic College of Caring Science.
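
    The fallacy described above is easy to demonstrate numerically. The sketch below (plain Python, with hypothetical data unrelated to the article) constructs two large groups whose means differ by a practically trivial 0.01 units, then computes Cohen's d and the two-sample t statistic:

```python
import math
from statistics import mean, stdev

# Hypothetical illustration: two large groups whose means differ by a
# trivially small amount (0.01 units) but whose t-statistic is "significant".
n = 200_000
group_a = [100.01 + (1 if i % 2 else -1) for i in range(n)]  # mean 100.01, sd ~1
group_b = [100.00 + (1 if i % 2 else -1) for i in range(n)]  # mean 100.00, sd ~1

pooled_sd = math.sqrt((stdev(group_a) ** 2 + stdev(group_b) ** 2) / 2)
cohens_d = (mean(group_a) - mean(group_b)) / pooled_sd   # observed effect size
t_stat = cohens_d * math.sqrt(n / 2)                     # two-sample t statistic

print(f"d = {cohens_d:.4f}, t = {t_stat:.2f}")
```

    With n = 200,000 per group, t ≈ 3.16 (p < 0.002 two-sided), yet d ≈ 0.01, far below the conventional "small effect" threshold of 0.2: statistically significant, practically negligible.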

  10. α′-corrected black holes in String Theory

    NASA Astrophysics Data System (ADS)

    Cano, Pablo A.; Meessen, Patrick; Ortín, Tomás; Ramírez, Pedro F.

    2018-05-01

    We consider the well-known solution of the Heterotic Superstring effective action to zeroth order in α′ that describes the intersection of a fundamental string with momentum and a solitonic 5-brane and which gives a 3-charge, static, extremal, supersymmetric black hole in 5 dimensions upon dimensional reduction on T5. We compute explicitly the first-order in α′ corrections to this solution, including SU(2) Yang-Mills fields which can be used to cancel some of these corrections, and we study the main properties of this α′-corrected solution: supersymmetry, values of the near-horizon and asymptotic charges, behavior under α′-corrected T-duality, value of the entropy (using the Wald formula directly in 10 dimensions), existence of small black holes, etc. The value obtained for the entropy agrees, within the limits of approximation, with that obtained by microscopic methods. The α′ corrections coming from Wald's formula prove crucial for this result.

  11. Impacts of climate change on precipitation and discharge extremes through the use of statistical downscaling approaches in a Mediterranean basin.

    PubMed

    Piras, Monica; Mascaro, Giuseppe; Deidda, Roberto; Vivoni, Enrique R

    2016-02-01

    The Mediterranean region is characterized by high precipitation variability, often enhanced by orography, with strong seasonality and large inter-annual fluctuations, and by high heterogeneity of terrain and land surface properties. As a consequence, catchments in this area are often prone to the occurrence of hydrometeorological extremes, including storms, floods and flash-floods. A number of climate studies focused on the Mediterranean region predict that extreme events will occur with higher intensity and frequency, thus requiring further analyses to assess their effect at the land surface, particularly in small- and medium-sized watersheds. In this study, climate and hydrologic simulations produced within the Climate Induced Changes on the Hydrology of Mediterranean Basins (CLIMB) EU FP7 research project were used to analyze how precipitation extremes propagate into discharge extremes in the Rio Mannu basin (472.5 km²), located in Sardinia, Italy. The basin hydrologic response to climate forcings in a reference (1971-2000) and a future (2041-2070) period was simulated through the combined use of a set of global and regional climate models, statistical downscaling techniques, and a process-based distributed hydrologic model. We analyzed and compared the distributions of annual maxima extracted from hourly and daily precipitation and peak discharge time series simulated by the hydrologic model under climate forcing. For this aim, yearly maxima were fitted with the Generalized Extreme Value (GEV) distribution using a regional approach. Next, we discuss common and contrasting behaviors of the precipitation and discharge maxima distributions to better understand how hydrological transformations affect the propagation of extremes. Finally, we show how rainfall statistical downscaling algorithms produce more reliable forcings for hydrological models than coarse climate model outputs. Copyright © 2015 Elsevier B.V. All rights reserved.

  12. Closing the Gap: An Analysis of Options for Improving the USAF Fighter Fleet from 2015 to 2035

    DTIC Science & Technology

    2015-10-01

    capacity. The CBO predicts an increase in capacity for both large, or 2000 lbs class, weapons and small, either 500 lbs class or Small Diameter Bomb ... Laser Guided Bomb (LGB) designed to penetrate extremely hardened bunkers with extreme accuracy. ... Larger weapons can provide better standoff range ... operate with impunity in low intensity CAS scenarios. While survivability, with the exception of against small arms ground fire, is far less a

  13. Assessing changes in extreme convective precipitation from a damage perspective

    NASA Astrophysics Data System (ADS)

    Schroeer, K.; Tye, M. R.

    2016-12-01

    Projected increases in high-intensity, short-duration convective precipitation are expected even in regions that are likely to become more arid. Such high-intensity precipitation events can trigger hazardous flash floods, debris flows and landslides that put people and local assets at risk. However, the assessment of local-scale precipitation extremes is hampered by their high spatial and temporal variability. In addition, not only are extreme events rare, but such small-scale events are likely to be underreported where they do not coincide with the observation network. Rather than focusing solely on the convective precipitation, understanding the characteristics of those extremes which drive damage may be more effective for assessing future risks. Two sources of data are used in this study. First, sub-daily precipitation observations over the Southern Alps enable an examination of seasonal and regional patterns in high-intensity convective precipitation and their relationship with weather types. Secondly, reports of private loss and damage on a household scale are used to identify which events are most damaging, or what conditions potentially enhance the vulnerability to these extremes. This study explores the potential added value of including recorded loss and damage data to understand the risks from summertime convective precipitation events. By relating precipitation-generating weather types to the severity of damage we hope to develop a mechanism to assess future risks. A further benefit would be to identify from damage reports the likely occurrence of precipitation extremes where no direct observations are available and use this information to validate remotely sensed observations.

  14. Seasonal and Non-Seasonal Generalized Pareto Distribution to Estimate Extreme Significant Wave Height in The Banda Sea

    NASA Astrophysics Data System (ADS)

    Nursamsiah; Nugroho Sugianto, Denny; Suprijanto, Jusup; Munasik; Yulianto, Bambang

    2018-02-01

    Information on extreme wave height return levels is required for maritime planning and management. A recommended approach for analyzing extreme waves is the Generalized Pareto Distribution (GPD). Seasonal variation is often considered in extreme wave models. This research aims to identify the best GPD model by considering the seasonal variation of the extreme waves. Using the 95th percentile of significant wave height as the threshold for extremes, both seasonal and non-seasonal GPD models were fitted. The Kolmogorov-Smirnov test was applied to assess the goodness of fit of each GPD model. The return values from the seasonal and non-seasonal GPD were compared using the definition of the return value as the criterion. The Kolmogorov-Smirnov test shows that the GPD fits the data very well in both the seasonal and non-seasonal models. The seasonal return values give better information about the wave height characteristics.
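
    The peaks-over-threshold workflow described here can be sketched as follows. This is a minimal NumPy-only illustration: the synthetic wave series, the method-of-moments fit, and the hand-rolled Kolmogorov-Smirnov statistic are stand-ins for the authors' procedure, whose exact details are not given in the abstract.

```python
import numpy as np

rng = np.random.default_rng(42)
# Hypothetical stand-in for hourly significant wave height (m): the Banda Sea
# records are not public, so an exponential-tailed synthetic series is used.
hs = rng.exponential(scale=0.8, size=10_000) + 0.5

threshold = np.quantile(hs, 0.95)            # 95th percentile as POT threshold
excess = np.sort(hs[hs > threshold] - threshold)

# Method-of-moments GPD fit: CV^2 = 1/(1 - 2*xi)  =>  xi = (1 - m^2/s2) / 2
m, s2 = excess.mean(), excess.var(ddof=1)
xi = 0.5 * (1.0 - m * m / s2)
sigma = m * (1.0 - xi)

# Kolmogorov-Smirnov statistic of the exceedances against the fitted GPD
base = np.maximum(1.0 + xi * excess / sigma, 1e-12)
cdf = 1.0 - base ** (-1.0 / xi)
n = len(excess)
d_ks = max(np.max(np.arange(1, n + 1) / n - cdf), np.max(cdf - np.arange(n) / n))
print(f"xi = {xi:.3f}, sigma = {sigma:.3f}, KS D = {d_ks:.3f}")
```

    A small KS statistic (well below the critical value for ~500 exceedances) indicates that the fitted GPD is an adequate tail model, mirroring the goodness-of-fit check reported in the abstract.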

  15. Temporal development of extreme precipitation in Germany projected by EURO-CORDEX simulations

    NASA Astrophysics Data System (ADS)

    Brendel, Christoph; Deutschländer, Thomas

    2017-04-01

    A sustainable operation of transport infrastructure requires an enhanced resilience to the increasing impacts of climate change and related extreme meteorological events. To meet this challenge, the German Federal Ministry of Transport and Digital Infrastructure (BMVI) commenced a comprehensive national research program on safe and sustainable transport in Germany. A network of departmental research institutes addresses the "Adaptation of the German transport infrastructure towards climate change and extreme events". Various studies have already identified an increase in average global precipitation for the 20th century. There is some indication that these increases are most visible in a rising frequency of precipitation extremes. However, the changes are highly variable between regions and seasons. With a further increase of atmospheric greenhouse gas concentrations in the 21st century, the likelihood of occurrence of such extreme events will continue to rise. A kernel estimator has been used in order to obtain a robust estimate of the temporal development of extreme precipitation events projected by an ensemble of EURO-CORDEX simulations. The kernel estimator measures the intensity of the Poisson point process, indicating temporal changes in the frequency of extreme events. Extreme precipitation events were selected using the peaks-over-threshold (POT) method with the 90th, 95th and 99th quantiles of daily precipitation sums as thresholds. Application of this non-parametric approach with relative thresholds renders a bias correction non-mandatory. In addition, in comparison to fitting an extreme value theory (EVT) distribution, the method is insensitive to outliers. First results show an overall increase of extreme precipitation events for Germany until the end of the 21st century. However, major differences between seasons, quantiles and the three Representative Concentration Pathways (RCP 2.6, 4.5, and 8.5) have been identified. For instance, the frequency of extreme precipitation events more than triples in the most extreme scenario. Regional differences are rather small, with the largest increase in northern Germany, particularly in coastal regions, and the weakest increase in the southernmost parts of Germany.
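
    The kernel-intensity idea can be illustrated with a toy series. In the NumPy sketch below, the gamma rainfall model, the imposed trend, the bandwidth and the seed are all illustrative assumptions, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(7)
# Hypothetical daily precipitation sums (mm) over 60 model years; the trend
# factor mimics a slowly increasing frequency of heavy events.
days = 60 * 365
t = np.arange(days)
rain = rng.gamma(shape=0.4, scale=4.0, size=days) * (1.0 + 0.5 * t / days)

def exceedance_intensity(series, q, bandwidth=5 * 365):
    """Kernel estimate of the Poisson intensity of peaks-over-threshold events."""
    thr = np.quantile(series, q)
    events = np.flatnonzero(series > thr)          # POT occurrence times (days)
    grid = np.linspace(0, len(series), 50)
    # Gaussian kernel smoother of event counts -> events per day on the grid
    w = np.exp(-0.5 * ((grid[:, None] - events[None, :]) / bandwidth) ** 2)
    return grid, w.sum(axis=1) / (bandwidth * np.sqrt(2 * np.pi))

grid, lam = exceedance_intensity(rain, 0.99)
print(f"intensity rises from {lam[5]:.4f} to {lam[-6]:.4f} events/day")
```

    Because the 99th-quantile threshold is computed over the whole series, the upward trend concentrates exceedances in later years, and the smoothed intensity curve rises accordingly, which is exactly the kind of temporal signal the kernel estimator is meant to expose.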

  16. An Investigation on the Effect of Extremely Low Frequency Pulsed Electromagnetic Fields on Human Electrocardiograms (ECGs).

    PubMed

    Fang, Qiang; Mahmoud, Seedahmed S; Yan, Jiayong; Li, Hui

    2016-11-23

    For this investigation, we studied the effects of extremely low frequency pulsed electromagnetic fields (ELF-PEMF) on the human cardiac signal. Electrocardiograms (ECGs) of 22 healthy volunteers before and after a short duration of ELF-PEMF exposure were recorded. The experiment was conducted under single-blind conditions. The root mean square (RMS) value of the recorded data was used as the comparison criterion. We also measured and analysed four important ECG time intervals before and after ELF-PEMF exposure. Results revealed that the RMS value of the ECG recordings from 18 participants (81.8% of the total participants) increased, with a mean increase of 3.72%. The increase in ECG voltage levels was then verified by a second experimental protocol with a control exposure. In addition, we used hyperbolic T-distributions (HTD) in the analysis of ECG signals to verify the change in the RR interval. It was found that there were small shifts in the frequency-domain signal before and after EMF exposure. This shift influenced all frequency components of the ECG signals, as all spectra were shifted. This investigation shows that a short exposure to ELF-PEMF can affect the properties of ECG signals. Further study is needed to consolidate this finding and to discover more about the biological effects of ELF-PEMF on human physiological processes.

  17. Nonparametric functional data estimation applied to ozone data: prediction and extreme value analysis.

    PubMed

    Quintela-del-Río, Alejandro; Francisco-Fernández, Mario

    2011-02-01

    The study of extreme values and prediction of ozone data is an important topic of research when dealing with environmental problems. Classical extreme value theory is usually used in air-pollution studies. It consists in fitting a parametric generalised extreme value (GEV) distribution to a data set of extreme values, and using the estimated distribution to compute return levels and other quantities of interest. Here, we propose to estimate these values using nonparametric functional data methods. Functional data analysis is a relatively new statistical methodology that generally deals with data consisting of curves or multi-dimensional variables. In this paper, we use this technique, jointly with nonparametric curve estimation, to provide alternatives to the usual parametric statistical tools. The nonparametric estimators are applied to real samples of maximum ozone values obtained from several monitoring stations belonging to the Automatic Urban and Rural Network (AURN) in the UK. The results show that nonparametric estimators work satisfactorily, outperforming the behaviour of classical parametric estimators. Functional data analysis is also used to predict stratospheric ozone concentrations. We show an application, using the data set of mean monthly ozone concentrations in Arosa, Switzerland, and the results are compared with those obtained by classical time series (ARIMA) analysis. Copyright © 2010 Elsevier Ltd. All rights reserved.
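
    To give a flavor of the nonparametric alternative, the sketch below estimates a T-year return level from a kernel-smoothed empirical CDF of annual maxima. The Gumbel toy data, the Silverman rule-of-thumb bandwidth and the bisection search are illustrative assumptions, not the authors' estimator:

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(1)
# Hypothetical annual ozone maxima (µg/m³); real AURN series are not bundled here.
maxima = 120 + 15 * rng.gumbel(size=40)

def np_return_level(sample, T):
    """T-year return level z solving F_hat(z) = 1 - 1/T, where F_hat is a
    Gaussian-kernel estimate of the CDF (Silverman rule-of-thumb bandwidth)."""
    h = 1.06 * sample.std(ddof=1) * len(sample) ** -0.2
    def F(z):
        return float(np.mean([0.5 * (1 + erf((z - x) / (h * sqrt(2)))) for x in sample]))
    lo, hi = sample.min() - 3 * h, sample.max() + 10 * h
    for _ in range(60):                      # bisection for the smoothed quantile
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if F(mid) < 1 - 1 / T else (lo, mid)
    return 0.5 * (lo + hi)

z20 = np_return_level(maxima, T=20)
print(f"20-year return level ≈ {z20:.1f} µg/m³")
```

    Unlike the parametric GEV route, no distributional shape is assumed: the return level is read directly off the smoothed sample CDF, which is the spirit of the nonparametric functional estimators the paper advocates.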

  18. Investigating NARCCAP Precipitation Extremes via Bivariate Extreme Value Theory (Invited)

    NASA Astrophysics Data System (ADS)

    Weller, G. B.; Cooley, D. S.; Sain, S. R.; Bukovsky, M. S.; Mearns, L. O.

    2013-12-01

    We introduce methodology from statistical extreme value theory to examine the ability of reanalysis-driven regional climate models to simulate past daily precipitation extremes. Going beyond a comparison of summary statistics such as 20-year return values, we study whether the most extreme precipitation events produced by climate model simulations exhibit correspondence to the most extreme events seen in observational records. The extent of this correspondence is formulated via the statistical concept of tail dependence. We examine several case studies of extreme precipitation events simulated by the six models of the North American Regional Climate Change Assessment Program (NARCCAP) driven by NCEP reanalysis. It is found that the NARCCAP models generally reproduce daily winter precipitation extremes along the Pacific coast quite well; in contrast, simulation of past daily summer precipitation extremes in a central US region is poor. Some differences in the strength of extremal correspondence are seen in the central region between models which employ spectral nudging and those which do not. We demonstrate how these techniques may be used to draw a link between extreme precipitation events and large-scale atmospheric drivers, as well as to downscale extreme precipitation simulated by a future run of a regional climate model. Specifically, we examine potential future changes in the nature of extreme precipitation along the Pacific coast produced by the pineapple express (PE) phenomenon. A link between extreme precipitation events and a "PE Index" derived from North Pacific sea-surface pressure fields is found. This link is used to study PE-influenced extreme precipitation produced by a future-scenario climate model run.

  19. Self-force as a cosmic censor in the Kerr overspinning problem

    NASA Astrophysics Data System (ADS)

    Colleoni, Marta; Barack, Leor; Shah, Abhay G.; van de Meent, Maarten

    2015-10-01

    It is known that a near-extremal Kerr black hole can be spun up beyond its extremal limit by capturing a test particle. Here we show that overspinning is always averted once backreaction from the particle's own gravity is properly taken into account. We focus on nonspinning, uncharged, massive particles thrown in along the equatorial plane and work in the first-order self-force approximation (i.e., we include all relevant corrections to the particle's acceleration through linear order in the ratio, assumed small, between the particle's energy and the black hole's mass). Our calculation is a numerical implementation of a recent analysis by two of us [Phys. Rev. D 91, 104024 (2015)], in which a necessary and sufficient "censorship" condition was formulated for the capture scenario, involving certain self-force quantities calculated on the one-parameter family of unstable circular geodesics in the extremal limit. The self-force information accounts both for radiative losses and for the finite-mass correction to the critical value of the impact parameter. Here we obtain the required self-force data and present strong evidence to suggest that captured particles never drive the black hole beyond its extremal limit. We show, however, that, within our first-order self-force approximation, it is possible to reach the extremal limit with a suitable choice of initial orbital parameters. To rule out such a possibility would require (currently unavailable) information about higher-order self-force corrections.

  20. Climate and its change over the Tibetan Plateau and its Surroundings in 1963-2015

    NASA Astrophysics Data System (ADS)

    Ding, J.; Cuo, L.

    2017-12-01

    The Tibetan Plateau and its surroundings (TPS, 23°-43°N, 73°-106°E) lie in the southwest of China and include the Tibet Autonomous Region, Qinghai Province, southern Xinjiang Uygur Autonomous Region, part of Gansu Province, western Sichuan Province, and northern Yunnan Province. The region is of strategic importance for water resources because it is the headwater of ten large rivers that support more than 1.6 billion people. In this study, we use daily maximum and minimum temperature, precipitation and wind speed for 1963-2015, obtained from the Climate Data Center of the China Meteorological Administration and the Qinghai Meteorological Bureau, to investigate extreme climate conditions and their changes over the TPS. The extreme events are selected based on annual extreme values and percentiles. The annual extreme value approach produces one value per year for each variable, which enables us to examine the magnitude of extreme events, whereas the percentile approach selects extreme values by setting the 95th percentile as the threshold for maximum temperature, precipitation and wind speed, and the 5th percentile for minimum temperature. The percentile approach enables us to investigate not only the magnitude but also the frequency of the extreme events. In addition, Mann-Kendall trend and mutation analyses were applied to analyze the changes in mean and extreme conditions. The results will help us understand more about the extreme events during the past five decades on the TPS and will provide valuable information for the upcoming IPCC reports on climate change.
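
    The two selection approaches contrasted above can be sketched in a few lines. Here synthetic Gaussian "daily maximum temperature" stands in for the station records, which are not public:

```python
import numpy as np

rng = np.random.default_rng(3)
years = np.repeat(np.arange(1963, 2016), 365)        # daily records, 1963-2015
tmax = rng.normal(10, 8, size=years.size)            # hypothetical daily Tmax (°C)

# Annual-extreme-value approach: one value per year (magnitude of extremes)
annual_max = np.array([tmax[years == y].max() for y in np.unique(years)])

# Percentile approach: 95th-percentile threshold (magnitude AND frequency)
thr = np.quantile(tmax, 0.95)
hot_days_per_year = np.array([(tmax[years == y] > thr).sum() for y in np.unique(years)])

print(len(annual_max), hot_days_per_year.mean())
```

    The block-maxima series supports magnitude analysis (e.g. a GEV fit), while the per-year exceedance counts additionally expose trends in how often extremes occur, which is the advantage of the percentile approach noted in the abstract.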

  1. Uncertainty in Calibration, Detection and Estimation of Metal Concentrations in Engine Plumes Using OPAD

    NASA Technical Reports Server (NTRS)

    Hopkins, Randall C.; Benzing, Daniel A.

    1998-01-01

    Improvements in uncertainties in the values of radiant intensity (I) can be accomplished mainly by improvements in the calibration process and by minimizing the difference between the background and engine plume radiance. For engine tests in which the plume is extremely bright, the difference in luminance between the calibration lamp and the engine plume radiance can be so large as to cause relatively large uncertainties in the values of I. This is due to the small aperture necessary on the receiving optics to avoid saturating the instrument. However, this is not a problem with the SSME engine, since liquid oxygen/hydrogen combustion is not as bright as that of some other fuels. Applying the instrumentation to other types of engine tests may require a much brighter calibration lamp.

  2. Last stand of single small field inflation

    NASA Astrophysics Data System (ADS)

    Bramante, Joseph; Lehman, Landon; Martin, Adam; Downes, Sean

    2014-07-01

    By incorporating both the tensor-to-scalar ratio and the measured value of the spectral index, we set a bound on solo small field inflation of Δϕ/m_Pl ≥ 1.00 √(r/0.1). Unlike previous bounds which require monotonic ɛ_V, |η_V| < 1, and 60 e-folds of inflation, the bound remains valid for nonmonotonic ɛ_V, |η_V| ≳ 1, and for inflation which occurs only over the eight e-folds which have been observed on the cosmic microwave background. The negative value of the spectral index over the observed eight e-folds is what makes the bound strong; we illustrate this by surveying single field models and finding that for r ≳ 0.1 and eight e-folds of inflation, there is no simple potential which reproduces observed cosmic microwave background perturbations and remains sub-Planckian. Models that are sub-Planckian after eight e-folds must be patched together with a second epoch of inflation that fills out the remaining ~50 e-folds. This second, post-cosmic-microwave-background epoch is characterized by extremely small ɛ_V and therefore an increasing scalar power spectrum. Using the fact that large power can overabundantly produce primordial black holes, we bound the maximum energy level of the second phase of inflation.

  3. Spatiotemporal variation and statistical characteristic of extreme precipitation in the middle reaches of the Yellow River Basin during 1960-2013

    NASA Astrophysics Data System (ADS)

    Zhang, Yin; Xia, Jun; She, Dunxian

    2018-01-01

    In recent decades, extreme precipitation events have become a research hotspot worldwide. Based on 12 extreme precipitation indices, the spatiotemporal variation and statistical characteristics of precipitation extremes in the middle reaches of the Yellow River Basin (MRYRB) during 1960-2013 were investigated. The results showed that the values of most extreme precipitation indices (except consecutive dry days (CDD)) increased from the northwest to the southeast of the MRYRB, reflecting that the southeast was the wettest region in the study area. Temporally, the precipitation extremes presented a drying trend with less frequent precipitation events. The Generalized Extreme Value (GEV) distribution was selected to fit the time series of all indices, and the quantile values for the 50-year return period showed a spatial pattern similar to that of the corresponding extreme precipitation indices during 1960-2013, indicating a higher risk of extreme precipitation in the southeast of the MRYRB. Furthermore, the changes in the probability distribution functions of the indices between the periods 1960-1986 and 1987-2013 revealed a drying tendency in the study area. Both the El Niño-Southern Oscillation (ENSO) and the Pacific Decadal Oscillation (PDO) were shown to have a strong influence on precipitation extremes in the MRYRB. The results of this study are useful for understanding the changing patterns of local precipitation extremes, which will help in preventing the natural hazards they cause.
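
    The 50-year quantile used in studies like this follows from inverting the GEV distribution function. A minimal sketch, with hypothetical parameter values rather than estimates fitted to the MRYRB data:

```python
import math

def gev_return_level(mu, sigma, xi, T):
    """T-year return level z_T solving GEV CDF(z_T) = 1 - 1/T."""
    y = -math.log(1.0 - 1.0 / T)          # Gumbel reduced variate
    if abs(xi) < 1e-9:                    # Gumbel limit (xi -> 0)
        return mu - sigma * math.log(y)
    return mu - sigma / xi * (1.0 - y ** -xi)

# Hypothetical fit for an annual-maximum 1-day precipitation index (mm)
z50 = gev_return_level(mu=50.0, sigma=10.0, xi=0.1, T=50)
print(f"50-year return level = {z50:.1f} mm")  # → 97.7 mm
```

    A positive shape parameter (heavy tail) pushes the 50-year level well above the Gumbel (xi = 0) value, which is why mapping the fitted quantiles highlights the high-risk southeast of the basin.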

  4. REMO poor man's reanalysis

    NASA Astrophysics Data System (ADS)

    Ries, H.; Moseley, C.; Haensler, A.

    2012-04-01

    Reanalyses depict the state of the atmosphere as a best fit in space and time of many atmospheric observations in a physically consistent way. By essentially solving the data assimilation problem in a very accurate manner, reanalysis results can be used as a reference for model evaluation procedures and as forcing data sets for different model applications. However, the spatial resolution of the most common and accepted reanalysis data sets (e.g. JRA25, ERA-Interim) ranges from approximately 124 km to 80 km. This resolution is too coarse to simulate certain small-scale processes often associated with extreme events. In addition, many models need higher-resolution forcing data (e.g. land-surface models, tools for identifying and assessing hydrological extremes). Therefore we downscaled the ERA-Interim reanalysis over the EURO-CORDEX domain for the period 1989 to 2008 to a horizontal resolution of approximately 12 km. The downscaling is performed by nudging REMO simulations to the lower and lateral boundary conditions of the reanalysis, and by re-initializing the model every 24 hours ("REMO in forecast mode"). In this study the three following questions will be addressed: 1.) Does the REMO poor man's reanalysis meet the needs (accuracy, extreme value distribution) for validation and forcing? 2.) What lessons can be learned about the model used for downscaling? As REMO is used as a pure downscaling procedure, any systematic deviations from ERA-Interim result from poor process modelling and not from predictability limitations. 3.) How much small-scale information generated by the downscaling model is lost with frequent initializations? A comparison to a simulation performed in climate mode will be presented.

  5. New digital capacitive measurement system for blade clearances

    NASA Astrophysics Data System (ADS)

    Moenich, Marcel; Bailleul, Gilles

    This paper presents a totally new concept for tip blade clearance evaluation in turbine engines. The system is able to detect exact 'measurands' even under high temperature and severe conditions such as ionization. The system is based on a heavy-duty probe head, a miniaturized thick-film hybrid electronic circuit and a signal processing unit for real-time computing. The high-frequency individual measurement values are digitally filtered and linearized in real time. The electronics are built in hybrid technology and can therefore be kept extremely small and robust, so that the system can be used on actual flights.

  6. On the thermal stability of graphone

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Podlivaev, A. I.; Openov, L. A., E-mail: LAOpenov@mephi.ru

    2011-07-15

    Molecular dynamics simulation is used to study thermally activated migration of hydrogen atoms in graphone, a magnetic semiconductor formed of a graphene monolayer with one side covered with hydrogen. The temperature dependence of the characteristic time of disordering of graphone via hopping of hydrogen atoms to neighboring carbon atoms is established directly. The activation energy of this process is determined to be E_a = (0.05 ± 0.01) eV. The small value of E_a is indicative of the extremely low thermal stability of graphone. This low stability presents a serious handicap for practical use of the material in nanoelectronics.
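
    A back-of-the-envelope Arrhenius estimate shows why E_a ≈ 0.05 eV implies such low stability. The attempt frequency below is an assumed typical phonon value, not a number from the paper:

```python
import math

K_B = 8.617e-5          # Boltzmann constant, eV/K
E_A = 0.05              # activation energy from the paper, eV
NU = 1e13               # assumed attempt frequency, Hz (typical phonon scale)

for T in (77, 300):
    rate = NU * math.exp(-E_A / (K_B * T))          # Arrhenius hop rate, 1/s
    print(f"T = {T:4d} K: one hop roughly every {1.0 / rate:.1e} s")
```

    Even at liquid-nitrogen temperature the estimated hop time is sub-nanosecond, so the hydrogen sublattice disorders almost instantly under this assumption, consistent with the paper's conclusion.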

  7. Investigation of CNTD Silicon Nitride on Complex Shapes.

    DTIC Science & Technology

    1980-02-06

    (Apparatus schematic and gas-line details garbled in extraction.) Total gas flow rate: 47,350 cm³/min. Gas composition: N2 = 25,000 ml/min, H2 = 20,600 ml/min, Ar = 700 ml/min, NH3 = 650 ml/min, SiCl4 = 400 ml/min. ... Values for SiCl4 were not available. In addition, the fractions of these two gases are extremely small in the overall gas composition. Individual

  8. Electrical properties of titanium dioxide nanoparticle on microelectrode: Gap size effect

    NASA Astrophysics Data System (ADS)

    Nadzirah, Sh.; Hashim, U.; Zakaria, M. R.; Rusop, M.

    2018-05-01

    A TiO2 nanoparticle based interdigitated microelectrode was fabricated by spin-coating and conventional photolithography. Aluminum metal was deposited by thermal evaporation on a silicon dioxide substrate. The effect of aluminum microelectrode gap sizes (4, 5 and 6 µm) on the electrical performance was investigated using a picoammeter. Extremely small output currents were acquired for the three gap sizes. A characteristic electrical behavior was observed for the studied geometry. The output current varied with gap size, with measured values of 2.28×10⁻¹⁰, 1.32×10⁻⁹ and 2.38×10⁻⁹ A.

  9. Extreme value analysis in biometrics.

    PubMed

    Hüsler, Jürg

    2009-04-01

    We review some approaches to extreme value analysis in the context of biometrical applications. Classical extreme value analysis is based on iid random variables. Two different general methods are applied, which are discussed together with biometrical examples. Various estimation, testing and goodness-of-fit procedures for applications are discussed. Furthermore, some non-classical situations are considered where the data are possibly dependent, where non-stationary behavior is observed in the data, or where the observations are not univariate. A few open problems are also stated.

  10. Human disease mortality kinetics are explored through a chain model embodying principles of extreme value theory and competing risks.

    PubMed

    Juckett, D A; Rosenberg, B

    1992-04-21

    The distributions for human disease-specific mortality exhibit two striking characteristics: survivorship curves that intersect near the longevity limit; and, the clustering of best-fitting Weibull shape parameter values into groups centered on integers. Correspondingly, we have hypothesized that the distribution intersections result from either competitive processes or population partitioning and the integral clustering in the shape parameter results from the occurrence of a small number of rare, rate-limiting events in disease progression. In this report we initiate a theoretical examination of these questions by exploring serial chain model dynamics and parameteric competing risks theory. The links in our chain models are composed of more than one bond, where the number of bonds in a link are denoted the link size and are the number of events necessary to break the link and, hence, the chain. We explored chains with all links of the same size or with segments of the chain composed of different size links (competition). Simulations showed that chain breakage dynamics depended on the weakest-link principle and followed kinetics of extreme-values which were very similar to human mortality kinetics. In particular, failure distributions for simple chains were Weibull-type extreme-value distributions with shape parameter values that were identifiable with the integral link size in the limit of infinite chain length. Furthermore, for chains composed of several segments of differing link size, the survival distributions for the various segments converged at a point in the S(t) tails indistinguishable from human data. This was also predicted by parameteric competing risks theory using Weibull underlying distributions. In both the competitive chain simulations and the parametric competing risks theory, however, the shape values for the intersecting distributions deviated from the integer values typical of human data. 
We conclude that rare events can be the source of integral shapes in human mortality, that convergence is a salient feature of multiple endpoints, but that pure competition may not be the best explanation for the exact type of convergence observable in human mortality. Finally, while the chain models were not motivated by any specific biological structures, interesting biological correlates to them may be useful in gerontological research.
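The weakest-link mechanism described above can be illustrated numerically. This is a toy reconstruction under our own assumptions (unit-rate exponential bond lifetimes, arbitrarily chosen chain and trial counts), not the authors' simulation code: a link of k bonds fails when its last bond fails, the chain fails at its weakest link, and the fitted Weibull shape should approach the link size k.

```python
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(0)
n_trials, n_links, k = 2000, 500, 3   # illustrative sizes

# Bond lifetimes ~ exponential; a link breaks when all k bonds have failed.
bond_lifetimes = rng.exponential(1.0, size=(n_trials, n_links, k))
link_lifetimes = bond_lifetimes.max(axis=2)   # link fails at its last bond
chain_lifetimes = link_lifetimes.min(axis=1)  # weakest-link principle

# For long chains the failure times approach a Weibull with shape ~ k.
shape, loc, scale = weibull_min.fit(chain_lifetimes, floc=0)
print(f"fitted Weibull shape: {shape:.2f} (link size k = {k})")
```

With longer chains the fitted shape converges more tightly to the integral link size, mirroring the infinite-chain limit stated in the abstract.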

  11. Effect of elevation on extreme precipitation of short durations: evidences of orographic signature on the parameters of Depth-Duration-Frequency curves

    NASA Astrophysics Data System (ADS)

    Avanzi, Francesco; De Michele, Carlo; Gabriele, Salvatore; Ghezzi, Antonio; Rosso, Renzo

    2015-04-01

Here, we show how atmospheric circulation and topography rule the variability of depth-duration-frequency (DDF) curve parameters, and we discuss the physical implications of this variability for the formation of extreme precipitation at high elevations. A DDF curve gives the value of the maximum annual precipitation depth H as a function of the duration D and the probability level F. We consider around 1500 stations over the Italian territory, each with at least 20 years of data on maximum annual precipitation depth at different durations. We estimated the DDF parameters at each location by using the asymptotic distribution of extreme values, i.e. the Generalized Extreme Value (GEV) distribution, together with a simple scaling (statistical scale-invariance) hypothesis. Consequently, a DDF curve depends on five parameters. A first set relates H to the duration (namely, the mean value of the annual maximum precipitation depth for unit duration and the scaling exponent), while a second set links H to F (namely, a scale, position and shape parameter). The value of the shape parameter determines the type of random variable (unbounded, or upper or lower bounded). This extensive analysis shows that the variability of the mean value of the annual maximum precipitation depth for unit duration reflects the coupled effect of topography and the modal direction of moisture flux during extreme events. Median values of this parameter decrease with elevation. We call this phenomenon the "reverse orographic effect" on extreme precipitation of short durations, since it contrasts with general knowledge about the orographic effect on mean precipitation. Moreover, the scaling exponent is mainly driven by topography alone (with increasing values at increasing elevations). Therefore, the quantiles of H(D,F) at durations greater than the unit duration turn out to be more variable at high elevations than at low elevations.
Additionally, the analysis of the variability of the shape parameter with elevation shows that extreme events at high elevations appear to be distributed according to an upper-bounded probability distribution. This evidence could be a characteristic signature of the formation of extreme precipitation events at high elevations.
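For readers unfamiliar with the GEV machinery this record relies on, a minimal sketch of fitting annual maxima and reading the bound type off the shape parameter might look as follows. The synthetic data and parameter values are our own, not the paper's; note that scipy parameterizes the GEV with c equal to the negative of the climatological shape ξ, so c > 0 indicates an upper-bounded distribution.

```python
import numpy as np
from scipy.stats import genextreme

# Synthetic "annual maxima" drawn from an upper-bounded GEV (c > 0 in
# scipy's convention); all numbers are illustrative assumptions.
rng = np.random.default_rng(1)
annual_maxima = genextreme.rvs(c=0.2, loc=40.0, scale=10.0,
                               size=2000, random_state=rng)

c_hat, loc_hat, scale_hat = genextreme.fit(annual_maxima)
kind = "upper bounded" if c_hat > 0 else "unbounded above"
print(f"fitted shape c = {c_hat:.2f} -> {kind}")
```

In the paper's setting, a positive shape (in this convention) at high-elevation stations is exactly the upper-bounded behavior the authors report.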

  12. Open mHealth Architecture: A Primer for Tomorrow's Orthopedic Surgeon and Introduction to Its Use in Lower Extremity Arthroplasty.

    PubMed

    Ramkumar, Prem N; Muschler, George F; Spindler, Kurt P; Harris, Joshua D; McCulloch, Patrick C; Mont, Michael A

    2017-04-01

The recent private-public partnership to unlock and utilize all available health data has large-scale implications for public health and personalized medicine, especially within orthopedics. Today, consumer-based technologies such as smartphones and "wearables" store tremendous amounts of personal health data (known as "mHealth") that, when processed and contextualized, can open new windows of insight for orthopedic surgeons about their patients. In the present report, the landscape, role, and future technical considerations of mHealth and open architecture are defined, with particular examples in lower extremity arthroplasty. A limitation of the current mHealth landscape is fragmentation and the lack of interconnectivity between the myriad available apps. The importance of an open mHealth architecture, currently lacking, is underscored by its promise of improved research, increased workflow efficiency, and value capture for the orthopedic surgeon. There exists an opportunity to leverage existing mobile health data for orthopedic surgeons, particularly those specializing in lower extremity arthroplasty, by transforming patient small data into insightful big data through the implementation of an "open" architecture that affords universal data standards and a globally interconnected network. Copyright © 2016 Elsevier Inc. All rights reserved.

  13. A radar-based regional extreme rainfall analysis to derive the thresholds for a novel automatic alert system in Switzerland

    NASA Astrophysics Data System (ADS)

    Panziera, Luca; Gabella, Marco; Zanini, Stefano; Hering, Alessandro; Germann, Urs; Berne, Alexis

    2016-06-01

This paper presents a regional extreme rainfall analysis based on 10 years of radar data for the 159 regions adopted for official natural hazard warnings in Switzerland. Moreover, a nowcasting tool aimed at issuing heavy precipitation regional alerts is introduced. The two topics are closely related, since the extreme rainfall analysis provides the thresholds used by the nowcasting system for the alerts. Warm and cold seasons' monthly maxima of several statistical quantities describing regional rainfall are fitted to a generalized extreme value distribution in order to derive the precipitation amounts corresponding to sub-annual return periods for durations of 1, 3, 6, 12, 24 and 48 h. It is shown that regional return levels exhibit a large spatial variability in Switzerland, and that their spatial distribution strongly depends on the duration of the aggregation period: for accumulations of 3 h and shorter, the largest return levels are found over the northerly alpine slopes, whereas for longer durations the southern Alps exhibit the largest values. The inner alpine chain shows the lowest values, in agreement with previous rainfall climatologies. The nowcasting system presented here aims to issue heavy rainfall alerts for a large variety of end users, who are interested in different precipitation characteristics and regions, such as small urban areas, remote alpine catchments or administrative districts. The alerts are issued not only if the rainfall measured in the immediate past or forecast in the near future exceeds predefined thresholds, but also as soon as the sum of past and forecast precipitation is larger than the threshold values. This precipitation total, in fact, is of primary importance in applications for which antecedent rainfall is as important as the predicted rainfall, such as urban flood early-warning systems.
The rainfall fields, the statistical quantity representing regional rainfall, and the frequency of alerts issued in case of continuous threshold exceedance are some of the configurable parameters of the tool. The analysis of the urban flood which occurred in the city of Schaffhausen in May 2013 suggests that this alert tool might have complementary skill with respect to radar-based thunderstorm nowcasting systems for storms which do not show a clear convective signature.
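The step from a fitted GEV to a sub-annual return-period threshold can be sketched as follows. The parameter values and blocks-per-year count here are invented for illustration, not taken from the Swiss analysis: with n maxima per year, the level for return period T years is the quantile exceeded with probability 1/(n·T) per block.

```python
from scipy.stats import genextreme

c, loc, scale = 0.1, 30.0, 8.0   # assumed fitted GEV parameters (mm)
blocks_per_year = 6              # e.g. warm-season monthly maxima

def return_level(T_years):
    # Quantile exceeded on average once every T years.
    p_exceed = 1.0 / (blocks_per_year * T_years)
    return genextreme.isf(p_exceed, c, loc=loc, scale=scale)

for T in (0.5, 1, 2):            # sub-annual and annual return periods
    print(f"{T:>4}-yr return level: {return_level(T):.1f} mm")
```

The alert system then compares past-plus-forecast accumulations against these levels, one threshold per region, duration and return period.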

  14. Estimation of muscle torque in various combat sports.

    PubMed

    Pędzich, Wioletta; Mastalerz, Andrzej; Sadowski, Jerzy

    2012-01-01

The purpose of the research was to compare the muscle torque of elite combat-sport groups. Twelve taekwondo WTF athletes, twelve taekwondo ITF athletes and nine boxers participated in the study. Measurements of muscle torque were made under static conditions on a special stand belonging to the Department of Biomechanics. The sum of relative muscle torque values of the right and left lower extremities was significantly higher for taekwondo WTF athletes than for boxers (16%, p < 0.001 for the right and 10%, p < 0.05 for the left extremity) and taekwondo ITF athletes (10%, p < 0.05 for the right and 8% for the left extremity). Taekwondo ITF athletes attained significantly higher absolute muscle torque values than boxers for the elbow flexors (20%, p < 0.05 for the right and 11% for the left extremity) and extensors (14% for the right and 18%, p < 0.05 for the left extremity), and for the shoulder flexors (10% for the right and 12%, p < 0.05 for the left extremity) and extensors (11% for the right and 1% for the left extremity). Taekwondo WTF and taekwondo ITF athletes obtained significantly different relative muscle torque values for the hip flexors (16%, p < 0.05) and extensors (11%, p < 0.05) of the right extremity.

  15. Design flow factors for sewerage systems in small arid communities.

    PubMed

    Imam, Emad H; Elnakar, Haitham Y

    2014-09-01

Reliable estimation of sewage flow rates is essential for the proper design of sewers, pumping stations, and treatment plants. The design of the various components of the sewerage system should be based on the most critical flow rates, with a focus on extremely low and peak flow rates that would be sustained for a duration related to the acceptable limits of behavior of the components under consideration. The extreme flow conditions, and the extent to which they differ from the average values, are closely related to the size of the community or network and to the socioeconomic conditions. A single pumping station is usually sufficient to pump flow from a small community in flat or non-undulating topography. Therefore, the hydraulic loading on the wastewater treatment plant (WWTP) results from the pumped flow from the pumping station rather than from the trunk sewer flow. The intermittent operation of the pumping units further accentuates the sewage hydrograph in the final trunk sewer. Accordingly, the design flow for the various components of the WWTP should be determined based on their relevant flow factors. In this study, an analysis of one representative small community out of five monitored small communities in Egypt and the Kingdom of Saudi Arabia is presented. Pumped sewage flow rates were measured and the sewer incoming flows were hydraulically derived. The hourly and daily sewer and pumped flow records were analyzed to derive the relationship between the flow factors that would be sustained for various durations (instantaneously, 1 h, 2 h, etc.) and their probability of non-exceedance. The resulting peaking factors, with consideration for their sustained flow duration and specified probability, permit the design of the various components of the treatment plant using more accurate critical flows.

  16. Design flow factors for sewerage systems in small arid communities

    PubMed Central

    Imam, Emad H.; Elnakar, Haitham Y.

    2013-01-01

Reliable estimation of sewage flow rates is essential for the proper design of sewers, pumping stations, and treatment plants. The design of the various components of the sewerage system should be based on the most critical flow rates, with a focus on extremely low and peak flow rates that would be sustained for a duration related to the acceptable limits of behavior of the components under consideration. The extreme flow conditions, and the extent to which they differ from the average values, are closely related to the size of the community or network and to the socioeconomic conditions. A single pumping station is usually sufficient to pump flow from a small community in flat or non-undulating topography. Therefore, the hydraulic loading on the wastewater treatment plant (WWTP) results from the pumped flow from the pumping station rather than from the trunk sewer flow. The intermittent operation of the pumping units further accentuates the sewage hydrograph in the final trunk sewer. Accordingly, the design flow for the various components of the WWTP should be determined based on their relevant flow factors. In this study, an analysis of one representative small community out of five monitored small communities in Egypt and the Kingdom of Saudi Arabia is presented. Pumped sewage flow rates were measured and the sewer incoming flows were hydraulically derived. The hourly and daily sewer and pumped flow records were analyzed to derive the relationship between the flow factors that would be sustained for various durations (instantaneously, 1 h, 2 h, etc.) and their probability of non-exceedance. The resulting peaking factors, with consideration for their sustained flow duration and specified probability, permit the design of the various components of the treatment plant using more accurate critical flows. PMID:25685521
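The sustained-flow-factor idea in the two records above can be sketched on a synthetic hourly record (our own construction, not the Egyptian/Saudi data): the peak factor for duration d is the largest d-hour moving average of flow divided by the overall mean, and longer sustained durations can only lower the factor.

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic hourly flow: diurnal cycle plus noise (illustrative only).
hours = np.arange(24 * 365)
hourly_flow = 100.0 + 30.0 * np.sin(2 * np.pi * hours / 24) \
              + rng.normal(0, 10, hours.size)

def peak_factor(flow, d_hours):
    kernel = np.ones(d_hours) / d_hours
    sustained = np.convolve(flow, kernel, mode="valid")  # d-hour moving mean
    return sustained.max() / flow.mean()

factors = {d: peak_factor(hourly_flow, d) for d in (1, 2, 6, 24)}
print(factors)  # peak factor shrinks as the sustained duration grows
```

Ranking such factors over many years and reading off empirical non-exceedance probabilities gives the duration-dependent design factors the study derives.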

  17. Simulation of climate characteristics and extremes of the Volta Basin using CCLM and RCA regional climate models

    NASA Astrophysics Data System (ADS)

    Darko, Deborah; Adjei, Kwaku A.; Appiah-Adjei, Emmanuel K.; Odai, Samuel N.; Obuobie, Emmanuel; Asmah, Ruby

    2018-06-01

The extent to which statistically bias-adjusted outputs of two regional climate models alter the projected change signals for mean (and extreme) rainfall and temperature over the Volta Basin is evaluated. The outputs from two regional climate models in the Coordinated Regional Climate Downscaling Experiment for Africa (CORDEX-Africa) are bias adjusted using the quantile mapping technique. Annual maximum rainfall and temperature, with their 10- and 20-year return values for the present (1981-2010) and future (2051-2080) climates, are estimated using extreme value analyses. Moderate extremes are evaluated using extreme indices (viz. percentile-based, duration-based, and intensity-based). Bias adjustment of the original (bias-unadjusted) models improves the reproduction of mean rainfall and temperature for the present climate. However, the bias-adjusted models poorly reproduce the 10- and 20-year return values for rainfall and maximum temperature, whereas the extreme indices are reproduced satisfactorily for the present climate. Consequently, the projected changes in rainfall and temperature extremes were weak. The bias adjustment reduces the change signals for mean rainfall, while the mean temperature signals are rather magnified. The projected changes for the original mean climate and extremes are not conserved after bias adjustment, with the exception of the duration-based extreme indices.
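Empirical quantile mapping, one common form of the bias-adjustment technique named above, can be sketched as follows. The synthetic "observed" and "model" series are our own stand-ins, not CORDEX output: each model value is mapped through the model-quantile to observed-quantile correspondence learned on a calibration period.

```python
import numpy as np

rng = np.random.default_rng(3)
obs = rng.gamma(shape=2.0, scale=5.0, size=3000)                # "observations"
model = 1.3 * rng.gamma(shape=2.0, scale=5.0, size=3000) + 2.0  # biased model

# Learn the quantile correspondence and map model values through it.
q = np.linspace(0.005, 0.995, 199)
model_q, obs_q = np.quantile(model, q), np.quantile(obs, q)
adjusted = np.interp(model, model_q, obs_q)   # bias-adjusted model series

print(f"obs mean {obs.mean():.1f}, raw model {model.mean():.1f}, "
      f"adjusted {adjusted.mean():.1f}")
```

Because the mapping reshapes the whole distribution, it also reshapes the tails, which is why (as the record notes) bias adjustment can alter projected change signals for extremes rather than leave them conserved.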

  18. Projections of extreme storm surge levels along Europe

    NASA Astrophysics Data System (ADS)

    Vousdoukas, Michalis I.; Voukouvalas, Evangelos; Annunziato, Alessandro; Giardino, Alessio; Feyen, Luc

    2016-11-01

Storm surges are an important coastal hazard component and it is unknown how they will evolve along Europe's coastline in view of climate change. In the present contribution, the hydrodynamic model Delft3D-Flow was forced by surface wind and atmospheric pressure fields from an 8-member climate model ensemble in order to evaluate the dynamics of storm surge levels (SSL) along the European coastline (1) for the baseline period 1970-2000 and (2) during this century under the Representative Concentration Pathways RCP4.5 and RCP8.5. Validation simulations, spanning from 2008 to 2014 and driven by ERA-Interim atmospheric forcing, indicated good predictive skill (0.06 m < RMSE < 0.29 m and 10 % < RMSE < 29 % for 110 tidal gauge stations across Europe). Peak-over-threshold extreme value analysis was applied to estimate SSL values for different return periods, and changes in future SSL were obtained from all models to form the final ensemble. Values for most scenarios and return periods indicate a projected increase in SSL at several locations along the northern European coastline, which is more prominent for RCP8.5 and shows an increasing tendency towards the end of the century for both RCP4.5 and RCP8.5. Projected SSL changes along the European coastal areas south of 50°N show minimal change or even a small decrease, with the exception of RCP8.5, under which a moderate increase is projected towards the end of the century. The present findings indicate that the anticipated increase in extreme total water levels due to relative sea level rise (RSLR) can be further reinforced by an increase in the extreme SSL, which can exceed 30 % of the RSLR, especially for the high return periods and pathway RCP8.5. This implies that the combined effect could further increase the anticipated impacts of climate change for certain European areas, and highlights the necessity for timely coastal adaptation and protection measures.
The dataset is publicly available under this link: http://data.jrc.ec.europa.eu/collection/LISCOAST.
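The peak-over-threshold analysis named above can be sketched on a synthetic surge series (all numbers are illustrative assumptions, not the paper's ensemble output): fit a generalized Pareto distribution to exceedances over a high threshold, then invert it for return levels.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(4)
years = 30
ssl = rng.gumbel(loc=0.3, scale=0.15, size=years * 365)  # daily max SSL (m)

u = np.quantile(ssl, 0.98)                  # high threshold
excesses = ssl[ssl > u] - u
c_hat, _, scale_hat = genpareto.fit(excesses, floc=0)
rate = excesses.size / years                # exceedances per year

def return_level(T_years):
    # Level exceeded on average once every T years.
    return u + genpareto.isf(1.0 / (rate * T_years), c_hat, scale=scale_hat)

for T in (10, 50, 100):
    print(f"{T:>3}-yr SSL: {return_level(T):.2f} m")
```

Applying the same fit to baseline and future model output, and differencing the return levels, gives the projected SSL changes reported in the record.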

  19. Oxygen isotope compositions of selected laramide-tertiary granitoid stocks in the Colorado Mineral Belt and their bearing on the origin of climax-type granite-molybdenum systems

    USGS Publications Warehouse

    Hannah, J.L.; Stein, H.J.

    1986-01-01

Quartz phenocrysts from 31 granitoid stocks in the Colorado Mineral Belt yield δ18O values less than 10.4‰, with most values between 9.3 and 10.4‰. An average magmatic value of about 8.5‰ is suggested. The stocks resemble A-type granites; these data support magma genesis by partial melting of previously depleted, fluorine-enriched, lower crustal granulites, followed by extreme differentiation and volatile evolution in the upper crust. Subsolidus interaction of isotopically light water with the stocks has reduced most feldspar and whole-rock δ18O values. Unaltered samples from Climax-type molybdenum-bearing granites, however, show no greater isotopic disturbance than samples from unmineralized stocks. Although meteoric water certainly played a role in post-mineralization alteration, particularly in feldspars, it is not required during high-temperature mineralization processes. We suggest that slightly low δ18O values in some vein and replacement minerals associated with molybdenum mineralization may have resulted from equilibration with isotopically light magmatic water and/or heavy-isotope depletion of the ore fluid by precipitation of earlier phases. Accumulation of sufficient quantities of isotopically light magmatic water to produce the measured depletions of 18O requires extreme chemical stratification in a large magma reservoir. Upward migration of a highly fractionated, volatile-rich magma into a small apical Climax-type diapir, including large-scale transport of silica, alkalis, molybdenum, and other vapor-soluble elements, may occur with depression of the solidus temperature and reduction of magma viscosity by fluorine. Climax-type granites may provide examples of 18O depletion in magmatic systems without meteoric water influx. © 1986 Springer-Verlag.

  20. Local instability driving extreme events in a pair of coupled chaotic electronic circuits

    NASA Astrophysics Data System (ADS)

    de Oliveira, Gilson F.; Di Lorenzo, Orlando; de Silans, Thierry Passerat; Chevrollier, Martine; Oriá, Marcos; Cavalcante, Hugo L. D. de Souza

    2016-06-01

For a long time, extreme events happening in complex systems, such as financial markets, earthquakes, and neurological networks, were thought to follow power-law size distributions. More recently, evidence suggests that in many systems the largest and rarest events differ from the others: they are dragon kings, outliers that make the distribution deviate from a power law in the tail. Understanding how extreme events form, and which circumstances lead to dragon kings or to a power-law distribution, is an open question, and a very important one for assessing whether extreme events will occur too often in a specific system. In the particular system studied in this paper, we show that the rate of occurrence of dragon kings is controlled by the value of a parameter. The system under study is composed of two nearly identical chaotic oscillators which fail to remain in a permanently synchronized state when coupled. We analyze the statistics of the desynchronization events in this specific example of two coupled chaotic electronic circuits and find that modifying a parameter associated with the local instability responsible for the loss of synchronization reduces the occurrence of dragon kings, while preserving the power-law distribution of small- to intermediate-size events with the same scaling exponent. Our results support the hypothesis that the dragon kings are caused by local instabilities in the phase space.
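The distinction between a power-law tail and dragon-king outliers can be illustrated with a toy construction of our own (not the authors' circuit data): draw event sizes from a power law, inject oversized outliers, estimate the tail exponent by maximum likelihood, and flag events far beyond the largest size the power law makes plausible.

```python
import numpy as np

rng = np.random.default_rng(5)
x_min, alpha_true, n = 1.0, 3.5, 5000

# Classical Pareto with pdf exponent alpha: (pareto(alpha-1) + 1) * x_min.
events = (rng.pareto(alpha_true - 1.0, size=n) + 1.0) * x_min
dragon_kings = np.array([300.0, 500.0])      # injected outliers
sample = np.concatenate([events, dragon_kings])

# Clauset-style MLE for a continuous power law with known x_min.
alpha_hat = 1.0 + sample.size / np.log(sample / x_min).sum()

# Under a pure power law the largest of n events is roughly
# x_min * n**(1/(alpha-1)); sizes far beyond that are dragon-king suspects.
typical_max = x_min * sample.size ** (1.0 / (alpha_hat - 1.0))
suspects = sample[sample > 3.0 * typical_max]
print(f"alpha ~ {alpha_hat:.2f}, {suspects.size} dragon-king suspect(s)")
```

In the paper's language, tuning the instability parameter changes how many such suspects appear while leaving the bulk exponent essentially unchanged.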

  1. Extreme Programming: Maestro Style

    NASA Technical Reports Server (NTRS)

    Norris, Jeffrey; Fox, Jason; Rabe, Kenneth; Shu, I-Hsiang; Powell, Mark

    2009-01-01

    "Extreme Programming: Maestro Style" is the name of a computer programming methodology that has evolved as a custom version of a methodology, called extreme programming that has been practiced in the software industry since the late 1990s. The name of this version reflects its origin in the work of the Maestro team at NASA's Jet Propulsion Laboratory that develops software for Mars exploration missions. Extreme programming is oriented toward agile development of software resting on values of simplicity, communication, testing, and aggressiveness. Extreme programming involves use of methods of rapidly building and disseminating institutional knowledge among members of a computer-programming team to give all the members a shared view that matches the view of the customers for whom the software system is to be developed. Extreme programming includes frequent planning by programmers in collaboration with customers, continually examining and rewriting code in striving for the simplest workable software designs, a system metaphor (basically, an abstraction of the system that provides easy-to-remember software-naming conventions and insight into the architecture of the system), programmers working in pairs, adherence to a set of coding standards, collaboration of customers and programmers, frequent verbal communication, frequent releases of software in small increments of development, repeated testing of the developmental software by both programmers and customers, and continuous interaction between the team and the customers. The environment in which the Maestro team works requires the team to quickly adapt to changing needs of its customers. In addition, the team cannot afford to accept unnecessary development risk. Extreme programming enables the Maestro team to remain agile and provide high-quality software and service to its customers. 
However, several factors in the Maestro environment have made it necessary to modify some of the conventional extreme-programming practices. The single most influential of these factors is that continuous interaction between customers and programmers is not feasible.

  2. 76 FR 29251 - Guidance for Industry and Food and Drug Administration Staff; Class II Special Controls; Guidance...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-20

    ...: Topical Oxygen Chamber for Extremities; Availability; Correction AGENCY: Food and Drug Administration, HHS... Special Controls Guidance Documents: Topical Oxygen Chamber for Extremities.'' The document published... Oxygen Chamber for Extremities'' to the Division of Small Manufacturers, International, and Consumer...

  3. Stress Transfer Mechanisms at the Submicron Level for Graphene/Polymer Systems

    PubMed Central

    2015-01-01

The stress transfer mechanism from a polymer substrate to a nanoinclusion, such as a graphene flake, is of extreme interest for the production of effective nanocomposites. Previous work conducted mainly at the micron scale has shown that the intrinsic mechanism of stress transfer is shear at the interface. However, since the interfacial shear takes its maximum value at the very edge of the nanoinclusion, it is important to assess the effect of edge integrity upon axial stress transfer at the submicron scale. Here, we conduct detailed Raman line mapping near the edges of a monolayer graphene flake that is simply supported on an epoxy-based photoresist (SU8)/poly(methyl methacrylate) matrix, at steps as small as 100 nm. We show for the first time that the distribution of axial strain (stress) along the flake deviates somewhat from the classical shear-lag prediction for a region of ∼2 μm from the edge. This behavior is mainly attributed to the presence of residual stresses, unintentional doping, and/or edge effects (deviation from the equilibrium values of bond lengths and angles, as well as different edge chiralities). By considering a simple balance of shear-to-normal stresses at the interface, we are able to directly convert the strain (stress) gradient to values of interfacial shear stress for all the applied tensile levels without assuming classical shear-lag behavior. For large flakes a maximum interfacial shear stress of 0.4 MPa is obtained prior to flake slipping. PMID:25644121

  4. Monitoring Cellular Events in Living Mast Cells Stimulated with an Extremely Small Amount of Fluid on a Microchip

    NASA Astrophysics Data System (ADS)

    Munaka, Tatsuya; Abe, Hirohisa; Kanai, Masaki; Sakamoto, Takashi; Nakanishi, Hiroaki; Yamaoka, Tetsuji; Shoji, Shuichi; Murakami, Akira

    2006-07-01

We successfully developed a measurement system for real-time analysis of cellular function using a newly designed microchip. The microchip was equipped with a micro cell incubation chamber (240 nl) in which cells could be stimulated with a very small amount of stimulus (as little as 24 nl). Using the microchip system, cultivation of mast cells was successfully carried out, and the cellular events following stimulation with an extremely small amount of fluid were monitored on the chip. This system could be applicable to various types of cellular analysis, including real-time monitoring of cellular responses to stimulation.

  5. Scale dependency of regional climate modeling of current and future climate extremes in Germany

    NASA Astrophysics Data System (ADS)

    Tölle, Merja H.; Schefczyk, Lukas; Gutjahr, Oliver

    2017-11-01

A warmer climate is projected for mid-Europe, with less precipitation in summer, but with intensified extremes of precipitation and near-surface temperature. However, the extent and magnitude of such changes are subject to considerable uncertainty because of the limitations of model resolution and parameterizations. Here, we present the results of convection-permitting regional climate model simulations for Germany integrated with the COSMO-CLM using a horizontal grid spacing of 1.3 km, and additional 4.5- and 7-km simulations with convection parameterized. Of particular interest is how the temperature and precipitation fields and their extremes depend on the horizontal resolution for current and future climate conditions. The spatial variability of precipitation increases with resolution because of more realistic orography and physical parameterizations, but values are overestimated in summer and over mountain ridges in all simulations compared to observations. The spatial variability of temperature is improved at a resolution of 1.3 km, but the results are cold-biased, especially in summer. The increase in resolution from 7/4.5 km to 1.3 km is accompanied by about 1 °C less future warming in summer. Modeled future precipitation extremes will be more severe, and temperature extremes will not exclusively increase with higher resolution. Although the differences between the resolutions considered (7/4.5 km and 1.3 km) are small, we find that the differences in the changes in extremes are large. High-resolution simulations require further studies, with effective parameterizations and tunings for different topographic regions. Impact models and assessment studies may benefit from such high-resolution model results, but should account for the impact of model resolution on model processes and climate change.

  6. Statistical Methods for Quantifying the Variability of Solar Wind Transients of All Sizes

    NASA Astrophysics Data System (ADS)

    Tindale, E.; Chapman, S. C.

    2016-12-01

The solar wind is inherently variable across a wide range of timescales, from small-scale turbulent fluctuations to the 11-year periodicity induced by the solar cycle. Each solar cycle is unique, and this change in overall cycle activity is coupled from the Sun to Earth via the solar wind, leading to long-term trends in space weather. Our work [Tindale & Chapman, 2016] applies novel statistical methods to solar wind transients of all sizes, to quantify the variability of the solar wind associated with the solar cycle. We use the same methods to link solar wind observations with those on the Sun and Earth. We use Wind data to construct quantile-quantile (QQ) plots comparing the statistical distributions of multiple commonly used solar wind-magnetosphere coupling parameters between the minima and maxima of solar cycles 23 and 24. We find that in each case the distribution is multicomponent, ranging from small fluctuations to extreme values, with the same functional form at all phases of the solar cycle. The change in PDF is captured by a simple change of variables, which is independent of the PDF model. Using this method we can quantify the quietness of the cycle 24 maximum, identify which variable drives the changing distribution of composite parameters such as ɛ, and we show that the distribution of ɛ is less sensitive to changes in its extreme values than that of its constituents. After demonstrating the QQ method on solar wind data, we extend the analysis to include solar and magnetospheric data spanning the same time period. We focus on GOES X-ray flux and WDC AE index data. Finally, having studied the statistics of transients across the full distribution, we apply the same method to time series of extreme bursts in each variable. Using these statistical tools, we aim to track the solar cycle-driven variability from the Sun through the solar wind and into the Earth's magnetosphere. Tindale, E. and S.C. Chapman (2016), Geophys. Res. Lett., 43(11), doi: 10.1002/2016GL068920.
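The quantile-quantile comparison at the heart of this record can be sketched with synthetic stand-ins for the solar-minimum and solar-maximum samples (our own construction): if the two distributions share a functional form, their QQ plot is close to a straight line, and the fitted slope and intercept give the change of variables between the phases.

```python
import numpy as np

rng = np.random.default_rng(6)
quiet = rng.lognormal(mean=0.0, sigma=1.0, size=5000)         # "solar minimum"
active = 1.8 * rng.lognormal(mean=0.0, sigma=1.0, size=5000)  # "solar maximum"

# Quantile pairs form the QQ plot; a linear fit recovers the mapping.
q = np.linspace(0.01, 0.99, 99)
qq_x, qq_y = np.quantile(quiet, q), np.quantile(active, q)
slope, intercept = np.polyfit(qq_x, qq_y, 1)
print(f"QQ fit: active ~ {slope:.2f} * quiet + {intercept:.2f}")
```

A slope above one with a near-zero intercept would indicate a simple multiplicative amplification between phases, which is the kind of model-free change of variables the abstract describes.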

  7. Causes of Glacier Melt Extremes in the Alps Since 1949

    NASA Astrophysics Data System (ADS)

    Thibert, E.; Dkengne Sielenou, P.; Vionnet, V.; Eckert, N.; Vincent, C.

    2018-01-01

    Recent record-breaking glacier melt values are attributable to peculiar extreme events and long-term warming trends that shift averages upward. Analyzing one of the world's longest mass balance series with extreme value statistics, we show that detrending melt anomalies makes it possible to disentangle these effects, leading to a fairer evaluation of the return period of melt extreme values such as 2003, and to characterize them by a more realistic bounded behavior. Using surface energy balance simulations, we show that three independent drivers control melt: global radiation, latent heat, and the amount of snow at the beginning of the melting season. Extremes are governed by large deviations in global radiation combined with sensible heat. Long-term trends are driven by the lengthening of melt duration due to earlier and longer-lasting melting of ice along with melt intensification caused by trends in long-wave irradiance and latent heat due to higher air moisture.

  8. Correlation dimension and phase space contraction via extreme value theory

    NASA Astrophysics Data System (ADS)

    Faranda, Davide; Vaienti, Sandro

    2018-04-01

We show how to obtain theoretical and numerical estimates of the correlation dimension and phase space contraction by using extreme value theory. The maxima of suitable observables sampled along the trajectory of a chaotic dynamical system converge asymptotically to classical extreme value laws where: (i) the inverse of the scale parameter gives the correlation dimension and (ii) the extremal index is associated with the rate of phase space contraction for backward iteration, which, in dimensions 1 and 2, is closely related to the positive Lyapunov exponent, and in higher dimensions is related to the metric entropy. We call this quantity the Dynamical Extremal Index. Numerical estimates are straightforward to obtain, as they involve just a simple fit to a univariate distribution. Numerical tests range from low-dimensional maps to generalized Hénon maps and climate data. The estimates of the indicators are particularly robust even with relatively short time series.
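The dimension-from-scale-parameter idea can be checked on a toy set whose answer is known; this sketch is our own illustration, not the authors' numerical tests. For points uniform in the unit square, P(dist < r) ~ r², so the observable g = -log(distance to a reference point) has an exponential upper tail with rate 2, and the mean excess over a high threshold estimates the inverse of the dimension.

```python
import numpy as np

rng = np.random.default_rng(7)
pts = rng.uniform(0, 1, size=(200_000, 2))   # i.i.d. points, dimension 2
ref = np.array([0.5, 0.5])                   # reference point (interior)

g = -np.log(np.linalg.norm(pts - ref, axis=1))
u = np.quantile(g, 0.99)                     # high threshold
mean_excess = (g[g > u] - u).mean()          # ~ scale of the extreme-value law
dim_estimate = 1.0 / mean_excess
print(f"estimated dimension: {dim_estimate:.2f} (true value 2)")
```

On a chaotic attractor the same fit, applied along the trajectory instead of to i.i.d. points, returns the correlation dimension rather than the ambient dimension.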

  9. Extreme events in total ozone: Spatio-temporal analysis from local to global scale

    NASA Astrophysics Data System (ADS)

    Rieder, Harald E.; Staehelin, Johannes; Maeder, Jörg A.; Ribatet, Mathieu; di Rocco, Stefania; Jancso, Leonhardt M.; Peter, Thomas; Davison, Anthony C.

    2010-05-01

    Recently, tools from extreme value theory (e.g. Coles, 2001; Ribatet, 2007) have been applied for the first time in the field of stratospheric ozone research, as statistical analysis showed that previously used concepts assuming a Gaussian distribution of total ozone data (e.g. fixed deviations from mean values) do not adequately address the internal data structure concerning extremes (Rieder et al., 2010a,b). A case study of the world's longest total ozone record (Arosa, Switzerland; for details see Staehelin et al., 1998a,b) illustrates that tools based on extreme value theory are appropriate for identifying ozone extremes and describing the tails of the total ozone record. Excursions in the frequency of extreme events reveal "fingerprints" of dynamical factors such as ENSO or NAO, of chemical factors such as cold Arctic vortex ozone losses, and of major volcanic eruptions of the 20th century (e.g. Gunung Agung, El Chichón, Mt. Pinatubo). Furthermore, the atmospheric loading of ozone-depleting substances led to a continuous modification of column ozone in the northern hemisphere, also with respect to extreme values (partly again in connection with polar vortex contributions). It is shown that the application of extreme value theory allows the identification of many more such fingerprints than conventional time series analysis of annual and seasonal mean values. In particular, the extremal analysis shows the strong influence of dynamics, revealing that even moderate ENSO and NAO events have a discernible effect on total ozone (Rieder et al., 2010b). Overall, the extremes concept provides new information on time series properties, variability, trends, and the influence of dynamics and chemistry, complementing earlier analyses focusing only on monthly (or annual) mean values.
    The findings described above could be confirmed for the total ozone records of five other long-term series (Belsk, Hohenpeissenberg, Hradec Kralove, Potsdam, Uccle), showing that the strong influence of atmospheric dynamics (NAO, ENSO) on total ozone is a common feature of the northern mid-latitudes (Rieder et al., 2010c). In a next step, frequency distributions of extreme events are analyzed on a global scale (northern and southern mid-latitudes). A specific focus here is whether findings gained through the analysis of long-term European ground-based stations can be clearly identified as a global phenomenon. By presenting results from these three types of studies, an overview of extreme events in total ozone (and of the dynamical and chemical features leading to them) is given from local to global scales. References: Coles, S.: An Introduction to Statistical Modeling of Extreme Values, Springer Series in Statistics, ISBN:1852334592, Springer, Berlin, 2001. Ribatet, M.: POT: Modelling peaks over a threshold, R News, 7, 34-36, 2007. Rieder, H.E., Staehelin, J., Maeder, J.A., Ribatet, M., Stübi, R., Weihs, P., Holawe, F., Peter, T., and Davison, A.C. (2010): Extreme events in total ozone over Arosa - Part I: Application of extreme value theory, to be submitted to ACPD. Rieder, H.E., Staehelin, J., Maeder, J.A., Ribatet, M., Stübi, R., Weihs, P., Holawe, F., Peter, T., and Davison, A.C. (2010): Extreme events in total ozone over Arosa - Part II: Fingerprints of atmospheric dynamics and chemistry and effects on mean values and long-term changes, to be submitted to ACPD. Rieder, H.E., Jancso, L., Staehelin, J., Maeder, J.A., Ribatet, M., Peter, T., and Davison, A.C. (2010): Extreme events in total ozone over the northern mid-latitudes: A case study based on long-term data sets from 5 ground-based stations, in preparation. 
Staehelin, J., Renaud, A., Bader, J., McPeters, R., Viatte, P., Hoegger, B., Bugnion, V., Giroud, M., and Schill, H.: Total ozone series at Arosa (Switzerland): Homogenization and data comparison, J. Geophys. Res., 103(D5), 5827-5842, doi:10.1029/97JD02402, 1998a. Staehelin, J., Kegel, R., and Harris, N. R.: Trend analysis of the homogenized total ozone series of Arosa (Switzerland), 1929-1996, J. Geophys. Res., 103(D7), 8389-8400, doi:10.1029/97JD03650, 1998b.

  10. Mathematical aspects of assessing extreme events for the safety of nuclear plants

    NASA Astrophysics Data System (ADS)

    Potempski, Slawomir; Borysiewicz, Mieczyslaw

    2015-04-01

    This paper reviews the mathematical methodologies applied for assessing the low frequencies of rare natural events such as earthquakes, tsunamis, hurricanes, tornadoes, floods (in particular flash floods and storm surges), lightning, and solar flares, from the perspective of the safety assessment of nuclear plants. The statistical methods are usually based on extreme value theory, which deals with the analysis of extreme deviations from the median (or the mean). In this respect, various mathematical tools can be useful: the Fisher-Tippett-Gnedenko extreme value theorem, leading to possible choices of generalized extreme value distributions; the Pickands-Balkema-de Haan theorem for tail fitting; and methods related to large deviation theory. The most important stochastic distributions relevant for the statistical analysis of rare events are presented. This concerns, for example, the analysis of data consisting of annual extreme values (maxima, the "Annual Maxima Series", or minima), of peak values exceeding given thresholds over some period of interest ("Peak Over Threshold"), and the estimation of the size of exceedances. Despite the lack of sufficient statistical data directly containing rare events, in some cases it is still possible to extract useful information from existing larger data sets; examples include data sets available from web sites for floods, earthquakes, and natural hazards in general. Some aspects of such data sets are also presented, taking into account their usefulness for the practical assessment of the risk to nuclear power plants from extreme weather conditions.
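
The two sampling schemes named in this abstract, the "Annual Maxima Series" and "Peak Over Threshold", can be illustrated with a small standard-library Python sketch. The synthetic daily series, the threshold, and the method-of-moments Gumbel fit below are illustrative assumptions, not the paper's methodology.

```python
import math
import random
import statistics

# Synthetic "daily hazard intensity" series: 50 years x 365 days of
# exponential variates (an arbitrary stand-in for a real record).
random.seed(42)
years, days = 50, 365
daily = [[random.expovariate(1.0) for _ in range(days)] for _ in range(years)]

# Annual Maxima Series: one value per block (year).
ams = [max(year) for year in daily]

# Peak Over Threshold: exceedances of a high threshold across all days.
threshold = 5.0
pot = [x - threshold for year in daily for x in year if x > threshold]

# Method-of-moments Gumbel fit to the annual maxima, using
#   sd = beta * pi / sqrt(6)  and  mean = mu + gamma * beta.
beta = statistics.stdev(ams) * math.sqrt(6) / math.pi
mu = statistics.mean(ams) - 0.5772156649 * beta

def return_level(T):
    """T-year return level from the fitted Gumbel distribution."""
    return mu - beta * math.log(-math.log(1.0 - 1.0 / T))
```

The AMS discards all but one event per year, while the POT sample keeps every large event; this trade-off is exactly what motivates the exceedance-based methods the review covers.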

  11. Extremism without extremists: Deffuant model with emotions

    NASA Astrophysics Data System (ADS)

    Sobkowicz, Pawel

    2015-03-01

    The frequent occurrence of extremist views in many social contexts, often growing from small minorities to an almost total majority, poses a significant challenge for democratic societies. The phenomenon can be described within the sociophysical paradigm. We present a modified version of the continuous bounded confidence opinion model, including a simple description of the influence of emotions on tolerances, and eventually on the evolution of opinions. Allowing for a psychologically based correlation between extreme opinions, high emotions, and low tolerance for other people's views leads to a quick dominance of the extreme views within the studied model, without introducing a special class of agents, as has been done in previous works. This dominance occurs even if the initial number of people with extreme opinions is very small. Possible approaches to mitigating the process are briefly discussed.

  12. A comparative assessment of statistical methods for extreme weather analysis

    NASA Astrophysics Data System (ADS)

    Schlögl, Matthias; Laaha, Gregor

    2017-04-01

    Extreme weather exposure assessment is of major importance for scientists and practitioners alike. We compare different extreme value approaches and fitting methods with respect to their value for assessing extreme precipitation and temperature impacts. Based on an Austrian data set from 25 meteorological stations representing diverse meteorological conditions, we assess the added value of partial duration series over the standardly used annual maxima series in order to give recommendations for performing extreme value statistics of meteorological hazards. Results show the merits of robust L-moment estimation, which yielded better results than maximum likelihood estimation in 62% of all cases. At the same time, the results question the general assumption that the threshold excess approach (employing partial duration series, PDS) is superior to the block maxima approach (employing annual maxima series, AMS) due to information gain. For low return periods (non-extreme events) the PDS approach tends to overestimate return levels compared to the AMS approach, whereas the opposite behavior was found for high return levels (extreme events). In extreme cases, an inappropriate threshold was shown to lead to considerable biases that may far outweigh the possible gain of information from including additional extreme events. This effect was visible neither from the square-root criterion nor from standardly used graphical diagnostics (mean residual life plot), but only from a direct comparison of AMS and PDS in synoptic quantile plots. We therefore recommend performing the AMS and PDS approaches simultaneously in order to select the best-suited approach. This will make the analyses more robust, both in cases where threshold selection and dependency introduce biases to the PDS approach and in cases where the AMS contains non-extreme events that may introduce similar biases. 
    For assessing the performance of extreme events we recommend conditional performance measures that focus on rare events only, in addition to standardly used unconditional indicators. The findings of this study are relevant for a broad range of environmental variables, including meteorological and hydrological quantities.
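
The L-moment estimation favored by this study can be sketched for the simplest case, a Gumbel fit to annual maxima, in standard-library Python. The probability-weighted-moment formulas below are the standard ones; the synthetic sample and the parameter values are assumptions for illustration only, not data from the study.

```python
import math
import random

# L-moment fit of a Gumbel distribution. The first sample L-moment equals
# the mean; the second follows from probability-weighted moments via
# lambda2 = 2*b1 - b0. For the Gumbel family,
#   lambda2 = beta * ln 2   and   lambda1 = mu + gamma * beta.

def lmoment_gumbel_fit(sample):
    x = sorted(sample)
    n = len(x)
    b0 = sum(x) / n
    b1 = sum((i / (n - 1)) * x[i] for i in range(n)) / n
    lam1, lam2 = b0, 2.0 * b1 - b0
    beta = lam2 / math.log(2.0)
    mu = lam1 - 0.5772156649 * beta   # Euler-Mascheroni constant
    return mu, beta

# Draw Gumbel(mu=10, beta=2) variates by inversion and recover the parameters.
random.seed(7)
sample = [10.0 - 2.0 * math.log(-math.log(random.random())) for _ in range(5000)]
mu_hat, beta_hat = lmoment_gumbel_fit(sample)
```

Because L-moments are linear in the order statistics, single outliers move the fit far less than they move ordinary moments, which is one plausible reading of why the study finds L-moment estimation more robust than maximum likelihood.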

  13. The rate of planet formation and the solar system's small bodies

    NASA Technical Reports Server (NTRS)

    Safronov, Viktor S.

    1991-01-01

    The evolution of random velocities and the mass distribution of preplanetary bodies at the early stage of accumulation are reviewed. Arguments are presented for and against the view of an extremely rapid, runaway growth of the largest bodies at this stage, with parameter values of Theta greater than about 10^3. Two difficulties are encountered assuming such a large Theta: (1) bodies of the Jovian zone penetrate the asteroid zone too late and do not have time to hinder the formation of a normal-sized planet in the asteroidal zone and thereby remove a significant portion of the mass of solid matter, and (2) Uranus and Neptune cannot eject bodies from the solar system into the cometary cloud. Therefore, values of Theta less than 10^2 appear preferable.

  14. Behavior of the Position-Spread Tensor in Diatomic Systems.

    PubMed

    Brea, Oriana; El Khatib, Muammar; Angeli, Celestino; Bendazzoli, Gian Luigi; Evangelisti, Stefano; Leininger, Thierry

    2013-12-10

    The behavior of the Position-Spread Tensor (Λ) in a series of light diatomic molecules (either neutral or negative ions) is investigated at a Full Configuration Interaction level. This tensor, which is the second moment cumulant of the total position operator, is invariant with respect to molecular translations, while its trace is also rotationally invariant. Moreover, the tensor is additive in the case of noninteracting subsystems and can be seen as an intrinsic property of a molecule. In the present work, it is shown that the longitudinal component of the tensor, Λ∥, which is small for internuclear distances close to the equilibrium, tends to grow if the bond is stretched. A maximum is reached in the region of the bond breaking, then Λ∥ decreases and converges toward the isolated-atom value. The degenerate transversal components, Λ⊥, on the other hand, usually have a monotonic growth toward the atomic value. The Position Spread is extremely sensitive to reorganization of the molecular wave function, and it becomes larger in the case of an increase of the electron mobility, as illustrated by the neutral-ionic avoided crossing in LiF. For these reasons, the Position Spread can be an extremely useful property that characterizes the nature of the wave function in a molecular system.

  15. Extreme obesity is associated with variation in genes related to the circadian rhythm of food intake and hypothalamic signaling.

    PubMed

    Mariman, Edwin C M; Bouwman, Freek G; Aller, Erik E J G; van Baak, Marleen A; Wang, Ping

    2015-06-01

    The hypothalamus is important for regulation of energy intake. Mutations in genes involved in the function of the hypothalamus can lead to early-onset severe obesity. To look further into this, we have followed a strategy that allowed us to identify rare and common gene variants as candidates for the background of extreme obesity from a relatively small cohort. For that we focused on subjects with a well-selected phenotype and on a defined gene set and used a rich source of genetic data with stringent cut-off values. A list of 166 genes functionally related to the hypothalamus was generated. In those genes complete exome sequence data from 30 extreme obese subjects (60 genomes) were screened for novel rare indel, nonsense, and missense variants with a predicted negative impact on protein function. In addition, (moderately) common variants in those genes were analyzed for allelic association using the general population as reference (false discovery rate<0.05). Six novel rare deleterious missense variants were found in the genes for BAIAP3, NBEA, PRRC2A, RYR1, SIM1, and TRH, and a novel indel variant in LEPR. Common variants in the six genes for MBOAT4, NPC1, NPW, NUCB2, PER1, and PRRC2A showed significant allelic association with extreme obesity. Our findings underscore the complexity of the genetic background of extreme obesity involving rare and common variants of genes from defined metabolic and physiologic processes, in particular regulation of the circadian rhythm of food intake and hypothalamic signaling. Copyright © 2015 the American Physiological Society.

  16. Generalized extreme gust wind speeds distributions

    USGS Publications Warehouse

    Cheng, E.; Yeung, C.

    2002-01-01

    Since the summer of 1996, US wind engineers have used the extreme gust (or 3-s gust) as the basic wind speed to quantify the destruction caused by extreme winds. In order to better understand these destructive wind forces, it is important to know the appropriate representations of these extreme gust wind speeds. The purpose of this study is therefore to determine the most suitable extreme value distributions for the annual extreme gust wind speeds recorded in large selected areas. To achieve this objective, we use the generalized Pareto distribution as the diagnostic tool for determining the type of extreme gust wind speed distribution. The three-parameter generalized extreme value distribution function is thus reduced to either the Type I Gumbel, Type II Frechet, or Type III reverse Weibull distribution function for the annual extreme gust wind speeds recorded at a specific site. With consideration of the quality and homogeneity of gust wind data collected at more than 750 weather stations throughout the United States, annual extreme gust wind speeds at 143 selected stations in the contiguous United States were used in the study. © 2002 Elsevier Science Ltd. All rights reserved.
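
The reduction of the three-parameter GEV to the Gumbel, Frechet, or reverse Weibull family, keyed by the sign of the shape parameter, can be sketched as follows; the return-level formula is the standard inversion of the GEV distribution function, and the parameter values used here are illustrative, not fitted gust-speed estimates from the study.

```python
import math

def gev_family(xi, tol=1e-6):
    """Classify a GEV shape parameter into its limiting family."""
    if abs(xi) < tol:
        return "Gumbel (Type I)"
    return "Frechet (Type II)" if xi > 0 else "reverse Weibull (Type III)"

def gev_return_level(T, mu, sigma, xi):
    """T-year return level from GEV(mu, sigma, xi), inverting the CDF."""
    y = -math.log(1.0 - 1.0 / T)   # -ln of the non-exceedance probability
    if abs(xi) < 1e-6:             # Gumbel limit as xi -> 0
        return mu - sigma * math.log(y)
    return mu + (sigma / xi) * (y ** (-xi) - 1.0)
```

A negative shape (reverse Weibull) implies a finite upper bound mu - sigma/xi, so return levels flatten out as T grows; a positive shape (Frechet) has a heavy, unbounded tail. This is why the sign of the fitted shape parameter decides which of the three families describes a station's gust record.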

  17. Analysis of the dependence of extreme rainfalls

    NASA Astrophysics Data System (ADS)

    Padoan, Simone; Ancey, Christophe; Parlange, Marc

    2010-05-01

    The aim of spatial analysis is to quantitatively describe the behavior of environmental phenomena such as precipitation levels, wind speeds, or daily temperatures. A number of generic approaches to spatial modeling have been developed [1], but these are not necessarily ideal for handling extremal aspects, given their focus on mean process levels. The areal modeling of the extremes of a natural process observed at points in space is important in environmental statistics; for example, understanding extremal spatial rainfall is crucial for flood protection. In light of recent concerns over climate change, the use of robust mathematical and statistical methods for such analyses has grown in importance. Multivariate extreme value models and the class of max-stable processes [2] have a similar asymptotic motivation to the univariate generalized extreme value (GEV) distribution, but provide a general approach to modeling extreme processes that incorporates temporal or spatial dependence. Statistical methods for max-stable processes and data analyses of practical problems are discussed by [3] and [4]. This work illustrates methods for the statistical modeling of spatial extremes and gives examples of their use by means of an extremal data analysis of Switzerland precipitation levels. [1] Cressie, N. A. C. (1993). Statistics for Spatial Data. Wiley, New York. [2] de Haan, L. and Ferreira, A. (2006). Extreme Value Theory: An Introduction. Springer, USA. [3] Padoan, S. A., Ribatet, M. and Sisson, S. A. (2009). Likelihood-Based Inference for Max-Stable Processes. Journal of the American Statistical Association, Theory & Methods. In press. [4] Davison, A. C. and Gholamrezaee, M. (2009). Geostatistics of extremes. Journal of the Royal Statistical Society, Series B. To appear.

  18. A Test-Length Correction to the Estimation of Extreme Proficiency Levels

    ERIC Educational Resources Information Center

    Magis, David; Beland, Sebastien; Raiche, Gilles

    2011-01-01

    In this study, the estimation of extremely large or extremely small proficiency levels, given the item parameters of a logistic item response model, is investigated. On one hand, the estimation of proficiency levels by maximum likelihood (ML), despite being asymptotically unbiased, may yield infinite estimates. On the other hand, with an…

  19. Examining global extreme sea level variations on the coast from in-situ and remote observations

    NASA Astrophysics Data System (ADS)

    Menendez, Melisa; Benkler, Anna S.

    2017-04-01

    The estimation of extreme water level values at the coast is a requirement for a wide range of engineering and coastal management applications. In addition, climate variations of extreme sea levels in the coastal area result from a complex interaction of oceanic, atmospheric, and terrestrial processes across a wide range of spatial and temporal scales. In this study, variations of extreme sea level return values are investigated from two available sources of information: in-situ tide-gauge records and satellite altimetry data. Long time series of sea level from tide-gauge records are the most valuable observations, since they directly measure water level at a specific coastal location. They have, however, a number of sources of inhomogeneity that may affect the climate description of extremes when this data source is used. Among others, the presence of gaps, historical inhomogeneities, and jumps in the mean sea level signal are factors that can introduce uncertainty into the characterization of extreme sea level behaviour. Moreover, long records from tide-gauges are sparse, and many coastal areas worldwide have no in-situ information available. On the other hand, with the accumulating altimeter records of several satellite missions since the 1990s, approaching 25 recorded years at the time of writing, the analysis of extreme sea level events from this data source is becoming possible. Aside from the well-known issue of altimeter measurements very close to the coast (mainly corruption by land, wet-troposphere path-delay errors, and local tide effects in the coastal area), other aspects have to be considered when sea surface height values estimated from satellites are to be used in a statistical extreme model, such as the use of a multi-mission product to obtain long observed periods and the selection of the maxima sample, since altimeter observations do not provide values that are uniform in time and space. 
Here, we have compared the extreme values of 'still water level' and 'non-tidal-residual' of in-situ records from the GESLA2 dataset (Woodworth et al. 2016) against the novel coastal altimetry datasets (Cipollini et al. 2016). Seasonal patterns, inter-annual variability and long-term trends are analyzed. Then, a time-dependent extreme model (Menendez et al. 2009) is applied to characterize extreme sea level return values and their variability on the coastal area around the world.

  20. Annual Rainfall Maxima: Theoretical Estimation of the GEV Shape Parameter k Using Multifractal Models

    NASA Astrophysics Data System (ADS)

    Veneziano, D.; Langousis, A.; Lepore, C.

    2009-12-01

    The annual maximum of the average rainfall intensity in a period of duration d, Iyear(d), is typically assumed to have generalized extreme value (GEV) distribution. The shape parameter k of that distribution is especially difficult to estimate from either at-site or regional data, making it important to constrain k using theoretical arguments. In the context of multifractal representations of rainfall, we observe that standard theoretical estimates of k from extreme value (EV) and extreme excess (EE) theories do not apply, while estimates from large deviation (LD) theory hold only for very small d. We then propose a new theoretical estimator based on fitting GEV models to the numerically calculated distribution of Iyear(d). A standard result from EV and EE theories is that k depends on the tail behavior of the average rainfall in d, I(d). This result holds if Iyear(d) is the maximum of a sufficiently large number n of variables, all distributed like I(d); therefore its applicability hinges on whether n = 1yr/d is large enough and the tail of I(d) is sufficiently well known. One typically assumes that at least for small d the former condition is met, but poor knowledge of the upper tail of I(d) remains an obstacle for all d. In fact, in the case of multifractal rainfall, the first condition is also not met because, irrespective of d, 1yr/d is too small (Veneziano et al., 2009, WRR, in press). Applying large deviation (LD) theory to this multifractal case, we find that, as d → 0, Iyear(d) approaches a GEV distribution whose shape parameter kLD depends on a region of the distribution of I(d) well below the upper tail, is always positive (in the EV2 range), is much larger than the value predicted by EV and EE theories, and can be readily found from the scaling properties of I(d). The scaling properties of rainfall can be inferred even from short records, but the limitation remains that the result holds as d → 0, not for finite d. 
Therefore, for different reasons, none of the above asymptotic theories applies to Iyear(d). In practice, one is interested in the distribution of Iyear(d) over a finite range of averaging durations d and return periods T. Using multifractal representations of rainfall, we have numerically calculated the distribution of Iyear(d) and found that, although not GEV, the distribution can be accurately approximated by a GEV model. The best-fitting parameter k depends on d, but is insensitive to the scaling properties of rainfall and the range of return periods T used for fitting. We have obtained a default expression for k(d) and compared it with estimates from historical rainfall records. The theoretical function tracks well the empirical dependence on d, although it generally overestimates the empirical k values, possibly due to deviations of rainfall from perfect scaling. This issue is under investigation.

  1. Very Low Mass Stars with Extremely Low Metallicity in the Milky Way's Halo

    NASA Astrophysics Data System (ADS)

    Aoki, Wako; Beers, Timothy C.; Suda, Takuma; Honda, Satoshi; Lee, Young Sun

    2015-08-01

    Large surveys and follow-up spectroscopic studies in the past few decades have been providing chemical abundance data for a growing number of very metal-poor ([Fe/H] <-2) stars. Most of them are red giants or main-sequence turn-off stars having masses near 0.8 solar masses. Lower mass stars with extremely low metallicity ([Fe/H] <-3) have yet to be well explored. Our high-resolution spectroscopic study for very metal-poor stars found with SDSS has identified four cool main-sequence stars with [Fe/H] <-2.5 among 137 objects (Aoki et al. 2013, AJ, 145, 13). The effective temperatures of these stars are 4500--5000 K, corresponding to a mass of around 0.5 solar masses. Our standard analysis of the high-resolution spectra based on 1D-LTE model atmospheres has obtained self-consistent chemical abundances for these objects, assuming small values of micro-turbulent velocities compared with giants and turn-off stars. The low temperature of the atmospheres of these objects enables us to measure their detailed chemical abundances. Interestingly, two of the four stars have extreme chemical abundance patterns: one has the largest excesses of heavy neutron-capture elements associated with the r-process abundance pattern known to date (Aoki et al. 2010, ApJL 723, L201), and the other exhibits low abundances of the alpha-elements and odd-Z elements, suggested to be the signatures of the yields of very massive stars ( >100 solar masses; Aoki et al. 2014, Science 345, 912). Although the sample size is still small, these results indicate the potential of very low-mass stars as probes to study the early stages of the Milky Way's halo formation.

  2. Very Low-Mass Stars with Extremely Low Metallicity in the Milky Way's Halo

    NASA Astrophysics Data System (ADS)

    Aoki, Wako; Beers, Timothy C.; Suda, Takuma; Honda, Satoshi; Lee, Young Sun

    2016-08-01

    Large surveys and follow-up spectroscopic studies in the past few decades have been providing chemical abundance data for a growing number of very metal-poor ([Fe/H] <-2) stars. Most of them are red giants or main-sequence turn-off stars having masses near 0.8 solar masses. Lower mass stars with extremely low metallicity ([Fe/H] <-3) are yet to be explored. Our high-resolution spectroscopic study for very metal-poor stars found with SDSS has identified four cool main-sequence stars with [Fe/H] <-2.5 among 137 objects (Aoki et al. 2013). The effective temperatures of these stars are 4500-5000 K, corresponding to a mass of around 0.5 solar masses. Our standard analysis of the high-resolution spectra based on 1D-LTE model atmospheres has obtained self-consistent chemical abundances for these objects, assuming small values of micro-turbulent velocities compared with giants and turn-off stars. The low temperature of the atmospheres of these objects enables us to measure their detailed chemical abundances. Interestingly, two of the four stars have extreme chemical-abundance patterns: one has the largest excesses of heavy neutron-capture elements associated with the r-process abundance pattern known to date (Aoki et al. 2010), and the other exhibits low abundances of the α-elements and odd-Z elements, suggested to be signatures of the yields of very massive stars (> 100 solar masses; Aoki et al. 2014). Although the sample size is still small, these results indicate the potential of very low-mass stars as probes to study the early stages of the Milky Way's halo formation.

  3. Estimating the Effect of Climate Change on Crop Yields and Farmland Values: The Importance of Extreme Temperatures

    EPA Pesticide Factsheets

    This is a presentation titled Estimating the Effect of Climate Change on Crop Yields and Farmland Values: The Importance of Extreme Temperatures that was given for the National Center for Environmental Economics

  4. 110 Years of the Meyer–Overton Rule: Predicting Membrane Permeability of Gases and Other Small Compounds

    PubMed Central

    Missner, Andreas; Pohl, Peter

    2010-01-01

    The transport of gaseous compounds across biological membranes is essential in all forms of life. Although it was generally accepted that gases freely penetrate the lipid matrix of biological membranes, a number of studies challenged this doctrine as they found biological membranes to have extremely low gas-permeability values. These observations led to the identification of several membrane-embedded “gas” channels, which facilitate the transport of biologically active gases, such as carbon dioxide, nitric oxide, and ammonia. However, some of these findings are in contrast to the well-established solubility–diffusion model (also known as the Meyer–Overton rule), which predicts membrane permeabilities from the molecule's oil–water partition coefficient. Herein, we discuss recently reported violations of the Meyer–Overton rule for small molecules, including carboxylic acids and gases, and show that Meyer and Overton continue to rule. PMID:19514034

  5. Changes in US extreme sea levels and the role of large scale climate variations

    NASA Astrophysics Data System (ADS)

    Wahl, T.; Chambers, D. P.

    2015-12-01

    We analyze a set of 20 tide gauge records covering the contiguous United States (US) coastline and the period from 1929 to 2013 to identify long-term trends and multi-decadal variations in extreme sea levels (ESLs) relative to changes in mean sea level (MSL). Significant but small long-term trends in ESLs above/below MSL are found at individual sites along most coastline stretches, but are mostly confined to the southeast coast and the winter season, when storm surges are primarily driven by extra-tropical cyclones. We identify six regions with broadly coherent and considerable multi-decadal ESL variations unrelated to MSL changes. Using a quasi-non-stationary extreme value analysis approach we show that the latter would have caused variations in design-relevant return water levels (RWLs; 50 to 200 year return periods) ranging from ~10 cm to as much as 110 cm across the six regions. To explore the origin of these temporal changes and the role of large-scale climate variability, we develop different sets of simple and multiple linear regression models with RWLs as dependent variables and with climate indices (or versions of them tailored toward predicting multi-decadal RWL changes) and wind stress curl as independent predictors. The models, after being tested for spatial and temporal stability, explain up to 97% of the observed variability at individual sites and almost 80% on average. Using the model predictions as covariates for the quasi-non-stationary extreme value analysis also significantly reduces the range of change in the 100-year RWLs over time, turning a non-stationary process into a stationary one. This highlights that the models - when used with regional and global climate model output of the predictors - should also be capable of projecting future RWL changes to be used by decision makers for improved flood preparedness and long-term resiliency.

  6. Restructuring Big Data to Improve Data Access and Performance in Analytic Services Making Research More Efficient for the Study of Extreme Weather Events and Application User Communities

    NASA Astrophysics Data System (ADS)

    Ostrenga, D.; Shen, S.; Vollmer, B.; Meyer, D. L.

    2017-12-01

    The NASA climate reanalysis dataset MERRA-2 contains numerous data for the atmosphere, land, and ocean, grouped into 95 products with an archived volume of over 300 TB. The data files are saved as hourly files, day files (at hourly time intervals), and month files containing up to 125 parameters. Due to the large number of data files and the sheer data volume, it is challenging for users, especially those in the applied research community, to deal with the original data files. Most of these researchers prefer to focus on a small region or single location using the hourly data for long time periods, to analyze extreme weather events or, say, winds for renewable energy applications. At the GES DISC, we have been working closely with the science teams and the application user community to create several new value-added data products and high-quality services to facilitate the use of the model data for various types of research. We have tested converting hourly data from one day per file into different data cubes, such as one month, one year, or the whole mission, and then analyzed the efficiency of access to this newly structured data through various services. Initial results have shown that, compared to the original file structure, the new layout significantly improves the performance of access to long time series. We note that the performance is associated with the cube size and structure, the compression method, and how the data are accessed. The optimized data cube structure will not only improve data access, but also enable better online analytic services for statistical analysis and extreme event mining. Two case studies will be presented using the newly structured data and value-added services: the California drought and the extreme drought of the northeastern states of Brazil. Furthermore, data access and analysis through cloud storage capabilities will be investigated.

  7. Development of a Water Clarity Index for the Southeastern U.S. As a Climate Indicator

    NASA Astrophysics Data System (ADS)

    Sheridan, S. C.; Hu, C.; Lee, C. C.; Barnes, B.; Pirhalla, D.; Ransi, V.; Shein, K. A.

    2014-12-01

    A common index of water quality is water clarity, which can be estimated by measuring the diffuse attenuation coefficient for downwelling irradiance (Kd). Kd estimates the availability of light to marine organisms at various depths. Marine habitats, including such species as coral and seagrass, can be negatively affected by extreme episodes of sediment suspension, where water clarity is reduced and little light penetrates. Evidence of increased stress on coastal ecosystems exists, partially due to climate change, yet a systematic analysis of extreme events and trends is difficult due to limited data. To address this concern, we have developed as a potential climate indicator a Kd-Index for nine regions along the US coast of the Gulf of Mexico, in which Kd values have been standardized over time and space to allow for a more holistic assessment of climate drivers and their trends. Variability in the Kd-Index is then assessed with regard to occurrences of surface weather types (using the Spatial Synoptic Classification), a synoptic climatology of mean sea-level-pressure patterns across the region, along with heavy precipitation events. Kd can be estimated from MODIS and SeaWiFS observations from 1997 to date; an earlier period of satellite observations from 1978-86 is also available. A non-linear autoregressive neural network model with external input (NARX) is used to develop the historical relationship between Kd-Index and atmospheric conditions, and then this model is used to simulate a full time series from 1948 to 2013. The modeled data set is strongly correlated with observations, with correlations above 0.8 for many regions. Hit rates of extreme Kd-Index values - those which would most likely be associated with a negative environmental impact - exceed 70% in some regions. Across the full data set, long term trends vary slightly across regions but are generally small. Trends in extreme events appear to be more consistently increasing across the domain.
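    The abstract says Kd values were standardized over time and space to form the Kd-Index; the exact procedure is not given, so the sketch below shows one plausible reading (all numbers are synthetic): each calendar month is standardized against its own climatology so index values are comparable across seasons.

```python
import numpy as np

# Hypothetical monthly Kd values for one region: 20 years x 12 months (1/m).
rng = np.random.default_rng(1)
kd = rng.lognormal(mean=-2.0, sigma=0.3, size=(20, 12))

# Standardize each calendar month against its own climatology (an assumed
# reading of the time/space standardization described in the abstract).
clim_mean = kd.mean(axis=0)
clim_std = kd.std(axis=0, ddof=1)
kd_index = (kd - clim_mean) / clim_std

# Extreme turbidity events: index values exceeding, say, 2 standard deviations.
extreme_months = np.argwhere(kd_index > 2.0)
```

    A standardized index like this is what makes trend and extreme-event counts comparable across the nine Gulf of Mexico regions.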

  8. Recent trends in aviation turbine fuel properties

    NASA Technical Reports Server (NTRS)

    Friedman, R.

    1982-01-01

    Plots and tables, compiled from Department of Energy (and predecessor agency) inspection reports from 1969 to 1980, present ranges, averages, extremes, and trends for most of the 22 properties of Jet A aviation turbine fuel. In recent years, average values of aromatics content, mercaptan sulfur content, distillation temperature of 10 percent recovered, smoke point, and freezing point show small but recognizable trends toward their specification limits. About 80 percent of the fuel samples had at least one property near specification, defined as within a standard band about the specification limit. By far the most common near-specification properties were aromatics content, smoke point, and freezing point.

  9. Equilibrium structure and atomic vibrations of Nin clusters

    NASA Astrophysics Data System (ADS)

    Borisova, Svetlana D.; Rusina, Galina G.

    2017-12-01

    The equilibrium bond lengths, binding energies, second differences in energy, and vibrational frequencies of free Nin clusters (2 ≤ n ≤ 20) were calculated using an interaction potential obtained in the tight-binding approximation (TBA). The results show that the minimum vibration frequency plays a significant role in evaluating the dynamic stability of the clusters. A nonmonotonic dependence of the minimum vibration frequency on cluster size is demonstrated, with extrema at cluster sizes n = 4, 6, 13, and 19. This result agrees with theoretical and experimental data on stable structures of small metallic clusters.

  10. Properties of charmonia in a hot equilibrated medium

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Giannuzzi, Floriana; Mannarelli, Massimo

    2009-09-01

    We investigate the properties of charmonia in a thermal medium, showing that with increasing temperature the decay widths of these mesons behave in a nontrivial way. Our analysis is based on a potential model whose interaction potential is extracted from thermal lattice QCD calculations of the free energy of a static quark-antiquark pair. We find that in the crossover region some decay widths are strongly enhanced. In particular, at temperatures T ≈ T_c the decay widths of the J/ψ that depend on the value of the wave function at the origin are enhanced with respect to their vacuum values by about a factor of 2. In the same temperature range the decay width of the process χ_cJ → J/ψ + γ is enhanced by approximately a factor of 6 with respect to its vacuum value. At higher temperatures the charmonia states dissociate and the widths of both decay processes become vanishingly small.

  11. Efficient bootstrap estimates for tail statistics

    NASA Astrophysics Data System (ADS)

    Breivik, Øyvind; Aarnes, Ole Johan

    2017-03-01

    Bootstrap resamples can be used to investigate the tail of empirical distributions as well as return value estimates from the extremal behaviour of the sample. Specifically, the confidence intervals on return value estimates or bounds on in-sample tail statistics can be obtained using bootstrap techniques. However, non-parametric bootstrapping from the entire sample is expensive. It is shown here that it suffices to bootstrap from a small subset consisting of the highest entries in the sequence to make estimates that are essentially identical to bootstraps from the entire sample. Similarly, bootstrap estimates of confidence intervals of threshold return estimates are found to be well approximated by using a subset consisting of the highest entries. This has practical consequences in fields such as meteorology, oceanography and hydrology where return values are calculated from very large gridded model integrations spanning decades at high temporal resolution or from large ensembles of independent and identically distributed model fields. In such cases the computational savings are substantial.
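    The subset trick can be sketched as follows (sample sizes and the statistic are illustrative, not the paper's cases). The key observation: in a resample of the full sample of size n, the number of draws landing in the top-m subset is Binomial(n, m/n), and given that count they are uniform draws from the top-m values; so a high order statistic such as the 99th percentile can be bootstrapped touching only the top entries.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 10_000
sample = rng.standard_exponential(n)   # illustrative parent sample
r = 100                                # define 99th percentile as r-th largest
m = 500                                # keep only the top 5% of the sample
top_m = np.sort(sample)[-m:]

def boot_full(n_boot=2000):
    """Classical bootstrap: resample the whole sample each time."""
    est = np.empty(n_boot)
    for b in range(n_boot):
        res = rng.choice(sample, size=n, replace=True)
        est[b] = np.sort(res)[-r]      # r-th largest of the resample
    return est

def boot_subset(n_boot=2000):
    """Tail bootstrap: only the top-m entries are ever touched."""
    est = np.empty(n_boot)
    for b in range(n_boot):
        t = rng.binomial(n, m / n)     # draws that land in the top-m region
        res = rng.choice(top_m, size=t, replace=True)
        est[b] = np.sort(res)[-r]      # same order statistic, w.h.p. valid
    return est

full, sub = boot_full(), boot_subset()
ci_full = np.percentile(full, [2.5, 97.5])
ci_sub = np.percentile(sub, [2.5, 97.5])
```

    The two bootstrap distributions agree up to Monte Carlo noise, while the subset version resamples 20x fewer values per replicate; for gridded multi-decade model output the savings scale accordingly.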

  12. Charged rotating black holes in Einstein-Maxwell-Chern-Simons theory with a negative cosmological constant

    NASA Astrophysics Data System (ADS)

    Blázquez-Salcedo, Jose Luis; Kunz, Jutta; Navarro-Lérida, Francisco; Radu, Eugen

    2017-03-01

    We consider rotating black hole solutions in five-dimensional Einstein-Maxwell-Chern-Simons theory with a negative cosmological constant and a generic value of the Chern-Simons coupling constant λ . Using both analytical and numerical techniques, we focus on cohomogeneity-1 configurations, with two equal-magnitude angular momenta, which approach at infinity a globally anti-de Sitter background. We find that the generic solutions share a number of basic properties with the known Cvetič, Lü, and Pope black holes which have λ =1 . New features occur as well; for example, when the Chern-Simons coupling constant exceeds a critical value, the solutions are no longer uniquely determined by their global charges. Moreover, the black holes possess radial excitations which can be labelled by the node number of the magnetic gauge potential function. Solutions with small values of λ possess other distinct features. For instance, the extremal black holes there form two disconnected branches, while not all near-horizon solutions are associated with global solutions.

  13. ELROI Extremely Low Resource Optical Identifier. A license plate for your satellite, and more.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palmer, David

    ELROI (Extremely Low Resource Optical Identifier) is a license plate for your satellite; a small tag that flashes an optical identification code that can be read by a small telescope on the ground. The final version of the tag will be the size of a thick postage stamp and fully autonomous: you can attach it to everything that goes into space, including small cubesats and inert debris like rocket stages, and it will keep blinking even after the satellite is shut down, reliably identifying the object from launch until re-entry.

  14. Extreme Statistics of Storm Surges in the Baltic Sea

    NASA Astrophysics Data System (ADS)

    Kulikov, E. A.; Medvedev, I. P.

    2017-11-01

    Statistical analysis of the extreme values of the Baltic Sea level has been performed for series of observations spanning 15-125 years at 13 tide gauge stations. It is shown that the empirical relation between the value of extreme sea level rises or ebbs (caused by storm events) and their return period in the Baltic Sea can be well approximated by the Gumbel probability distribution. The maximum values of extreme floods/ebbs of 100-year recurrence were observed in the Gulf of Finland and the Gulf of Riga. The two longest data series, observed in Stockholm and Vyborg over 125 years, show a significant deviation from the Gumbel distribution for the rarest events. Statistical analysis of the hourly sea level data reveals some asymmetry in the variability of the Baltic Sea level: the probability of rises proved higher than that of ebbs, and the magnitude of the 100-year recurrence surge considerably exceeds the magnitude of ebbs almost everywhere. This asymmetry can be attributed to the influence of low atmospheric pressure during storms. A statistical study of extreme values has also been applied to sea level series for Narva over the period 1994-2000, simulated with the ROMS numerical model. Comparison of the "simulated" and "observed" extreme sea level distributions shows that the model reproduces extreme floods of "moderate" magnitude quite satisfactorily; however, it underestimates sea level changes for the most powerful storm surges.
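    The Gumbel fit and return-period relation used in the paper can be sketched with scipy (the station data here are synthetic; location and scale values are assumptions for illustration):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Hypothetical annual maximum sea levels (cm above mean) at one station.
annual_max = stats.gumbel_r.rvs(loc=80.0, scale=25.0, size=100, random_state=rng)

# Fit the Gumbel distribution to the annual maxima by maximum likelihood.
loc, scale = stats.gumbel_r.fit(annual_max)

# The T-year return level is the (1 - 1/T) quantile of the fitted distribution.
def return_level(T):
    return stats.gumbel_r.ppf(1.0 - 1.0 / T, loc=loc, scale=scale)

rl100 = return_level(100.0)   # 100-year surge height (cm)
```

    The deviations the authors report for the rarest Stockholm and Vyborg events correspond to observed points falling off the straight line this fit predicts on a Gumbel probability plot.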

  15. Values of Deploying a Compact Polarimetric Radar to Monitor Extreme Precipitation in a Mountainous Area: Mineral County, Colorado

    NASA Astrophysics Data System (ADS)

    Cheong, B. L.; Kirstetter, P. E.; Yu, T. Y.; Busto, J.; Speeze, T.; Dennis, J.

    2015-12-01

    Precipitation in mountainous regions can trigger flash floods and landslides, especially in areas affected by wildfire. Because of the small space-time scales required for observation, these events remain poorly observed. A lightweight X-band polarimetric radar can rapidly respond to the situation and provide continuous, high-resolution rainfall information for flood forecasting and emergency management. A preliminary assessment of the added value to operational practice in Mineral County, Colorado was performed in Fall 2014 and Summer 2015 with a transportable polarimetric radar deployed at the Lobo Overlook. This region is one of numerous areas in the Rocky Mountains where the WSR-88D network does not provide sufficient weather coverage due to beam blockage, and these limitations have impeded forecasters and local emergency managers from making accurate predictions and issuing weather warnings. High-resolution observations were collected to document the precipitation characteristics and demonstrate the added value of deploying a small weather radar in such a context. The analysis of the detailed vertical structure of precipitation explains the decreased signal sampled by the operational radars. The specific microphysics analyzed through polarimetry suggest that the operational Z-R relationships may not be appropriate for monitoring severe weather over this wildfire-affected region. Collaboration with the local emergency managers and the National Weather Service shows the critical value of deploying mobile, polarimetric, and unmanned radars in complex terrain. Several selected cases are provided in this paper for illustration.

  16. Accurate and fast multiple-testing correction in eQTL studies.

    PubMed

    Sul, Jae Hoon; Raj, Towfique; de Jong, Simone; de Bakker, Paul I W; Raychaudhuri, Soumya; Ophoff, Roel A; Stranger, Barbara E; Eskin, Eleazar; Han, Buhm

    2015-06-04

    In studies of expression quantitative trait loci (eQTLs), it is of increasing interest to identify eGenes, the genes whose expression levels are associated with variation at a particular genetic variant. Detecting eGenes is important for follow-up analyses and prioritization because genes are the main entities in biological processes. To detect eGenes, one typically focuses on the genetic variant with the minimum p value among all variants in cis with a gene and corrects for multiple testing to obtain a gene-level p value. For performing multiple-testing correction, a permutation test is widely used. Because of the growing sample sizes of eQTL studies, however, the permutation test has become a computational bottleneck. In this paper, we propose an efficient approach for correcting for multiple testing and assessing eGene p values by utilizing a multivariate normal distribution. Our approach properly takes into account the linkage-disequilibrium structure among variants, and its time complexity is independent of sample size. By applying our small-sample correction techniques, our method achieves high accuracy in both small and large studies. We have shown that our method consistently produces extremely accurate p values (accuracy > 98%) for three human eQTL datasets with different sample sizes and SNP densities: the Genotype-Tissue Expression pilot dataset, the multi-region brain dataset, and the HapMap 3 dataset. Copyright © 2015 The American Society of Human Genetics. Published by Elsevier Inc. All rights reserved.
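    The core idea of a multivariate-normal correction can be sketched by plain Monte Carlo (the paper's method is more refined, with analytic evaluation and small-sample corrections; the LD matrix and p-values below are invented for illustration): under the null, the variant z-scores for a gene are jointly MVN with the LD matrix as correlation, so the gene-level p value is the probability that the most extreme of these correlated z-scores beats the observed minimum p.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# Hypothetical LD (correlation) matrix for 50 cis variants, AR(1)-style decay.
n_var = 50
idx = np.arange(n_var)
ld = 0.9 ** np.abs(idx[:, None] - idx[None, :])

# Smallest variant-level p value observed for this gene (illustrative number).
p_min_obs = 1e-3
z_obs = stats.norm.isf(p_min_obs / 2.0)   # two-sided threshold on |z|

# Gene-level p = P(min_j p_j <= p_min_obs) = P(max_j |z_j| >= z_obs)
# under z ~ MVN(0, LD), here estimated by simulation.
n_sim = 50_000
z_null = rng.multivariate_normal(np.zeros(n_var), ld, size=n_sim,
                                 method="cholesky")
p_gene = np.mean(np.abs(z_null).max(axis=1) >= z_obs)

# Bonferroni ignores the LD and is conservative by comparison.
p_bonf = min(1.0, n_var * p_min_obs)
```

    Because the correction depends only on the LD matrix and the observed minimum p, its cost is independent of the study's sample size, which is the property that removes the permutation bottleneck.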

  17. A dynamical model on deposit and loan of banking: A bifurcation analysis

    NASA Astrophysics Data System (ADS)

    Sumarti, Novriana; Hasmi, Abrari Noor

    2015-09-01

    A dynamical model, one of the more sophisticated techniques based on mathematical equations, can determine an observed state, for example bank profits, for all future times from the current state. It will also show whether small changes in the state of the system create small or large changes in the future, depending on the model. In this research we develop a dynamical system of the form dD/dt = f(D, L, rD, rL, r), dL/dt = g(D, L, rD, rL, r). Here D and rD are the volume of deposits and the deposit rate, L and rL are the volume of loans and the loan rate, and r is the interbank market rate. The model requires parameters that connect pairs of variables or pairs of derivative functions. In this paper we simulate the model for several parameter values. We perform a bifurcation analysis of the dynamics of the system in order to identify the parameters that control its stability behaviour. The results show that the system has a limit cycle for small values of the loan interest rate, so the deposit and loan volumes fluctuate and oscillate strongly. If the loan interest rate is too high, the loan volume decreases and vanishes, and the system converges to its carrying capacity.
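    The abstract does not give the functional forms of f and g, so the sketch below uses invented logistic-style placeholders purely to show how such a deposit-loan system is simulated numerically; every rate, capacity, and coupling term here is an assumption, not the paper's model.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Placeholder parameters (assumed): deposit, loan, and interbank rates,
# plus a carrying capacity for deposits.
rD, rL, r = 0.03, 0.08, 0.05
K = 100.0

def rhs(t, y):
    """Illustrative stand-in for the paper's f and g, NOT the actual model."""
    D, L = y
    dD = rD * D * (1 - D / K) - 0.5 * (rL - r) * L      # deposits fund loans
    dL = (rL - r) * L * (1 - L / D) if D > 0 else 0.0   # loans limited by deposits
    return [dD, dL]

sol = solve_ivp(rhs, t_span=(0.0, 1000.0), y0=[50.0, 10.0], max_step=1.0)
D_final, L_final = sol.y[:, -1]   # with these placeholders: a stable equilibrium
```

    With these placeholder forms the trajectory spirals into a fixed point; reproducing the limit cycle the authors report would require their actual f, g, and parameter ranges.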

  18. Regularised extreme learning machine with misclassification cost and rejection cost for gene expression data classification.

    PubMed

    Lu, Huijuan; Wei, Shasha; Zhou, Zili; Miao, Yanzi; Lu, Yi

    2015-01-01

    The main purpose of traditional classification algorithms in bioinformatics applications is to achieve better classification accuracy. However, these algorithms cannot meet the requirement of minimising the average misclassification cost. In this paper, a new algorithm, the cost-sensitive regularised extreme learning machine (CS-RELM), is proposed, using probability estimation and misclassification cost to reconstruct the classification results. By improving the classification accuracy on the group of small samples with higher misclassification cost, the new CS-RELM can minimise the overall classification cost. A 'rejection cost' was integrated into the CS-RELM algorithm to further reduce the average misclassification cost. Using the Colon Tumour dataset and the SRBCT (Small Round Blue Cell Tumour) dataset, CS-RELM was compared with other cost-sensitive algorithms such as the extreme learning machine (ELM), cost-sensitive extreme learning machine, regularised extreme learning machine, and cost-sensitive support vector machine (SVM). The experimental results show that CS-RELM with an embedded rejection cost can reduce the average misclassification cost and make more credible classification decisions than the others.
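    The generic decision rule behind cost-sensitive classification with rejection can be sketched as follows (this is the standard expected-cost rule, not the CS-RELM internals; the cost matrix and thresholds are invented numbers): given estimated class probabilities, predict the class with minimum expected cost, and reject when even that minimum exceeds the rejection cost.

```python
import numpy as np

# Misclassification cost matrix: cost[i, j] = cost of predicting j when the
# truth is i. Asymmetric off-diagonals are the point of cost-sensitivity
# (numbers assumed: missing class 1, e.g. a tumour, is 5x worse).
cost = np.array([[0.0, 1.0],
                 [5.0, 0.0]])
reject_cost = 0.8   # fixed cost of refusing to classify (assumed)

def decide(p):
    """p: estimated class probabilities for one sample, shape (2,).
    Returns 0/1 for a class prediction, or -1 for rejection."""
    expected = p @ cost                 # expected cost of predicting each class
    best = int(np.argmin(expected))
    return best if expected[best] <= reject_cost else -1

decide(np.array([0.95, 0.05]))   # confident -> predicts class 0
decide(np.array([0.82, 0.18]))   # ambiguous under these costs -> rejects (-1)
decide(np.array([0.30, 0.70]))   # cost asymmetry -> predicts class 1
```

    Note how the asymmetric costs shift the decision boundary well away from p = 0.5, which is exactly the behaviour a plain accuracy-maximising classifier cannot deliver.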

  19. Extreme value analysis of the time derivative of the horizontal magnetic field and computed electric field

    NASA Astrophysics Data System (ADS)

    Wintoft, Peter; Viljanen, Ari; Wik, Magnus

    2016-05-01

    High-frequency (≈ minutes) variability of ground magnetic fields is caused by ionospheric and magnetospheric processes driven by the changing solar wind. The varying magnetic fields induce electric fields that cause currents to flow in man-made conductors like power grids and pipelines. Under extreme conditions these geomagnetically induced currents (GIC) may be harmful to power grids. Increasing our understanding of extreme events is thus important for solar-terrestrial science and space weather. In this work, 1-min resolution time derivatives of measured local magnetic fields (|dBh/dt|) and computed electric fields (|Eh|) for locations in Europe have been analysed with extreme value analysis (EVA). The EVA yields an estimate of the generalized extreme value probability distribution, which is described by three parameters: location, width, and shape. The shape parameter controls the extreme behaviour. The stations cover geomagnetic latitudes from 40 to 70° N, and all stations included in the study have contiguous coverage of 18 years or more with 1-min resolution data. As expected, the EVA shows that the higher-latitude stations have a higher probability of large |dBh/dt| and |Eh| than stations further south. However, the EVA also shows that the shape of the distribution changes with magnetic latitude: the high-latitude distributions fall off to zero faster than the low-latitude ones, and upward-bounded distributions cannot be ruled out. The transition occurs around 59-61° N magnetic latitude. Thus, the EVA shows that the observed series north of ≈ 60° N have already measured values close to the expected maxima, while stations south of ≈ 60° N will measure larger values in the future.
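    Fitting the three-parameter GEV and reading off the shape parameter can be sketched with scipy on synthetic block maxima (the |dBh/dt| values below are invented). One caveat worth a comment: scipy's shape parameter `c` is the negative of the usual GEV shape ξ, so `c > 0` corresponds to the upper-bounded case the authors discuss.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Hypothetical annual maxima of |dBh/dt| (nT/min) at one station.
annual_max = stats.genextreme.rvs(c=0.2, loc=300.0, scale=80.0,
                                  size=60, random_state=rng)

# Fit the three-parameter GEV: shape, location, scale.
c, loc, scale = stats.genextreme.fit(annual_max)

# scipy convention: c = -xi. c > 0 (xi < 0) means a finite upper bound,
# i.e. a distribution that "falls off to zero" with a hard maximum.
upper_bound = loc + scale / c if c > 0 else np.inf

# 100-year return level from the fitted distribution.
rl100 = stats.genextreme.isf(1.0 / 100.0, c, loc=loc, scale=scale)
```

    The latitude dependence the paper reports amounts to this fitted shape changing sign/magnitude as one moves across ≈ 60° N magnetic latitude.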

  20. GRiP - A flexible approach for calculating risk as a function of consequence, vulnerability, and threat.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whitfield, R. G.; Buehring, W. A.; Bassett, G. W.

    2011-04-08

    Get a GRiP (Gravitational Risk Procedure) on risk by using an approach inspired by the physics of gravitational forces between body masses! In April 2010, U.S. Department of Homeland Security Special Events staff (Protective Security Advisors [PSAs]) expressed concern about how to calculate risk given measures of consequence, vulnerability, and threat. The PSAs believed that it is not 'right' to assign zero risk, as a multiplicative formula would imply, to cases in which the threat is reported to be extremely small, and perhaps could even be assigned a value of zero, but for which consequences and vulnerability are potentially high. They needed a different way to aggregate the components into an overall measure of risk. To address these concerns, GRiP was proposed and developed. The inspiration for GRiP is Sir Isaac Newton's Universal Law of Gravitation: the attractive force between two bodies is directly proportional to the product of their masses and inversely proportional to the square of the distance between them. The total force on one body is the sum of the forces from 'other bodies' that influence that body. In the case of risk, the 'other bodies' are the components of risk (R): consequence, vulnerability, and threat (which we denote as C, V, and T, respectively). GRiP treats risk as if it were a body within a cube. Each vertex (corner) of the cube represents one of the eight combinations of minimum and maximum 'values' for consequence, vulnerability, and threat. The risk at each of the vertices is a variable that can be set. Naturally, maximum risk occurs when consequence, vulnerability, and threat are at their maximum values; minimum risk occurs when they are at their minimum values. Analogous to gravitational forces among body masses, the GRiP formula for risk states that the risk at any interior point of the box depends on the squares of the distances from that point to each of the eight vertices.
The risk value at an interior (movable) point will be dominated by the value of one vertex as that point moves closer and closer to that vertex. GRiP is a visualization tool that helps analysts better understand risk and its relationship to consequence, vulnerability, and threat. Estimates of consequence, vulnerability, and threat are external to GRiP; however, the GRiP approach can be linked to models or data that provide such estimates. For example, the Enhanced Critical Infrastructure Program/Infrastructure Survey Tool produces a vulnerability index (scaled from 0 to 100) that can be used for the vulnerability component of GRiP. We recognize that the values used for risk components can be point estimates and that, in fact, there is uncertainty regarding the exact values of C, V, and T. When we use T = t_0 (where t_0 is a value of threat in its range), we mean that threat is believed to be in an interval around t_0. Hence, a value of t_0 = 0 indicates a 'best estimate' that the threat level is equal to zero, but still allows that it is not impossible for the threat to occur. When t_0 = 0 but the threat is potentially small and not exactly zero, there will be little impact on the overall risk value as long as the C and V components are not large. However, when C and/or V have large values, there can be large differences in risk between t_0 = 0 and t_0 = epsilon (where epsilon is small but greater than zero). We believe this scenario explains the PSAs' intuition that risk is not equal to zero when t_0 = 0 and C and/or V have large values. (They may also be thinking that if C has an extremely large value, it is unlikely that T is equal to 0; in the terrorist context, T would likely be dependent on C when C is extremely large.) The PSAs are implicitly recognizing the potential that t_0 = epsilon.
One way to take this possible scenario into account is to replace point estimates for risk with interval values that reflect the uncertainty in the risk components. In fact, one could argue that T never equals zero for a man-made hazard. This paper describes the thought process that led to the GRiP approach and the mathematical formula for GRiP, and presents a few examples that provide insights into how to use GRiP and interpret its results.
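    The inverse-square aggregation described above can be sketched in a few lines (the vertex risk values and the normalization by the sum of weights are our assumptions for illustration; the report's actual vertex settings are not given here):

```python
import numpy as np
from itertools import product

# Eight vertices of the (C, V, T) cube, each coordinate scaled to [0, 1].
vertices = np.array(list(product([0.0, 1.0], repeat=3)))   # rows: (C, V, T)

# Assignable risk at each vertex (illustrative values, not DHS settings):
# here simply the mean of the three components, scaled to 0-100.
vertex_risk = vertices.mean(axis=1) * 100.0

def grip(point, eps=1e-12):
    """Inverse-square-distance weighted risk, per the GRiP idea."""
    d2 = ((vertices - np.asarray(point, float)) ** 2).sum(axis=1)
    if d2.min() < eps:                     # exactly at a vertex
        return float(vertex_risk[np.argmin(d2)])
    w = 1.0 / d2                           # inverse-square weights
    return float((w * vertex_risk).sum() / w.sum())

# A multiplicative C*V*T score would be zero here; GRiP is not, matching the
# PSAs' intuition for high consequence/vulnerability with reported zero threat.
r = grip((0.9, 0.9, 0.0))   # dominated by the nearby (C=1, V=1, T=0) vertex
```

    As the movable point approaches a vertex, its weight diverges and that vertex's risk value dominates, which is exactly the "gravitational" behaviour the text describes.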

  1. Regional estimation of extreme suspended sediment concentrations using watershed characteristics

    NASA Astrophysics Data System (ADS)

    Tramblay, Yves; Ouarda, Taha B. M. J.; St-Hilaire, André; Poulin, Jimmy

    2010-01-01

    The number of stations monitoring daily suspended sediment concentration (SSC) has been decreasing since the 1980s in North America, while suspended sediment is considered a key variable for water quality. The objective of this study is to test the feasibility of regionalising extreme SSC, i.e. estimating extreme SSC values for ungauged basins. Annual maximum SSC for 72 rivers in Canada and the USA were modelled with probability distributions in order to estimate quantiles corresponding to different return periods. Regionalisation techniques, originally developed for flood prediction in ungauged basins, were tested using the climatic, topographic, land cover, and soil attributes of the watersheds. Two approaches were compared, using either physiographic characteristics or the seasonality of extreme SSC to delineate the regions. Multiple regression models to estimate SSC quantiles as a function of watershed characteristics were built in each region and compared to a global model including all sites. Regional estimates of SSC quantiles were compared with the local values. Results show that regional estimation of extreme SSC is more efficient than a global regression model including all sites. Groups/regions of stations were identified, using either the watershed characteristics or the seasonality of occurrence of extreme SSC values, providing a method to better describe extreme SSC events. The most important variables for predicting extreme SSC are the percentage of clay in the soils, precipitation intensity, and forest cover.
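    The regional regression step can be sketched as follows (all data are synthetic; the predictors are named after the variables the study found most important, but the coefficients and units are invented): regress a log-transformed SSC quantile on watershed attributes, then predict at an ungauged site from its attributes alone.

```python
import numpy as np

rng = np.random.default_rng(6)
n_sites = 72

# Hypothetical watershed attributes: % clay, precip intensity (mm/h), % forest.
clay = rng.uniform(5, 40, n_sites)
precip = rng.uniform(5, 30, n_sites)
forest = rng.uniform(10, 90, n_sites)

# Synthetic "true" log 10-year SSC quantile, log-linear in the attributes.
log_q10 = (2.0 + 0.04 * clay + 0.05 * precip - 0.01 * forest
           + rng.normal(0, 0.2, n_sites))

# Ordinary least squares on the log-transformed quantile.
X = np.column_stack([np.ones(n_sites), clay, precip, forest])
beta, *_ = np.linalg.lstsq(X, log_q10, rcond=None)

# Predict the quantile (mg/L) at an ungauged site from its attributes alone.
x_new = np.array([1.0, 25.0, 12.0, 60.0])
q10_pred = np.exp(x_new @ beta)
```

    The study's finding is that fitting such a regression separately within each delineated region outperforms a single global fit over all 72 sites.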

  2. Control of polymer network topology in semi-batch systems

    NASA Astrophysics Data System (ADS)

    Wang, Rui; Olsen, Bradley; Johnson, Jeremiah

    Polymer networks invariably possess topological defects: loops of different orders. Since small loops (primary and secondary loops) both lower the modulus of the network and lead to stress concentration that causes material failure at low deformation, it is desirable to greatly reduce the loop fraction. We have shown that achieving a loop fraction close to zero is extremely difficult in a batch process, owing to the slow decay of the loop fraction with polymer concentration and chain length. Here, we develop a modified kinetic graph theory that can model network formation reactions in semi-batch systems. We demonstrate that the loop fraction is not sensitive to the feeding policy if the reaction volume remains constant during network formation. However, if we initially put a concentrated solution of small junction molecules in the reactor and continuously add polymer solution, the fractions of both primary loops and higher-order loops are significantly reduced. There is a limiting (nonzero) value of the loop fraction that can be achieved in the semi-batch system in the limit of extremely slow feeding. This minimum loop fraction depends only on a single dimensionless variable, the product of the concentration and the single-chain pervaded volume, and defines an operating zone in which the loop fraction of polymer networks can be controlled by adjusting the feeding rate of the semi-batch process.

  3. Falling head ponded infiltration in the nonlinear limit

    NASA Astrophysics Data System (ADS)

    Triadis, D.

    2014-12-01

    The Green and Ampt infiltration solution represents only an extreme example of behavior within a larger class of very nonlinear, delta function diffusivity soils. The mathematical analysis of these soils is greatly simplified by the existence of a sharp wetting front below the soil surface. Solutions for more realistic delta function soil models have recently been presented for infiltration under surface saturation without ponding. After general formulation of the problem, solutions for a full suite of delta function soils are derived for ponded surface water depleted by infiltration. Exact expressions for the cumulative infiltration as a function of time, or the drainage time as a function of the initial ponded depth may take implicit or parametric forms, and are supplemented by simple asymptotic expressions valid for small times, and small and large initial ponded depths. As with surface saturation without ponding, the Green-Ampt model overestimates the effect of the soil hydraulic conductivity. At the opposing extreme, a low-conductivity model is identified that also takes a very simple mathematical form and appears to be more accurate than the Green-Ampt model for larger ponded depths. Between these two, the nonlinear limit of Gardner's soil is recommended as a physically valid first approximation. Relative discrepancies between different soil models are observed to reach a maximum for intermediate values of the dimensionless initial ponded depth, and in general are smaller than for surface saturation without ponding.
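    The Green-Ampt case referenced above gives cumulative infiltration only implicitly; for surface saturation without a ponded depth the classic relation is K t = F - S ln(1 + F/S), with S = ψ Δθ the suction-deficit product. A minimal numerical sketch (parameter values are illustrative, not from the paper):

```python
import numpy as np
from scipy.optimize import brentq

# Illustrative Green-Ampt parameters for a loam-like soil (assumed values).
K = 1.0       # saturated hydraulic conductivity (cm/h)
psi = 11.0    # wetting-front suction head (cm)
dtheta = 0.3  # moisture deficit (-)

S = psi * dtheta   # suction-deficit product (cm)

def cumulative_infiltration(t):
    """Solve K*t = F - S*ln(1 + F/S) for cumulative infiltration F (cm)."""
    g = lambda F: F - S * np.log(1.0 + F / S) - K * t
    return brentq(g, 1e-9, 1e3)   # g is monotone increasing in F

F1 = cumulative_infiltration(1.0)   # F after one hour
```

    The more general delta-function-soil solutions in the paper reduce to relations of this implicit or parametric type, which is why the same root-finding treatment applies across the suite of models.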

  4. Stationary and non-stationary extreme value modeling of extreme temperature in Malaysia

    NASA Astrophysics Data System (ADS)

    Hasan, Husna; Salleh, Nur Hanim Mohd; Kassim, Suraiya

    2014-09-01

    Extreme annual temperatures at eighteen stations in Malaysia are fitted to the Generalized Extreme Value distribution. Stationary and non-stationary models with trend are considered for each station, and the likelihood ratio test is used to determine the best-fitting model. Results show that three of the eighteen stations (Bayan Lepas, Labuan, and Subang) favor a model that is linear in the location parameter. A hierarchical cluster analysis is employed to investigate the existence of similar behavior among the stations. Three distinct clusters are found, one of which consists of the stations that favor the non-stationary model. T-year estimated return levels of the extreme temperatures are provided based on the chosen models.
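    The likelihood ratio test between the stationary model and the trend-in-location model can be sketched as follows. For brevity this uses the Gumbel special case (GEV with zero shape), whose likelihood is smooth and safe to optimize; the paper fits the full GEV, and all data and parameter values below are synthetic.

```python
import numpy as np
from scipy import stats
from scipy.optimize import minimize

rng = np.random.default_rng(7)
years = np.arange(40.0)
# Synthetic annual maximum temperatures (deg C) with a small linear trend.
x = 36.0 + 0.03 * years + rng.gumbel(0.0, 1.2, size=years.size)

def nll(params, trend):
    """Negative log-likelihood of the Gumbel model (GEV with zero shape)."""
    if trend:
        mu0, mu1, log_s = params
        mu = mu0 + mu1 * years      # location linear in time
    else:
        mu0, log_s = params
        mu = mu0                    # stationary location
    s = np.exp(log_s)
    z = (x - mu) / s
    return np.sum(np.log(s) + z + np.exp(-z))

fit0 = minimize(nll, x0=[x.mean(), 0.0], args=(False,), method="Nelder-Mead")
fit1 = minimize(nll, x0=[x.mean(), 0.0, 0.0], args=(True,), method="Nelder-Mead")

# Likelihood ratio statistic; chi-squared with 1 df under the stationary null.
lr = 2.0 * (fit0.fun - fit1.fun)
p_value = stats.chi2.sf(max(lr, 0.0), df=1)
```

    A small p_value favors the non-stationary model, which is the criterion by which the three Malaysian stations were singled out.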

  5. Cool North European summers and possible links to explosive volcanic eruptions

    NASA Astrophysics Data System (ADS)

    Jones, P. D.; Melvin, T. M.; Harpham, C.; Grudd, H.; Helama, S.

    2013-06-01

    Exactly dated tree-ring measurements such as ring width (TRW) and maximum latewood density (MXD), which are sensitive to summer temperatures, provide possible routes to investigate the occurrence of hemisphere-wide cool summers that might be linked to explosive tropical volcanic eruptions. These measurements provide much longer records than the instrumental period, enabling much longer periods to be assessed and offering the potential to examine much larger eruptions than those recorded over the last 200 years. This paper examines TRW evidence from Northern Fennoscandia extending over the last 7500 years, using two independently produced chronologies from northern Sweden and northern Finland. TRW is less responsive than MXD to cool summer temperatures, but MXD is only available for the last 2000 years. Additionally, looking at a relatively small region, compared to the Northern Hemisphere average, adds considerable noise. Progress in this area is likely to come from developing more millennial-long TRW series across the northern high latitudes, or from developing MXD series from the sub-fossil material that comprises most of the samples prior to the last 1000 years. The three years with the most extreme negative values for the region in the last 2000 years are 1601, 542, and 1837, although the last is not extreme in a long instrumental record for the region. The most extreme year of all was 330 BC. Of the 20 most extreme negative years, nine occurred during the AD years, with the remaining 11 occurring during the prior 5500 years.

  6. Gravitational Waves From the Kerr/CFT Correspondence

    NASA Astrophysics Data System (ADS)

    Porfyriadis, Achilleas

    Astronomical observation suggests the existence of near-extreme Kerr black holes in the sky. Properties of diffeomorphisms imply that dynamics of the near-horizon region of near-extreme Kerr are governed by an infinite-dimensional conformal symmetry. This symmetry may be exploited to analytically, rather than numerically, compute a variety of potentially observable processes. In this thesis we compute the gravitational radiation emitted by a small compact object that orbits in the near-horizon region and plunges into the horizon of a large rapidly rotating black hole. We study the holographically dual processes in the context of the Kerr/CFT correspondence and find our conformal field theory (CFT) computations in perfect agreement with the gravity results. We compute the radiation emitted by a particle on the innermost stable circular orbit (ISCO) of a rapidly spinning black hole. We confirm previous estimates of the overall scaling of the power radiated, but show that there are also small oscillations all the way to extremality. Furthermore, we reveal an intricate mode-by-mode structure in the flux to infinity, with only certain modes having the dominant scaling. The scaling of each mode is controlled by its conformal weight. Massive objects in adiabatic quasi-circular inspiral towards a near-extreme Kerr black hole quickly plunge into the horizon after passing the ISCO. The post-ISCO plunge trajectory is shown to be related by a conformal map to a circular orbit. Conformal symmetry of the near-horizon region is then used to compute analytically the gravitational radiation produced during the plunge phase. Most extreme-mass-ratio-inspirals of small compact objects into supermassive black holes end with a fast plunge from an eccentric last stable orbit. We use conformal transformations to analytically solve for the radiation emitted from various fast plunges into extreme and near-extreme Kerr black holes.

  7. On the identification of Dragon Kings among extreme-valued outliers

    NASA Astrophysics Data System (ADS)

    Riva, M.; Neuman, S. P.; Guadagnini, A.

    2013-07-01

    Extreme values of earth, environmental, ecological, physical, biological, financial and other variables often form outliers to heavy tails of empirical frequency distributions. Quite commonly such tails are approximated by stretched exponential, log-normal or power functions. Recently there has been an interest in distinguishing between extreme-valued outliers that belong to the parent population of most data in a sample and those that do not. The first type, called Gray Swans by Nassim Nicholas Taleb (often confused in the literature with Taleb's totally unknowable Black Swans), is drawn from a known distribution of the tails which can thus be extrapolated beyond the range of sampled values. However, the magnitudes and/or space-time locations of unsampled Gray Swans cannot be foretold. The second type of extreme-valued outliers, termed Dragon Kings by Didier Sornette, may in his view be sometimes predicted based on how other data in the sample behave. This intriguing prospect has recently motivated some authors to propose statistical tests capable of identifying Dragon Kings in a given random sample. Here we apply three such tests to log air permeability data measured on the faces of a Berea sandstone block and to synthetic data generated in a manner statistically consistent with these measurements. We interpret the measurements to be, and generate synthetic data that are, samples from α-stable sub-Gaussian random fields subordinated to truncated fractional Gaussian noise (tfGn). All these data have frequency distributions characterized by power-law tails with extreme-valued outliers about the tail edges.
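
    As a toy illustration of this kind of test (a simplified tail-outlier check in the spirit of Dragon-King detection, not the specific tests applied in the study), one can fit a power-law tail to all observations except the largest and ask how improbable the largest is under that fit; all numbers below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic heavy-tailed sample (Pareto, tail index alpha = 2) plus one
# injected "Dragon King" far beyond the range expected from the tail.
x_min = 1.0
sample = x_min * (1.0 - rng.random(500)) ** (-1.0 / 2.0)
sample = np.append(sample, 200.0)            # injected outlier

def dragon_king_pvalue(x, x_min):
    """p-value of the sample maximum under a Pareto tail fitted by MLE
    to all points except the maximum."""
    x = np.sort(x)
    body, x_max = x[:-1], x[-1]
    alpha = len(body) / np.sum(np.log(body / x_min))   # Hill-type tail-index MLE
    n = len(x)
    # P(max of n i.i.d. Pareto draws exceeds x_max)
    tail = (x_max / x_min) ** (-alpha)
    return 1.0 - (1.0 - tail) ** n, alpha

p, alpha = dragon_king_pvalue(sample, x_min)
print(f"fitted tail index alpha = {alpha:.2f}, p-value of maximum = {p:.4f}")
```

A small p-value suggests the maximum is not a Gray Swan drawn from the fitted tail, which is the intuition behind Dragon-King identification.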

  8. The Extreme Spin of the Black Hole in Cygnus X-1

    NASA Technical Reports Server (NTRS)

    Gou, Lijun; McClintock, Jeffrey E.; Reid, Mark J.; Orosz, Jerome A.; Steiner, James F.; Narayan, Ramesh; Xiang, Jingen; Remillard, Ronald A.; Arnaud, Keith A.; Davis, Shane W.

    2005-01-01

    The compact primary in the X-ray binary Cygnus X-1 was the first black hole to be established via dynamical observations. We have recently determined accurate values for its mass and distance, and for the orbital inclination angle of the binary. Building on these results, which are based on our favored (asynchronous) dynamical model, we have measured the radius of the inner edge of the black hole's accretion disk by fitting its thermal continuum spectrum to a fully relativistic model of a thin accretion disk. Assuming that the spin axis of the black hole is aligned with the orbital angular momentum vector, we have determined that Cygnus X-1 contains a near-extreme Kerr black hole with a spin parameter a* > 0.95 (3σ). For a less probable (synchronous) dynamical model, we find a* > 0.92 (3σ). In our analysis, we include the uncertainties in black hole mass, orbital inclination angle, and distance, and we also include the uncertainty in the calibration of the absolute flux via the Crab. These four sources of uncertainty totally dominate the error budget. The uncertainties introduced by the thin-disk model we employ are particularly small in this case given the extreme spin of the black hole and the disk's low luminosity.

  9. The Extreme Spin of the Black Hole in Cygnus X-1

    NASA Technical Reports Server (NTRS)

    Gou, Lijun; McClintock, Jeffrey E.; Reid, Mark J.; Orosz, Jerome A.; Steiner, James F.; Narayan, Ramesh; Xiang, Jingen; Remillard, Ronald A.; Arnaud, Keith A.; Davis, Shane W.

    2011-01-01

    The compact primary in the X-ray binary Cygnus X-1 was the first black hole to be established via dynamical observations. We have recently determined accurate values for its mass and distance, and for the orbital inclination angle of the binary. Building on these results, which are based on our favored (asynchronous) dynamical model, we have measured the radius of the inner edge of the black hole's accretion disk by fitting its thermal continuum spectrum to a fully relativistic model of a thin accretion disk. Assuming that the spin axis of the black hole is aligned with the orbital angular momentum vector, we have determined that Cygnus X-1 contains a near-extreme Kerr black hole with a spin parameter a* > 0.95 (3σ). For a less probable (synchronous) dynamical model, we find a* > 0.92 (3σ). In our analysis, we include the uncertainties in black hole mass, orbital inclination angle, and distance, and we also include the uncertainty in the calibration of the absolute flux via the Crab. These four sources of uncertainty totally dominate the error budget. The uncertainties introduced by the thin-disk model we employ are particularly small in this case given the extreme spin of the black hole and the disk's low luminosity.

  10. Extreme between-study homogeneity in meta-analyses could offer useful insights.

    PubMed

    Ioannidis, John P A; Trikalinos, Thomas A; Zintzaras, Elias

    2006-10-01

    Meta-analyses are routinely evaluated for the presence of large between-study heterogeneity. We examined whether it is also important to probe whether there is extreme between-study homogeneity. We used heterogeneity tests with left-sided statistical significance for inference and developed a Monte Carlo simulation test for testing extreme homogeneity in risk ratios across studies, using the empiric distribution of the summary risk ratio and heterogeneity statistic. A left-sided P=0.01 threshold was set for claiming extreme homogeneity to minimize type I error. Among 11,803 meta-analyses with binary contrasts from the Cochrane Library, 143 (1.21%) had left-sided P-value <0.01 for the asymptotic Q statistic and 1,004 (8.50%) had left-sided P-value <0.10. The frequency of extreme between-study homogeneity did not depend on the number of studies in the meta-analyses. We identified examples where extreme between-study homogeneity (left-sided P-value <0.01) could result from various possibilities beyond chance. These included inappropriate statistical inference (asymptotic vs. Monte Carlo), use of a specific effect metric, correlated data or stratification using strong predictors of outcome, and biases and potential fraud. Extreme between-study homogeneity may provide useful insights about a meta-analysis and its constituent studies.
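
    A minimal sketch of the left-sided test idea, using the asymptotic Q statistic rather than the paper's Monte Carlo procedure, with hypothetical 2x2 study counts:

```python
import numpy as np
from scipy import stats

# Hypothetical per-study counts: events/total in treatment and control arms.
events_t = np.array([10, 12, 9, 11, 10])
total_t  = np.array([100, 110, 95, 105, 100])
events_c = np.array([20, 22, 19, 21, 20])
total_c  = np.array([100, 110, 95, 105, 100])

# Log risk ratios and their approximate variances.
log_rr = np.log((events_t / total_t) / (events_c / total_c))
var = (1/events_t - 1/total_t) + (1/events_c - 1/total_c)

# Cochran's Q against the fixed-effect summary estimate.
w = 1.0 / var
summary = np.sum(w * log_rr) / np.sum(w)
Q = np.sum(w * (log_rr - summary) ** 2)

# Left-sided p-value: a suspiciously SMALL Q (studies more alike than
# chance allows) gives a small left-sided p, flagging extreme homogeneity.
p_left = stats.chi2.cdf(Q, df=len(log_rr) - 1)
print(f"Q = {Q:.3f}, left-sided p = {p_left:.4f}")
```

Here the five studies have nearly identical risk ratios, so the left-sided p-value falls below the 0.01 threshold used in the paper.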

  11. Extreme events in total ozone over the Northern mid-latitudes: an analysis based on long-term data sets from five European ground-based stations

    NASA Astrophysics Data System (ADS)

    Rieder, Harald E.; Jancso, Leonhardt M.; Rocco, Stefania Di; Staehelin, Johannes; Maeder, Joerg A.; Peter, Thomas; Ribatet, Mathieu; Davison, Anthony C.; de Backer, Hugo; Koehler, Ulf; Krzyścin, Janusz; Vaníček, Karel

    2011-11-01

    We apply methods from extreme value theory to identify extreme events in high (termed EHOs) and low (termed ELOs) total ozone and to describe the distribution tails (i.e. very high and very low values) of five long-term European ground-based total ozone time series. The influence of these extreme events on observed mean values, long-term trends and changes is analysed. The results show a decrease in EHOs and an increase in ELOs during the last decades, and establish that the observed downward trend in column ozone during the 1970-1990s is strongly dominated by changes in the frequency of extreme events. Furthermore, it is shown that clear ‘fingerprints’ of atmospheric dynamics (NAO, ENSO) and chemistry [ozone depleting substances (ODSs), polar vortex ozone loss] can be found in the frequency distribution of ozone extremes, even if no attribution is possible from standard metrics (e.g. annual mean values). The analysis complements earlier analysis for the world's longest total ozone record at Arosa, Switzerland, confirming and revealing the strong influence of atmospheric dynamics on observed ozone changes. The results provide clear evidence that in addition to ODS, volcanic eruptions and strong/moderate ENSO and NAO events had significant influence on column ozone in the European sector.

  12. Hodge Decomposition of Information Flow on Small-World Networks.

    PubMed

    Haruna, Taichi; Fujiki, Yuuya

    2016-01-01

    We investigate the influence of the small-world topology on the composition of information flow on networks. By appealing to the combinatorial Hodge theory, we decompose information flow generated by random threshold networks on the Watts-Strogatz model into three components: gradient, harmonic and curl flows. The harmonic and curl flows represent globally circular and locally circular components, respectively. The Watts-Strogatz model bridges the two extreme network topologies, a lattice network and a random network, by a single parameter that is the probability of random rewiring. The small-world topology is realized within a certain range between them. By numerical simulation we found that as networks become more random the ratio of harmonic flow to the total magnitude of information flow increases whereas the ratio of curl flow decreases. Furthermore, both quantities are significantly enhanced from the level when only network structure is considered for the network close to a random network and a lattice network, respectively. Finally, the sum of these two ratios takes its maximum value within the small-world region. These findings suggest that the dynamical information counterpart of global integration and that of local segregation are the harmonic flow and the curl flow, respectively, and that a part of the small-world region is dominated by internal circulation of information flow.
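
    The decomposition itself can be sketched as orthogonal projections in edge space: the gradient part lies in the image of the node-edge incidence matrix, the curl part in the span of triangle cycles, and the harmonic part is the remainder. A minimal numpy sketch on a small hand-rolled Watts-Strogatz ring (with the random threshold dynamics of the paper replaced by a random edge flow for brevity):

```python
import numpy as np
import random

random.seed(1)
rng = np.random.default_rng(1)

# Small Watts-Strogatz ring: n nodes, each joined to its k nearest
# neighbours on one side, then each edge rewired with probability p.
n, k, p = 12, 2, 0.2
edges = set()
for u in range(n):
    for d in range(1, k + 1):
        v = (u + d) % n
        if random.random() < p:                       # rewire this edge
            v = random.choice([w for w in range(n) if w != u])
        edges.add((min(u, v), max(u, v)))
edges = sorted(edges)
m = len(edges)

# Node-edge incidence matrix B (rows = edges, oriented low -> high node).
B = np.zeros((m, n))
for i, (u, v) in enumerate(edges):
    B[i, u], B[i, v] = -1.0, 1.0

# Curl matrix C (rows = triangles): cycle a->b->c->a in edge coordinates.
idx = {e: i for i, e in enumerate(edges)}
tris = [(a, b, c) for a in range(n) for b in range(a+1, n)
        for c in range(b+1, n)
        if (a, b) in idx and (b, c) in idx and (a, c) in idx]
C = np.zeros((len(tris), m))
for t, (a, b, c) in enumerate(tris):
    C[t, idx[(a, b)]], C[t, idx[(b, c)]], C[t, idx[(a, c)]] = 1.0, 1.0, -1.0

# Random edge flow, split into gradient + curl + harmonic components.
f = rng.standard_normal(m)
phi, *_ = np.linalg.lstsq(B, f, rcond=None)           # node potentials
f_grad = B @ phi                                      # projection onto im(B)
if tris:
    psi, *_ = np.linalg.lstsq(C.T, f, rcond=None)
    f_curl = C.T @ psi                                # projection onto im(C^T)
else:
    f_curl = np.zeros(m)
f_harm = f - f_grad - f_curl                          # harmonic remainder

print("||f||^2      =", np.sum(f**2))
print("sum of parts =", np.sum(f_grad**2) + np.sum(f_curl**2) + np.sum(f_harm**2))
```

Because every triangle cycle of a gradient flow sums to zero, the three components are mutually orthogonal and their squared norms add up to that of the original flow.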

  13. Non-negative infrared patch-image model: Robust target-background separation via partial sum minimization of singular values

    NASA Astrophysics Data System (ADS)

    Dai, Yimian; Wu, Yiquan; Song, Yu; Guo, Jun

    2017-03-01

    To further enhance the small targets and suppress the heavy clutters simultaneously, a robust non-negative infrared patch-image model via partial sum minimization of singular values is proposed. First, the intrinsic reason behind the undesirable performance of the state-of-the-art infrared patch-image (IPI) model when facing extremely complex backgrounds is analyzed. We point out that it lies in the mismatch between the IPI model's implicit assumption of a large number of observations and the reality of deficient observations of strong edges. To fix this problem, instead of the nuclear norm, we adopt the partial sum of singular values to constrain the low-rank background patch-image, which could provide a more accurate background estimation and almost eliminate all the salient residuals in the decomposed target image. In addition, considering the fact that the infrared small target is always brighter than its adjacent background, we propose an additional non-negative constraint on the sparse target patch-image, which could not only further suppress undesirable components but also accelerate the convergence rate. Finally, an algorithm based on the inexact augmented Lagrange multiplier method is developed to solve the proposed model. A large number of experiments are conducted, demonstrating that the proposed model has a significant improvement over the other nine competitive methods in terms of both clutter-suppression performance and convergence rate.
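
    The key operator, shrinking only the trailing singular values while leaving the r largest untouched, can be sketched as follows (a generic illustration of partial-singular-value shrinkage on synthetic data, not the paper's full IPI pipeline):

```python
import numpy as np

def pssv_shrink(M, r, tau):
    """Partial-singular-value shrinkage: keep the r largest singular
    values intact and soft-threshold the remainder by tau."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    s_shrunk = s.copy()
    s_shrunk[r:] = np.maximum(s[r:] - tau, 0.0)
    return U @ np.diag(s_shrunk) @ Vt

rng = np.random.default_rng(0)
# Rank-2 "background" patch-image plus two small bright "targets".
A = rng.standard_normal((40, 2)) @ rng.standard_normal((2, 50))
S = np.zeros((40, 50))
S[5, 7] = S[20, 30] = 8.0
D = A + S

B = pssv_shrink(D, r=2, tau=5.0)   # low-rank background estimate
residual = D - B                   # carries most of the sparse target energy
print("leading singular values of estimate:",
      np.round(np.linalg.svd(B, compute_uv=False)[:4], 2))
```

Unlike nuclear-norm shrinkage, which biases all singular values downward, this operator leaves the dominant background directions unpenalized.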

  14. Compilation of 1986 annual reports of the Navy ELF (Extremely Low Frequency) communications system ecological monitoring program, volume 2

    NASA Astrophysics Data System (ADS)

    1987-07-01

    The U.S. Navy is conducting a long-term program to monitor for possible effects from the operation of its Extremely Low Frequency (ELF) Communications System to resident biota and their ecological relationships. This report documents progress of the following studies: soil amoeba; soil and litter arthropoda and earthworm studies; biological studies on pollinating insects: megachilid bees; and small vertebrates: small mammals and nesting birds.

  15. Extreme weather exposure identification for road networks - a comparative assessment of statistical methods

    NASA Astrophysics Data System (ADS)

    Schlögl, Matthias; Laaha, Gregor

    2017-04-01

    The assessment of road infrastructure exposure to extreme weather events is of major importance for scientists and practitioners alike. In this study, we compare the different extreme value approaches and fitting methods with respect to their value for assessing the exposure of transport networks to extreme precipitation and temperature impacts. Based on an Austrian data set from 25 meteorological stations representing diverse meteorological conditions, we assess the added value of partial duration series (PDS) over the standardly used annual maxima series (AMS) in order to give recommendations for performing extreme value statistics of meteorological hazards. Results show the merits of the robust L-moment estimation, which yielded better results than maximum likelihood estimation in 62 % of all cases. At the same time, results question the general assumption of the threshold excess approach (employing PDS) being superior to the block maxima approach (employing AMS) due to information gain. For low return periods (non-extreme events) the PDS approach tends to overestimate return levels as compared to the AMS approach, whereas an opposite behavior was found for high return levels (extreme events). In extreme cases, an inappropriate threshold was shown to lead to considerable biases that may outperform the possible gain of information from including additional extreme events by far. This effect was visible from neither the square-root criterion nor standardly used graphical diagnosis (mean residual life plot) but rather from a direct comparison of AMS and PDS in combined quantile plots. We therefore recommend performing AMS and PDS approaches simultaneously in order to select the best-suited approach. This will make the analyses more robust, not only in cases where threshold selection and dependency introduces biases to the PDS approach but also in cases where the AMS contains non-extreme events that may introduce similar biases. 
For assessing the performance of extreme events we recommend the use of conditional performance measures that focus on rare events only in addition to standardly used unconditional indicators. The findings of the study directly address road and traffic management but can be transferred to a range of other environmental variables including meteorological and hydrological quantities.
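
    The AMS/PDS comparison the authors recommend can be sketched with standard distribution fits: a GEV on annual maxima versus a generalized Pareto on threshold excesses, each yielding a 100-year return level (synthetic data and maximum-likelihood fits here; the study's station data and L-moment estimation are not reproduced):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# 50 "years" of 365 synthetic daily precipitation values.
daily = rng.gamma(shape=0.6, scale=8.0, size=(50, 365))

# AMS approach: fit a GEV to the annual maxima series.
ams = daily.max(axis=1)
c_gev, loc_g, scale_g = stats.genextreme.fit(ams)
rl100_ams = stats.genextreme.ppf(1 - 1/100, c_gev, loc_g, scale_g)

# PDS approach: fit a GPD to excesses over a high threshold.
u = np.quantile(daily, 0.995)
excess = daily[daily > u] - u
c_gpd, loc_p, scale_p = stats.genpareto.fit(excess, floc=0.0)
lam = excess.size / 50.0                    # mean exceedances per year
T = 100
# Value exceeded on average once per T years.
rl100_pds = u + stats.genpareto.ppf(1 - 1/(lam * T), c_gpd, loc_p, scale_p)

print(f"100-year return level  AMS: {rl100_ams:.1f}  PDS: {rl100_pds:.1f}")
```

Plotting both fits in a combined quantile plot, as the authors suggest, makes threshold-induced biases in the PDS fit visible against the AMS benchmark.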

  16. Effect of size on bulk and surface cohesion energy of metallic nano-particles

    NASA Astrophysics Data System (ADS)

    Yaghmaee, M. S.; Shokri, B.

    2007-04-01

    The knowledge of nano-material properties not only helps us to understand the extreme behaviour of small-scale materials better (expected to be different from what we observe from their bulk value) but also helps us to analyse and design new advanced functionalized materials through different nano technologies. Among these fundamental properties, the cohesion (binding) energy mainly describes most behaviours of materials in different environments. In this work, we discuss this fundamental property through a nano-thermodynamical approach using two algorithms, where in the first approach the size dependence of the inner (bulk) cohesion energy is studied, and in the second approach the surface cohesion energy is considered too. The results, which are presented through a computational demonstration (for four different metals: Al, Ga, W and Ag), can be compared with some experimental values for W metallic nano-particles.

  17. On the relationship between cumulative correlation coefficients and the quality of crystallographic data sets.

    PubMed

    Wang, Jimin; Brudvig, Gary W; Batista, Victor S; Moore, Peter B

    2017-12-01

    In 2012, Karplus and Diederichs demonstrated that the Pearson correlation coefficient CC1/2 is a far better indicator of the quality and resolution of crystallographic data sets than more traditional measures like merging R-factor or signal-to-noise ratio. More specifically, they proposed that CC1/2 be computed for data sets in thin shells of increasing resolution so that the resolution dependence of that quantity can be examined. Recently, however, the CC1/2 values of entire data sets, i.e., cumulative correlation coefficients, have been used as a measure of data quality. Here, we show that the difference in cumulative CC1/2 value between a data set that has been accurately measured and a data set that has not is likely to be small. Furthermore, structures obtained by molecular replacement from poorly measured data sets are likely to suffer from extreme model bias. © 2017 The Protein Society.
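
    The split-half idea behind CC1/2 can be illustrated in a few lines: randomly divide the repeated observations of each reflection into two halves and correlate the half-set means (a schematic sketch with synthetic intensities, not the implementation of any crystallographic package):

```python
import numpy as np

rng = np.random.default_rng(3)

def cc_half(measurements):
    """CC1/2: randomly split the repeated measurements of each
    reflection into two halves and correlate the half-set means."""
    n_refl, n_obs = measurements.shape
    half1, half2 = [], []
    for row in measurements:
        perm = rng.permutation(n_obs)
        half1.append(row[perm[:n_obs // 2]].mean())
        half2.append(row[perm[n_obs // 2:]].mean())
    return np.corrcoef(half1, half2)[0, 1]

# Synthetic "reflections": true intensities plus per-observation noise.
true_I = rng.gamma(2.0, 50.0, size=400)
strong = true_I[:, None] + rng.normal(0, 10.0, (400, 8))    # well measured
weak   = true_I[:, None] + rng.normal(0, 200.0, (400, 8))   # poorly measured

cc_s = cc_half(strong)
cc_w = cc_half(weak)
print(f"CC1/2 strong data: {cc_s:.3f}")
print(f"CC1/2 weak data:   {cc_w:.3f}")
```

Computed cumulatively over a whole data set, the strong low-resolution reflections dominate this correlation, which is why the cumulative value discriminates poorly between well and badly measured data.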

  18. Determining return water levels at ungauged coastal sites: a case study for northern Germany

    NASA Astrophysics Data System (ADS)

    Arns, Arne; Wahl, Thomas; Haigh, Ivan D.; Jensen, Jürgen

    2015-04-01

    We estimate return periods and levels of extreme still water levels for the highly vulnerable and historically and culturally important small marsh islands known as the Halligen, located in the Wadden Sea offshore of the coast of northern Germany. This is a challenging task as only few water level records are available for this region, and they are currently too short to apply traditional extreme value analysis methods. Therefore, we use the Regional Frequency Analysis (RFA) approach. This originates from hydrology but has been used before in several coastal studies and is also currently applied by the local federal administration responsible for coastal protection in the study area. The RFA enables us to indirectly estimate return levels by transferring hydrological information from gauged to related ungauged sites. Our analyses highlight that this methodology has some drawbacks and may over- or underestimate return levels compared to direct analyses using station data. To overcome these issues, we present an alternative approach, combining numerical and statistical models. First, we produced a numerical multidecadal model hindcast of water levels for the entire North Sea. Predicted water levels from the hindcast are bias corrected using the information from the available tide gauge records. Hence, the simulated water levels agree well with the measured water levels at gauged sites. The bias correction is then interpolated spatially to obtain correction functions for the simulated water levels at each coastal and island model grid point in the study area. Using a recommended procedure to conduct extreme value analyses from a companion study, return water levels suitable for coastal infrastructure design are estimated continuously along the entire coastline of the study area, including the offshore islands. A similar methodology can be applied in other regions of the world where tide gauge observations are sparse.

  19. Introduction of a Novel Loss Data Normalization Method for Improved Estimation of Extreme Losses from Natural Catastrophes

    NASA Astrophysics Data System (ADS)

    Eichner, J. F.; Steuer, M.; Loew, P.

    2016-12-01

    Past natural catastrophes offer valuable information for present-day risk assessment. To make use of historic loss data one has to find a setting that enables comparison (over place and time) of historic events happening under today's conditions. By means of loss data normalization the influence of socio-economic development, as the fundamental driver in this context, can be eliminated and the data gives way to the deduction of risk-relevant information and allows the study of other driving factors such as influences from climate variability and climate change or changes of vulnerability. Munich Re's NatCatSERVICE database includes for each historic loss event the geographic coordinates of all locations and regions that were affected in a relevant way. These locations form the basis for what is known as the loss footprint of an event. Here we introduce a state of the art and robust method for global loss data normalization. The presented peril-specific loss footprint normalization method adjusts direct economic loss data to the influence of economic growth within each loss footprint (by using gross cell product data as proxy for local economic growth) and makes loss data comparable over time. To achieve a comparative setting for supra-regional economic differences, we categorize the normalized loss values (together with information on fatalities) based on the World Bank income groups into five catastrophe classes, from minor to catastrophic. The data treated in such way allows (a) for studying the influence of improved reporting of small scale loss events over time and (b) for application of standard (stationary) extreme value statistics (here: peaks over threshold method) to compile estimates for extreme and extrapolated loss magnitudes such as a "100 year event" on global scale. Examples of such results will be shown.
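
    The footprint normalization reduces to a simple rescaling: multiply each reported loss by the ratio of present-day to event-year gross cell product inside the footprint. A sketch with entirely hypothetical numbers:

```python
# Hypothetical illustration of footprint-based loss normalization:
# scale each historical loss by the growth of gross cell product (GCP)
# inside the event footprint between the event year and the reference year.

events = [
    # (year, reported loss USD m, footprint GCP then, footprint GCP 2016)
    (1980,  500.0, 20_000.0, 55_000.0),
    (1995, 1200.0, 35_000.0, 60_000.0),
    (2010,  900.0, 48_000.0, 52_000.0),
]

normalized = [loss * gcp_now / gcp_then
              for _, loss, gcp_then, gcp_now in events]

for (year, loss, *_), norm in zip(events, normalized):
    print(f"{year}: reported {loss:,.0f} -> normalized {norm:,.0f} (USD m, 2016)")
```

With socio-economic growth removed in this way, the normalized series can be fed into stationary peaks-over-threshold statistics as described above.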

  20. Polygenic determinants in extremes of high-density lipoprotein cholesterol[S

    PubMed Central

    Dron, Jacqueline S.; Wang, Jian; Low-Kam, Cécile; Khetarpal, Sumeet A.; Robinson, John F.; McIntyre, Adam D.; Ban, Matthew R.; Cao, Henian; Rhainds, David; Dubé, Marie-Pierre; Rader, Daniel J.; Lettre, Guillaume; Tardif, Jean-Claude

    2017-01-01

    HDL cholesterol (HDL-C) remains a superior biochemical predictor of CVD risk, but its genetic basis is incompletely defined. In patients with extreme HDL-C concentrations, we concurrently evaluated the contributions of multiple large- and small-effect genetic variants. In a discovery cohort of 255 unrelated lipid clinic patients with extreme HDL-C levels, we used a targeted next-generation sequencing panel to evaluate rare variants in known HDL metabolism genes, simultaneously with common variants bundled into a polygenic trait score. Two additional cohorts were used for validation and included 1,746 individuals from the Montréal Heart Institute Biobank and 1,048 individuals from the University of Pennsylvania. Findings were consistent between cohorts: we found rare heterozygous large-effect variants in 18.7% and 10.9% of low- and high-HDL-C patients, respectively. We also found common variant accumulation, indicated by extreme polygenic trait scores, in an additional 12.8% and 19.3% of overall cases of low- and high-HDL-C extremes, respectively. Thus, the genetic basis of extreme HDL-C concentrations encountered clinically is frequently polygenic, with contributions from both rare large-effect and common small-effect variants. Multiple types of genetic variants should be considered as contributing factors in patients with extreme dyslipidemia. PMID:28870971

  1. Polygenic determinants in extremes of high-density lipoprotein cholesterol.

    PubMed

    Dron, Jacqueline S; Wang, Jian; Low-Kam, Cécile; Khetarpal, Sumeet A; Robinson, John F; McIntyre, Adam D; Ban, Matthew R; Cao, Henian; Rhainds, David; Dubé, Marie-Pierre; Rader, Daniel J; Lettre, Guillaume; Tardif, Jean-Claude; Hegele, Robert A

    2017-11-01

    HDL cholesterol (HDL-C) remains a superior biochemical predictor of CVD risk, but its genetic basis is incompletely defined. In patients with extreme HDL-C concentrations, we concurrently evaluated the contributions of multiple large- and small-effect genetic variants. In a discovery cohort of 255 unrelated lipid clinic patients with extreme HDL-C levels, we used a targeted next-generation sequencing panel to evaluate rare variants in known HDL metabolism genes, simultaneously with common variants bundled into a polygenic trait score. Two additional cohorts were used for validation and included 1,746 individuals from the Montréal Heart Institute Biobank and 1,048 individuals from the University of Pennsylvania. Findings were consistent between cohorts: we found rare heterozygous large-effect variants in 18.7% and 10.9% of low- and high-HDL-C patients, respectively. We also found common variant accumulation, indicated by extreme polygenic trait scores, in an additional 12.8% and 19.3% of overall cases of low- and high-HDL-C extremes, respectively. Thus, the genetic basis of extreme HDL-C concentrations encountered clinically is frequently polygenic, with contributions from both rare large-effect and common small-effect variants. Multiple types of genetic variants should be considered as contributing factors in patients with extreme dyslipidemia. Copyright © 2017 by the American Society for Biochemistry and Molecular Biology, Inc.
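
    The polygenic side of such an analysis can be sketched as a weighted sum of risk-allele counts compared against a reference distribution, with an "extreme" score defined as one falling in the tails (hypothetical weights and genotypes, not the study's actual trait score):

```python
import numpy as np

rng = np.random.default_rng(11)

# Hypothetical polygenic trait score: weighted sum of risk-allele counts
# over common variants, compared against a reference distribution.
n_variants = 20
weights = rng.normal(0.0, 0.1, n_variants)               # per-allele effects
ref_genotypes = rng.integers(0, 3, (5000, n_variants))   # 0/1/2 allele counts
ref_scores = ref_genotypes @ weights

def is_extreme(genotype, lower_pct=10, upper_pct=90):
    """Flag a polygenic score in the extreme tails of the reference."""
    score = genotype @ weights
    lo, hi = np.percentile(ref_scores, [lower_pct, upper_pct])
    return score < lo or score > hi

patient = rng.integers(0, 3, n_variants)
print("extreme polygenic score:", is_extreme(patient))
```

By construction roughly 20% of reference individuals are flagged with these cutoffs, mirroring how an accumulation of common small-effect alleles can itself place a patient in an extreme-HDL-C category.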

  2. Analysis of extreme precipitation characteristics in low mountain areas based on three-dimensional copulas—taking Kuandian County as an example

    NASA Astrophysics Data System (ADS)

    Wang, Cailin; Ren, Xuehui; Li, Ying

    2017-04-01

    We defined the threshold of extreme precipitation using detrended fluctuation analysis based on daily precipitation during 1955-2013 in Kuandian County, Liaoning Province. Three-dimensional copulas were introduced to analyze the characteristics of four extreme precipitation factors: the annual extreme precipitation day, extreme precipitation amount, annual average extreme precipitation intensity, and extreme precipitation rate of contribution. The results show that (1) the threshold is 95.0 mm, extreme precipitation events generally occur 1-2 times a year, the average extreme precipitation intensity is 100-150 mm, and the extreme precipitation amount is 100-270 mm, accounting for 10 to 37 % of annual precipitation. (2) The generalized extreme value distribution, extreme value distribution, and generalized Pareto distribution are suitable for fitting the distribution function for each element of extreme precipitation. The Ali-Mikhail-Haq (AMH) copula function reflects the joint characteristics of the extreme precipitation factors. (3) The return periods of the three types show significant synchronicity, and the joint return period and co-occurrence return period show a long delay when the return period of the single factor is long. This reflects the inseparability of the extreme precipitation factors. The co-occurrence return period is longer than that of the single factor and the joint return period. (4) Single-factor fitting only reflects single-factor information of extreme precipitation and is unrelated to the relationship between factors. Three-dimensional copulas represent the internal information of the extreme precipitation factors and are closer to the actual situation. The copula function is potentially widely applicable for the multiple factors of extreme precipitation.
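
    The AMH copula used here has a closed Archimedean form, so joint probabilities and an "OR" joint return period can be computed directly (hypothetical marginal probabilities and dependence parameter, shown only to illustrate the construction):

```python
import numpy as np

def amh_copula(u, theta):
    """d-variate Ali-Mikhail-Haq (Archimedean) copula for theta in [0, 1),
    built from the generator phi(t) = ln[(1 - theta*(1 - t)) / t]."""
    u = np.asarray(u, dtype=float)
    s = np.sum(np.log((1.0 - theta * (1.0 - u)) / u))
    return (1.0 - theta) / (np.exp(s) - theta)

u20 = 1 - 1/20   # non-exceedance probability of a 20-year univariate level
theta = 0.6      # hypothetical dependence parameter
C = amh_copula([u20, u20, u20], theta)

# "OR" joint return period: at least one of the three factors exceeds
# its own 20-year level.
T_or = 1.0 / (1.0 - C)
print(f"C(u,u,u) = {C:.4f}, joint 'OR' return period = {T_or:.2f} years")
```

As the abstract notes, the "OR" joint return period is shorter than the univariate one, while the "AND" (co-occurrence) return period, obtainable from the survival copula, is longer.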

  3. Extreme Value Theory and the New Sunspot Number Series

    NASA Astrophysics Data System (ADS)

    Acero, F. J.; Carrasco, V. M. S.; Gallego, M. C.; García, J. A.; Vaquero, J. M.

    2017-04-01

    Extreme value theory was employed to study solar activity using the new sunspot number index. The block maxima approach was used at yearly (1700-2015), monthly (1749-2016), and daily (1818-2016) scales, selecting the maximum sunspot number value for each solar cycle, and the peaks-over-threshold (POT) technique was used after a declustering process only for the daily data. Both techniques led to negative values for the shape parameters. This implies that the extreme sunspot number value distribution has an upper bound. The return level (RL) values obtained from the POT approach were greater than when using the block maxima technique. Regarding the POT approach, the 110 year (550 and 1100 year) RLs were lower (higher) than the daily maximum observed sunspot number value of 528. Furthermore, according to the block maxima approach, the 10-cycle RL lay within the block maxima daily sunspot number range, as expected, but it was striking that the 50- and 100-cycle RLs were also within that range. Thus, it would seem that the RL is reaching a plateau, and, although one must be cautious, it would be difficult to attain sunspot number values greater than 550. The extreme value trends from the four series (yearly, monthly, and daily maxima per solar cycle, and POT after declustering the daily data) were analyzed with the Mann-Kendall test and Sen’s method. Only the negative trend of the daily data with the POT technique was statistically significant.
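
    The bounded-tail conclusion follows from the sign of the fitted shape parameter: in scipy's convention, c > 0 (EVT shape ξ = -c < 0) gives a finite upper endpoint at loc + scale/c. A sketch on synthetic cycle maxima (the actual sunspot series is not reproduced here):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
# Synthetic "cycle maxima": 30 draws from a bounded (Weibull-type) GEV.
true_c, true_loc, true_scale = 0.4, 250.0, 60.0
maxima = stats.genextreme.rvs(true_c, true_loc, true_scale,
                              size=30, random_state=rng)

c, loc, scale = stats.genextreme.fit(maxima)
# Finite upper endpoint of the fitted distribution when c > 0.
upper_bound = loc + scale / c if c > 0 else np.inf

# 10-cycle return level: level exceeded once per 10 cycles on average.
rl10 = stats.genextreme.ppf(1 - 1/10, c, loc, scale)
print(f"fitted shape c = {c:.2f}, upper endpoint = {upper_bound:.0f}")
print(f"10-cycle return level = {rl10:.0f}")
```

Longer-horizon return levels computed this way approach the fitted endpoint, which is the plateau behaviour described for the sunspot series.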

  4. GPS FOM Chimney Analysis using Generalized Extreme Value Distribution

    NASA Technical Reports Server (NTRS)

    Ott, Rick; Frisbee, Joe; Saha, Kanan

    2004-01-01

    Many a time an objective of a statistical analysis is to estimate a limit value, such as a 3-sigma 95% confidence upper limit, from a data sample. The generalized extreme value (GEV) distribution method can be profitably employed in many situations for such an estimate. It is well known that, according to the central limit theorem, the mean value of a large data set is normally distributed irrespective of the distribution of the data from which the mean value is derived. In a somewhat similar fashion, it is observed that the extreme value of a data set often has a distribution that can be formulated with a generalized extreme value distribution. In space shuttle entry with 3-string GPS navigation, the Figure Of Merit (FOM) value gives a measure of GPS navigated state accuracy. A GPS navigated state with FOM of 6 or higher is deemed unacceptable and is said to form a FOM 6 or higher chimney. A FOM chimney is a period of time during which the FOM value stays higher than 5. A longer period of FOM value 6 or higher causes the navigated state to accumulate more error for lack of state updates. For an acceptable landing it is imperative that the state error remains low, and hence at low altitude during entry GPS data of FOM greater than 5 must not last more than 138 seconds. To test GPS performance, many entry test cases were simulated at the Avionics Development Laboratory. Only high-value FOM chimneys are consequential. The extreme value statistical technique is applied to analyze high-value FOM chimneys. The maximum likelihood method is used to determine parameters that characterize the GEV distribution, and then the limit value statistics are estimated.
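
    The approach can be sketched by fitting a GEV (by maximum likelihood) to per-run worst chimney durations and reading off tail probabilities and upper limits; the durations below are synthetic, since the simulation data are not given in the record:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
# Hypothetical: longest FOM > 5 chimney duration (seconds) from each of
# 200 simulated entry runs, drawn here from a Gumbel distribution.
durations = stats.gumbel_r.rvs(loc=60.0, scale=20.0,
                               size=200, random_state=rng)

# Maximum-likelihood GEV fit to the per-run worst chimney durations.
c, loc, scale = stats.genextreme.fit(durations)

# Probability that a run's worst chimney exceeds the 138-second limit,
# and a 95% upper limit on the worst-case chimney duration.
p_exceed = stats.genextreme.sf(138.0, c, loc, scale)
limit95 = stats.genextreme.ppf(0.95, c, loc, scale)
print(f"P(worst chimney > 138 s) = {p_exceed:.4f}, 95% limit = {limit95:.1f} s")
```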

  5. Nonparametric Regression Subject to a Given Number of Local Extreme Values

    DTIC Science & Technology

    2001-07-01

    Compilation report ADP013708 through ADP013761 (unclassified). "Nonparametric regression subject to a given number of local extreme values," by Ali Majidi and Laurie Davies. Excerpts concern fixing the locations of the local extremes for the smoothing algorithm, making the smoothing problem precise, the solution of problem QP3 as k → ∞, and a four-panel figure showing results of the procedure.

  6. Persistence Mapping Using EUV Solar Imager Data

    NASA Technical Reports Server (NTRS)

    Thompson, B. J.; Young, C. A.

    2016-01-01

    We describe a simple image processing technique that is useful for the visualization and depiction of gradually evolving or intermittent structures in solar physics extreme-ultraviolet imagery. The technique is an application of image segmentation, which we call "Persistence Mapping," to isolate extreme values in a data set, and is particularly useful for the problem of capturing phenomena that are evolving in both space and time. While integration or "time-lapse" imaging uses the full sample (of size N ), Persistence Mapping rejects (N - 1)/N of the data set and identifies the most relevant 1/N values using the following rule: if a pixel reaches an extreme value, it retains that value until that value is exceeded. The simplest examples isolate minima and maxima, but any quantile or statistic can be used. This paper demonstrates how the technique has been used to extract the dynamics in long-term evolution of comet tails, erupting material, and EUV dimming regions.
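
    The rule "a pixel keeps its extreme value until that value is exceeded" is exactly a running maximum along the time axis, e.g. numpy's maximum.accumulate (a generic sketch on synthetic frames, not the authors' EUV pipeline):

```python
import numpy as np

rng = np.random.default_rng(2)
# A synthetic image sequence: background noise plus a bright feature that
# moves one column to the right in each frame.
frames = rng.normal(0.0, 0.1, size=(8, 16, 16))
for t in range(8):
    frames[t, 5, 3 + t] = 5.0        # transient bright feature

# Persistence Map: each pixel keeps the maximum value seen so far, so the
# moving feature leaves a continuous track instead of one dot per frame.
persistence = np.maximum.accumulate(frames, axis=0)

track = persistence[-1, 5, 3:11]     # final map along the feature's path
print("feature track values:", np.round(track, 1))
```

Replacing `maximum` with `minimum` isolates persistent dimmings instead of brightenings, and any other quantile can be tracked with a custom update rule.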

  7. Research in Stochastic Processes.

    DTIC Science & Technology

    1982-12-01

    constant high-level boundary. Fragments of this report reference: Jürg Hüsler, "Extreme values of non-stationary sequences and the extremal index," Center for Stochastic... ; A. Weron, Oct. 82; "Extreme values of non-stationary sequences and the extremal index," Jürg Hüsler, Oct. 82; "A finitely additive white noise..." ; a string model, Y. Miyahara, Carleton University and Nagoya University; and "On extreme values of non-stationary sequences," J. Hüsler, University of...

  8. More tornadoes in the most extreme U.S. tornado outbreaks

    NASA Astrophysics Data System (ADS)

    Tippett, Michael K.; Lepore, Chiara; Cohen, Joel E.

    2016-12-01

    Tornadoes and severe thunderstorms kill people and damage property every year. Estimated U.S. insured losses due to severe thunderstorms in the first half of 2016 were $8.5 billion (US). The largest U.S. effects of tornadoes result from tornado outbreaks, which are sequences of tornadoes that occur in close succession. Here, using extreme value analysis, we find that the frequency of U.S. outbreaks with many tornadoes is increasing and that it is increasing faster for more extreme outbreaks. We model this behavior by extreme value distributions with parameters that are linear functions of time or of some indicators of multidecadal climatic variability. Extreme meteorological environments associated with severe thunderstorms show consistent upward trends, but the trends do not resemble those currently expected to result from global warming.

  9. High Resolution Hydro-climatological Projections for Western Canada

    NASA Astrophysics Data System (ADS)

    Erler, Andre Richard

    Accurate identification of the impact of global warming on water resources and hydro-climatic extremes represents a significant challenge to the understanding of climate change on the regional scale. Here an analysis of hydro-climatic changes in western Canada is presented, with specific focus on the Fraser and Athabasca River basins and on changes in hydro-climatic extremes. The analysis is based on a suite of simulations designed to characterize internal variability, as well as model uncertainty. A small ensemble of Community Earth System Model version 1 (CESM1) simulations was employed to generate global climate projections, which were downscaled to 10 km resolution using the Weather Research and Forecasting model (WRF V3.4.1) with several sets of physical parameterizations. Downscaling was performed for a historical validation period and a mid- and end-21st-century projection period, using the RCP8.5 greenhouse gas trajectory. Daily station observations and monthly gridded datasets were used for validation. Changes in hydro-climatic extremes are characterized using Extreme Value Analysis. A novel method of aggregating data from climatologically similar stations was employed to increase the statistical power of the analysis. Changes in mean and extreme precipitation are found to differ strongly between seasons and regions, but (relative) changes in extremes generally follow changes in the (seasonal) mean. At the end of the 21st century, precipitation and precipitation extremes are projected to increase by 30% at the coast in fall and further inland in winter, while the projected increase in summer precipitation is smaller and changes in extremes are often not statistically significant. Reasons for the differences between seasons, the role of precipitation recycling in atmospheric water transport, and the sensitivity to physics parameterizations are discussed.
Major changes are projected for the Fraser River basin, including earlier snowmelt and a 50% reduction in peak runoff. Combined with higher evapotranspiration, a significant increase in late summer drought risk is likely, but increasing fall precipitation might also increase the risk of moderate flooding. In the Athabasca River basin, increasing winter precipitation and snowmelt is balanced by increasing evapotranspiration in summer and no significant change in flood or drought risk is projected.

  10. Compilation of 1986 annual reports of the Navy ELF (extremely low frequency) communications system ecological-monitoring program. Volume 2. Tabs D-G. Annual progress report, January-December 1986

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1987-07-01

    The U.S. Navy is conducting a long-term program to monitor for possible effects from the operation of its Extremely Low Frequency (ELF) Communications System to resident biota and their ecological relationships. This report documents progress of the following studies: Soil Amoeba; Soil and Litter Arthropoda and Earthworm Studies; Biological Studies on Pollinating insects: Megachilid Bees; and Small Vertebrates: Small Mammals and Nesting Birds.

  11. Extreme value modelling of Ghana stock exchange index.

    PubMed

    Nortey, Ezekiel N N; Asare, Kwabena; Mettle, Felix Okoe

    2015-01-01

    Modelling of extreme events has always been of interest in fields such as hydrology and meteorology. However, after the recent global financial crises, appropriate models for modelling the rare events leading to these crises have become quite essential in the finance and risk management fields. This paper models the extreme values of the Ghana stock exchange all-shares index (2000-2010) by applying extreme value theory (EVT) to fit a model to the tails of the daily stock returns data. A conditional approach of the EVT was preferred, and hence an ARMA-GARCH model was fitted to the data to correct for the effects of autocorrelation and conditional heteroscedastic terms present in the returns series before the EVT method was applied. The Peaks Over Threshold approach of the EVT, which fits a Generalized Pareto Distribution (GPD) model to excesses above a certain selected threshold, was employed. Maximum likelihood estimates of the model parameters were obtained, and the model's goodness of fit was assessed graphically using Q-Q, P-P and density plots. The findings indicate that the GPD provides an adequate fit to the data of excesses. The sizes of extreme daily Ghanaian stock market movements were then computed using the value at risk and expected shortfall risk measures at some high quantiles, based on the fitted GPD model.
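    The Peaks Over Threshold chain described above (threshold choice, GPD fit to the excesses, then value at risk and expected shortfall at a high quantile) can be sketched as follows. This is a hedged illustration on simulated heavy-tailed returns, not the Ghana index data; it assumes SciPy's `genpareto` and the standard POT formulas for VaR and ES:

```python
import numpy as np
from scipy.stats import genpareto, t as student_t

rng = np.random.default_rng(1)

# Stand-in for (ARMA-GARCH-filtered) daily returns: heavy-tailed noise.
returns = student_t.rvs(df=4, size=5000, random_state=rng)
losses = -returns                       # analyze the loss tail

u = np.quantile(losses, 0.95)           # threshold: 95th percentile
excesses = losses[losses > u] - u
zeta_u = excesses.size / losses.size    # exceedance rate

# GPD fit to the excesses (location fixed at 0 by construction).
xi, _, sigma = genpareto.fit(excesses, floc=0)

# Standard POT formulas for Value at Risk and Expected Shortfall at level q.
q = 0.99
var_q = u + (sigma / xi) * (((1 - q) / zeta_u) ** (-xi) - 1)
es_q = var_q / (1 - xi) + (sigma - xi * u) / (1 - xi)
print(f"xi = {xi:.3f}, VaR(99%) = {var_q:.2f}, ES(99%) = {es_q:.2f}")
```

    The ES formula assumes a fitted shape xi < 1; diagnostic Q-Q and density plots, as in the paper, would be the natural next step.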

  12. Marine Stratocumulus Cloud Fields off the Coast of Southern California Observed Using LANDSAT Imagery. Part II: Textural Analysis.

    NASA Astrophysics Data System (ADS)

    Welch, R. M.; Sengupta, S. K.; Kuo, K. S.

    1988-04-01

    Statistical measures of the spatial distributions of gray levels (cloud reflectivities) are determined for LANDSAT Multispectral Scanner digital data. Textural properties for twelve stratocumulus cloud fields, seven cumulus fields, and two cirrus fields are examined using the Spatial Gray Level Co-Occurrence Matrix method. The co-occurrence statistics are computed for pixel separations ranging from 57 m to 29 km and at angles of 0°, 45°, 90° and 135°. Nine different textural measures are used to define the cloud field spatial relationships; however, the measures of contrast and correlation appear to be most useful in distinguishing cloud structure. Cloud field macrotexture describes general cloud field characteristics at distances greater than the size of typical cloud elements. It is determined from the spatial asymptotic values of the texture measures. The slope of the texture curves at small distances provides a measure of the microtexture of individual cloud cells. Cloud fields composed primarily of small cells have very steep slopes and reach their asymptotic values at short distances from the origin. As the cells composing the cloud field grow larger, the slope becomes more gradual and the asymptotic distance increases accordingly. Low asymptotic values of correlation show that stratocumulus cloud fields have no large-scale organized structure. Besides the ability to distinguish cloud field structure, texture appears to be a potentially valuable tool in cloud classification. Stratocumulus clouds are characterized by low values of angular second moment and large values of entropy. Cirrus clouds appear to have extremely low values of contrast, low values of entropy, and very large values of correlation. Finally, we propose that sampled high spatial resolution satellite data be used in conjunction with coarser resolution operational satellite data to detect and identify cloud field structure and directionality and to locate regions of subresolution-scale cloud contamination.

  13. Rare event simulation in radiation transport

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kollman, Craig

    1993-10-01

    This dissertation studies methods for estimating extremely small probabilities by Monte Carlo simulation. Problems in radiation transport typically involve estimating very rare events or the expected value of a random variable that is, with overwhelming probability, equal to zero. These problems often have high-dimensional state spaces and irregular geometries, so analytic solutions are not possible. Monte Carlo simulation must be used to estimate the radiation dosage being transported to a particular location. If the area is well shielded, the probability of any one particular particle getting through is very small. Because of the large number of particles involved, even a tiny fraction penetrating the shield may represent an unacceptable level of radiation. It therefore becomes critical to be able to accurately estimate this extremely small probability. Importance sampling is a well-known technique for improving the efficiency of rare event calculations. Here, a new set of probabilities is used in the simulation runs. The results are multiplied by the likelihood ratio between the true and simulated probabilities so as to keep the estimator unbiased. The variance of the resulting estimator is very sensitive to which new set of transition probabilities is chosen. It is shown that a zero-variance estimator does exist, but that its computation requires exact knowledge of the solution. A simple random walk with an associated killing model for the scatter of neutrons is introduced. Large deviation results for optimal importance sampling in random walks are extended to the case where killing is present. An adaptive "learning" algorithm for implementing importance sampling is given for more general Markov chain models of neutron scatter. For finite state spaces this algorithm is shown to give, with probability one, a sequence of estimates converging exponentially fast to the true solution.
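    The importance-sampling idea described above — simulate under tilted probabilities and multiply by the likelihood ratio to keep the estimator unbiased — can be illustrated on a toy rare-event problem (a Gaussian tail probability, not the dissertation's neutron-transport model):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
a = 4.0            # target: p = P(X > 4) for X ~ N(0, 1), about 3.2e-5
n = 100_000

# Naive Monte Carlo: almost no samples land in the rare-event region.
naive = (rng.standard_normal(n) > a).mean()

# Importance sampling: simulate from the tilted density N(a, 1) and
# reweight by the likelihood ratio phi(x) / phi(x - a) = exp(a^2/2 - a*x).
x = rng.normal(loc=a, scale=1.0, size=n)
estimate = np.mean((x > a) * np.exp(a**2 / 2 - a * x))

print(f"true  {norm.sf(a):.3e}")
print(f"naive {naive:.3e}")
print(f"IS    {estimate:.3e}")
```

    With the same sample budget, the tilted estimator's relative error is orders of magnitude smaller than the naive one's; choosing the tilt poorly can just as easily inflate the variance, which is the sensitivity the dissertation addresses.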

  14. A new approach for the description of discharge extremes in small catchments

    NASA Astrophysics Data System (ADS)

    Pavia Santolamazza, Daniela; Lebrenz, Henning; Bárdossy, András

    2017-04-01

    Small catchment basins in Northwestern Switzerland, characterized by small concentration times, are frequently affected by floods. The peak and the volume of these floods are commonly estimated by a frequency analysis of occurrence and described by a random variable, assuming a uniformly distributed probability and stationary input drivers (e.g. precipitation, temperature). For these small catchments, we attempt to describe and identify the underlying mechanisms and dynamics at the occurrence of extremes by means of available high-temporal-resolution (10 min) observations, and to explore the possibilities of regionalizing hydrological parameters for short intervals. We therefore investigate new concepts for flood description, such as entropy as a measure of the disorder and dispersion of precipitation. First findings and conclusions of this ongoing research are presented.

  15. Long-term trend of satellite-observed significant wave height and impact on ecosystem in the East/Japan Sea

    NASA Astrophysics Data System (ADS)

    Woo, Hye-Jin; Park, Kyung-Ae

    2017-09-01

    Significant wave height (SWH) data of nine satellite altimeters were validated with in-situ SWH measurements from buoy stations in the East/Japan Sea (EJS) and the Northwest Pacific Ocean. The spatial and temporal variability of extreme SWHs was investigated by defining the 90th, 95th, and 99th percentiles based on percentile analysis. The annual mean of the extreme SWHs reached 3.45 m in the EJS, significantly higher than the normal mean of about 1.44 m. The spatial distributions of SWHs showed significantly higher values in the eastern region of the EJS than in the western part. Characteristic seasonality was found in the time series of SWHs, with high SWHs (>2.5 m) in winter but low values (<1 m) in summer. The trends of the normal and extreme (99th percentile) SWHs in the EJS had positive values of 0.0056 m year-1 and 0.0125 m year-1, respectively. The long-term trends demonstrate that high SWHs became more extreme over the past decades. Predominant spatial distinctions between the coastal regions in the marginal seas of the Northwest Pacific Ocean and open-ocean regions are presented. In spring, both normal and extreme SWHs showed substantially increasing trends in the EJS. Finally, we present, for the first time, the impact of the long-term trend of extreme SWHs on the marine ecosystem through vertical mixing enhancement in the upper ocean of the EJS.
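    The percentile-based analysis described above (a normal mean versus a 99th-percentile extreme, each with its own linear trend) can be sketched on synthetic data; the Rayleigh parameters and drift below are hypothetical stand-ins for the altimeter record:

```python
import numpy as np

rng = np.random.default_rng(7)

# 25 synthetic "years" of wave heights: Rayleigh-like with a weak upward
# drift in the scale parameter (all numbers hypothetical).
years = np.arange(1993, 2018)
samples = [rng.rayleigh(scale=1.1 + 0.005 * i, size=10_000)
           for i in range(years.size)]

mean_swh = np.array([s.mean() for s in samples])             # "normal" SWH
p99_swh = np.array([np.percentile(s, 99) for s in samples])  # extreme SWH

# Linear trends (m per year) by least squares.
mean_trend = np.polyfit(years, mean_swh, 1)[0]
p99_trend = np.polyfit(years, p99_swh, 1)[0]
print(f"mean trend {mean_trend:.4f} m/yr, 99th-percentile trend {p99_trend:.4f} m/yr")
```

    As in the abstract, a drifting scale makes the extreme-percentile trend come out larger than the mean trend.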

  16. The end of trend-estimation for extreme floods under climate change?

    NASA Astrophysics Data System (ADS)

    Schulz, Karsten; Bernhardt, Matthias

    2016-04-01

    An increased risk of flood events is one of the major threats under future climate change conditions. Therefore, many recent studies have investigated trends in extreme flood occurrences using historic long-term river discharge data as well as simulations from combined global/regional climate and hydrological models. Severe floods are relatively rare events, and the robust estimation of their probability of occurrence requires long time series of data. Following a method outlined by the IPCC research community, trends in extreme floods are calculated based on the difference of discharge values exceeding e.g. a 100-year level (Q100) between two 30-year windows, representing prevailing conditions in a reference and a future time period, respectively. Following this approach, we analysed multiple, synthetically derived 2,000-year trend-free yearly maximum runoff series generated using three different extreme value distributions (EVD). The parameters were estimated from long-term runoff data of four large European watersheds (Danube, Elbe, Rhine, Thames). Both the Q100 values estimated from 30-year moving windows and the subsequently derived trends showed enormous variations with time: for example, fitting the Gumbel extreme value distribution to the Danube data, trends of Q100 in the synthetic time series range from -4,480 to 4,028 m³/s per 100 years (Q100 = 10,071 m³/s, for reference). Similar results were found when applying other extreme value distributions (Weibull and log-normal) to all of the watersheds considered. This variability or "background noise" in estimated trends of flood extremes makes it almost impossible to significantly distinguish any real trend in observed as well as modelled data when such an approach is applied. These uncertainties, even though known in principle, are hardly addressed and discussed by the climate change impact community.
Any decision making and flood risk management, including the dimensioning of flood protection measures, that is based on such studies might therefore be fundamentally flawed.
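    The study's central point — that 30-year-window Q100 estimates scatter enormously even in trend-free data — can be reproduced in a few lines. A sketch with synthetic Gumbel annual maxima; the parameters are hypothetical, loosely Danube-scaled:

```python
import numpy as np
from scipy.stats import gumbel_r

rng = np.random.default_rng(3)

# 2000 years of trend-free synthetic annual maximum discharge (m^3/s);
# Gumbel parameters are hypothetical, loosely Danube-sized.
annual_max = gumbel_r.rvs(loc=5000, scale=1200, size=2000, random_state=rng)

# Estimate Q100 (the 100-year level) in successive 30-year windows.
q100 = []
for start in range(0, annual_max.size - 30, 10):
    loc, scale = gumbel_r.fit(annual_max[start:start + 30])
    q100.append(gumbel_r.isf(0.01, loc, scale))
q100 = np.array(q100)

# Despite a true trend of exactly zero, windowed Q100 estimates scatter widely.
print(f"true Q100 {gumbel_r.isf(0.01, 5000, 1200):.0f}")
print(f"windowed Q100 range {q100.min():.0f} .. {q100.max():.0f}")
```

    Differencing two such windowed estimates, as in the IPCC-style trend recipe, inherits this scatter — which is exactly the "background noise" the abstract describes.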

  17. Beyond Traditional Extreme Value Theory Through a Metastatistical Approach: Lessons Learned from Precipitation, Hurricanes, and Storm Surges

    NASA Astrophysics Data System (ADS)

    Marani, M.; Zorzetto, E.; Hosseini, S. R.; Miniussi, A.; Scaioni, M.

    2017-12-01

    The Generalized Extreme Value (GEV) distribution is widely adopted irrespective of the properties of the stochastic process generating the extreme events. However, GEV presents several limitations, both theoretical (asymptotic validity for a large number of events per year, or the hypothesis of Poisson occurrences of Generalized Pareto events) and practical (fitting uses just yearly maxima or a few values above a high threshold). Here we describe the Metastatistical Extreme Value Distribution (MEVD, Marani & Ignaccolo, 2015), which relaxes the asymptotic or Poisson/GPD assumptions and makes use of all available observations. We then illustrate the flexibility of the MEVD by applying it to daily precipitation, hurricane intensity, and storm surge magnitude. Application to daily rainfall from a global raingauge network shows that MEVD estimates are 50% more accurate than those from GEV when the recurrence interval of interest is much greater than the observational period. This makes MEVD suited for application to satellite rainfall observations (~20 yr record length). Use of MEVD on TRMM data yields extreme event patterns that are in better agreement with surface observations than corresponding GEV estimates. Applied to the HURDAT2 Atlantic hurricane intensity dataset, MEVD significantly outperforms GEV estimates of extreme hurricanes. Interestingly, the Generalized Pareto distribution used for "ordinary" hurricane intensity points to the existence of a maximum limit wind speed that is significantly smaller than corresponding physically-based estimates. Finally, we applied the MEVD approach to water levels generated by tidal fluctuations and storm surges at a set of coastal sites spanning different storm-surge regimes. MEVD yields accurate estimates of large quantiles and inferences on the tail thickness (fat vs. thin) of the underlying distribution of "ordinary" surges.
In summary, the MEVD approach presents a number of theoretical and practical advantages, and outperforms traditional approaches in several applications. We conclude that the MEVD is a significant contribution to further generalize extreme value theory, with implications for a broad range of Earth Sciences.
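    A heavily simplified sketch of the MEVD construction (following Marani & Ignaccolo, 2015): fit a distribution of "ordinary" values to each year, then average the yearly-maximum cdfs F_j(x)^{n_j} over years. The Weibull choice, the synthetic rainfall, and the root-finding bracket below are all assumptions for illustration:

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import weibull_min

rng = np.random.default_rng(5)

# Synthetic "ordinary" daily rainfall (wet days only) for T years.
T = 20
years = [rng.gamma(shape=0.8, scale=12.0, size=int(rng.integers(80, 120)))
         for _ in range(T)]

# Fit a Weibull to each year's ordinary values (location fixed at 0).
fits = [(weibull_min.fit(y, floc=0), len(y)) for y in years]

def mev_cdf(x):
    """MEVD cdf: average over years of the yearly-maximum cdf F_j(x)**n_j."""
    return np.mean([weibull_min.cdf(x, c, loc, s) ** n
                    for (c, loc, s), n in fits])

# 100-year daily value: solve mev_cdf(x) = 1 - 1/100 (bracket is assumed).
x100 = brentq(lambda x: mev_cdf(x) - 0.99, 1.0, 1000.0)
print(f"MEVD 100-year daily value ~ {x100:.1f}")
```

    Unlike a GEV fit to 20 annual maxima, every wet-day observation contributes to the fitted yearly distributions, which is the source of the accuracy gain the abstract reports.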

  18. Geographic Information System and Geoportal «River basins of the European Russia»

    NASA Astrophysics Data System (ADS)

    Yermolaev, O. P.; Mukharamova, S. S.; Maltsev, K. A.; Ivanov, M. A.; Ermolaeva, P. O.; Gayazov, A. I.; Mozzherin, V. V.; Kharchenko, S. V.; Marinina, O. A.; Lisetskii, F. N.

    2018-01-01

    A Geographic Information System (GIS) and an open-access Geoportal, «River basins of the European Russia», were implemented. The GIS and Geoportal are based on a map of the basins of the small rivers of European Russia, with information about natural and anthropogenic characteristics, namely: geomorphometry of basin relief; climatic parameters representing averages, variation, seasonal variation, and extreme values of temperature and precipitation; land cover types; soil characteristics; type and subtype of landscape; and population density. The GIS includes results of spatial analysis and modelling, in particular assessment of anthropogenic impact on river basins; evaluation of water runoff and sediment runoff; and climatic, geomorphological and landscape zoning for the European part of Russia.

  19. Heterodimer Autorepression Loop: A Robust and Flexible Pulse-Generating Genetic Module

    NASA Astrophysics Data System (ADS)

    Lannoo, B.; Carlon, E.; Lefranc, M.

    2016-07-01

    We investigate the dynamics of the heterodimer autorepression loop (HAL), a small genetic module in which a protein A acts as an autorepressor and binds to a second protein B to form an AB dimer. For suitable values of the rate constants, the HAL produces pulses of A alternating with pulses of B. By means of analytical and numerical calculations, we show that the duration of the A pulses is extremely robust against variation of the rate constants, while the duration of the B pulses can be flexibly adjusted. The HAL is thus a minimal genetic module generating robust pulses with a tunable duration, an interesting property for cellular signaling.

  20. Choosing the Mean versus an Extreme Resolution for Intrapersonal Values Conflicts: Is the Mean Usually More Golden?

    ERIC Educational Resources Information Center

    Kinnier, Richard T.

    1984-01-01

    Examined the resolution of value conflicts in 60 adults who wrote a solution to their conflicts. Compared extreme resolutions with those representing compromise. Compromisers and extremists did not differ in how rationally resolved they were about their solutions but compromisers felt better about their solutions. (JAC)

  1. 33 CFR 80.105 - Calais, ME to Cape Small, ME.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 33 Navigation and Navigable Waters 1 2010-07-01 2010-07-01 false Calais, ME to Cape Small, ME. 80... INTERNATIONAL NAVIGATION RULES COLREGS DEMARCATION LINES Atlantic Coast § 80.105 Calais, ME to Cape Small, ME... International Bridge at Calais, ME to the southwesternmost extremity of Bald Head at Cape Small. ...

  2. 33 CFR 80.105 - Calais, ME to Cape Small, ME.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 33 Navigation and Navigable Waters 1 2012-07-01 2012-07-01 false Calais, ME to Cape Small, ME. 80... INTERNATIONAL NAVIGATION RULES COLREGS DEMARCATION LINES Atlantic Coast § 80.105 Calais, ME to Cape Small, ME... International Bridge at Calais, ME to the southwesternmost extremity of Bald Head at Cape Small. ...

  3. 33 CFR 80.105 - Calais, ME to Cape Small, ME.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 33 Navigation and Navigable Waters 1 2013-07-01 2013-07-01 false Calais, ME to Cape Small, ME. 80... INTERNATIONAL NAVIGATION RULES COLREGS DEMARCATION LINES Atlantic Coast § 80.105 Calais, ME to Cape Small, ME... International Bridge at Calais, ME to the southwesternmost extremity of Bald Head at Cape Small. ...

  4. 33 CFR 80.105 - Calais, ME to Cape Small, ME.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 33 Navigation and Navigable Waters 1 2011-07-01 2011-07-01 false Calais, ME to Cape Small, ME. 80... INTERNATIONAL NAVIGATION RULES COLREGS DEMARCATION LINES Atlantic Coast § 80.105 Calais, ME to Cape Small, ME... International Bridge at Calais, ME to the southwesternmost extremity of Bald Head at Cape Small. ...

  5. Modeling extreme PM10 concentration in Malaysia using generalized extreme value distribution

    NASA Astrophysics Data System (ADS)

    Hasan, Husna; Mansor, Nadiah; Salleh, Nur Hanim Mohd

    2015-05-01

    Extreme PM10 concentrations from the Air Pollutant Index (API) at thirteen monitoring stations in Malaysia are modeled using the Generalized Extreme Value (GEV) distribution. The data are blocked into monthly selection periods. The Mann-Kendall (MK) test suggests a non-stationary model, so two models are considered for the stations with a trend. The likelihood ratio test is used to determine the best-fitting model, and the result shows that only two stations favor the non-stationary model (Model 2) while the other eleven stations favor the stationary model (Model 1). The return level of PM10 concentration, i.e. the level expected to be exceeded once within a selected period, is obtained.
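    The stationary-versus-non-stationary comparison via a likelihood ratio test can be sketched as follows. For simplicity this uses a Gumbel model (GEV with zero shape) with a linear trend in the location parameter, on synthetic monthly maxima rather than the API data; all parameters are hypothetical:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import chi2, gumbel_r

rng = np.random.default_rng(11)

# Synthetic monthly maxima with an upward trend in the location parameter
# (hypothetical stand-in for a station's PM10 block maxima).
t = np.arange(120)
x = gumbel_r.rvs(loc=50 + 0.2 * t, scale=10, size=t.size, random_state=rng)

# Model 1: stationary Gumbel, maximum-likelihood fit.
loc1, scale1 = gumbel_r.fit(x)
ll1 = gumbel_r.logpdf(x, loc1, scale1).sum()

# Model 2: Gumbel with location mu(t) = mu0 + mu1 * t.
def nll(params):
    mu0, mu1, log_sig = params
    return -gumbel_r.logpdf(x, mu0 + mu1 * t, np.exp(log_sig)).sum()

res = minimize(nll, x0=[loc1, 0.0, np.log(scale1)], method="Nelder-Mead")
ll2 = -res.fun

# Likelihood ratio test: one extra parameter -> chi-square with 1 dof.
D = 2 * (ll2 - ll1)
print(f"deviance {D:.1f} vs 95% critical value {chi2.ppf(0.95, 1):.2f}")
```

    A deviance above the critical value favors the non-stationary model (Model 2), mirroring the test reported in the abstract.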

  6. Calcium isotope systematics in small upland catchments affected by spruce dieback in the period of extreme acid rain (1970-1990)

    NASA Astrophysics Data System (ADS)

    Novak, Martin; Farkas, Juraj; Holmden, Chris; Hruska, Jakub; Curik, Jan; Stepanova, Marketa; Prechova, Eva; Veselovsky, Frantisek; Komarek, Arnost

    2017-04-01

    Recently, new isotope tools have become available to study the behavior of nutrients in stressed ecosystems. In this study, we focus on changes in the abundance ratio of calcium (Ca) isotopes accompanying biogeochemical processes in small forested catchments. We monitored δ44Ca values in ecosystem pools and fluxes at four upland sites situated in the Czech Republic, Central Europe. A heavily acidified site in the Eagle Mts. (northern Czech Republic) experienced 13 times higher atmospheric Ca inputs compared to the other three sites, which were less affected by forest decline. Industrial dust was responsible for the elevated Ca input. δ44Ca values of individual pools/fluxes were used to identify Ca sources for the bioavailable Ca soil reservoir and for runoff. The bedrock of the study sites differed (leucogranite and orthogneiss vs. serpentinite and amphibolite). Across the sites, mean δ44Ca values increased in the order: spruce bark < fine roots < needles < soil < bedrock < canopy throughfall < open-area precipitation < runoff < soil water. Plants preferentially took up isotopically light Ca, while residual isotopically heavy Ca was sorbed to soil particles or exported via runoff. Even at sites with low δ44Ca values of bedrock, runoff had a high δ44Ca value. At the base-poor site, most runoff came from atmospheric deposition and residual Ca following plant uptake. It appeared that bedrock weathering did not supply enough Ca to replenish the bioavailable Ca pool in the soil. Currently, we are analyzing the Ca isotope composition of individual rock-forming minerals to better assess the effect of different weathering rates of minerals with low/high radiogenic 40Ca contents on runoff δ44Ca.

  7. Quick Tips Guide for Small Manufacturing Businesses

    EPA Pesticide Factsheets

    Small manufacturing businesses can use this Quick Tips Guide to be better prepared for future extreme weather events. This guide discusses keeping good records, improving housekeeping procedures, and training employees.

  8. More tornadoes in the most extreme U.S. tornado outbreaks.

    PubMed

    Tippett, Michael K; Lepore, Chiara; Cohen, Joel E

    2016-12-16

    Tornadoes and severe thunderstorms kill people and damage property every year. Estimated U.S. insured losses due to severe thunderstorms in the first half of 2016 were $8.5 billion (US). The largest U.S. effects of tornadoes result from tornado outbreaks, which are sequences of tornadoes that occur in close succession. Here, using extreme value analysis, we find that the frequency of U.S. outbreaks with many tornadoes is increasing and that it is increasing faster for more extreme outbreaks. We model this behavior by extreme value distributions with parameters that are linear functions of time or of some indicators of multidecadal climatic variability. Extreme meteorological environments associated with severe thunderstorms show consistent upward trends, but the trends do not resemble those currently expected to result from global warming. Copyright © 2016, American Association for the Advancement of Science.

  9. Multiple sessions of transcranial direct current stimulation and upper extremity rehabilitation in stroke: A review and meta-analysis.

    PubMed

    Tedesco Triccas, L; Burridge, J H; Hughes, A M; Pickering, R M; Desikan, M; Rothwell, J C; Verheyden, G

    2016-01-01

    To systematically review the methodology, in particular the treatment options and outcomes, and the effect of multiple sessions of transcranial direct current stimulation (tDCS) with rehabilitation programmes for upper extremity recovery post stroke. A search was conducted for randomised controlled trials involving tDCS and rehabilitation for the upper extremity in stroke. Quality of included studies was analysed using the Modified Downs and Black form. The extent of, and effect of, variation in treatment parameters such as anodal, cathodal and bi-hemispheric tDCS on upper extremity outcome measures of impairment and activity were analysed using meta-analysis. Nine studies (371 participants with acute, sub-acute and chronic stroke) were included. Different methodologies of tDCS and upper extremity intervention, outcome measures and timing of assessments were identified. Real tDCS combined with rehabilitation had small, non-significant effects of +0.11 (p=0.44) and +0.24 (p=0.11) on upper extremity impairments and activities at post-intervention, respectively. Various tDCS methods have been used in stroke rehabilitation. The evidence so far is not statistically significant, but is suggestive of, at best, a small beneficial effect on upper extremity impairment. Future research should focus on which patients and rehabilitation programmes are likely to respond to different tDCS regimes. Copyright © 2015 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.

  10. MEASUREMENT OF SMALL MECHANICAL VIBRATIONS OF BRAIN TISSUE EXPOSED TO EXTREMELY-LOW-FREQUENCY ELECTRIC FIELDS

    EPA Science Inventory

    Electromagnetic fields can interact with biological tissue both electrically and mechanically. This study investigated the mechanical interaction between brain tissue and an extremely-low-frequency (ELF) electric field by measuring the resultant vibrational amplitude. The exposur...

  11. Jimsphere wind and turbulence exceedance statistics

    NASA Technical Reports Server (NTRS)

    Adelfang, S. I.; Court, A.

    1972-01-01

    Exceedance statistics of winds and gusts observed over Cape Kennedy with Jimsphere balloon sensors are described. Gust profiles containing positive and negative departures from smoothed profiles, in the wavelength ranges 100-2500, 100-1900, 100-860, and 100-460 meters, were computed from 1578 profiles with four 41-weight digital high-pass filters. Extreme values of the square root of gust speed are normally distributed. Monthly and annual exceedance probability distributions of normalized rms gust speeds in three altitude bands (2-7, 6-11, and 9-14 km) are log-normal. The rms gust speeds are largest in the 100-2500 m wavelength band between 9 and 14 km in late winter and early spring. A study of monthly and annual exceedance probabilities and the number of occurrences per kilometer of level crossings with positive slope indicates significant variability with season, altitude, and filter configuration. A decile sampling scheme is tested and an optimum approach is suggested for drawing a relatively small random sample that represents the characteristic extreme wind speeds and shears of a large parent population of Jimsphere wind profiles.

  12. Classification-Assisted Memetic Algorithms for Equality-Constrained Optimization Problems

    NASA Astrophysics Data System (ADS)

    Handoko, Stephanus Daniel; Kwoh, Chee Keong; Ong, Yew Soon

    Regression has successfully been incorporated into memetic algorithms (MA) to build surrogate models for the objective or constraint landscape of optimization problems. This helps to alleviate the need for expensive fitness function evaluations by performing local refinements on the approximated landscape. Classification can alternatively be used to assist the MA in choosing the individuals that will undergo refinement. Support-vector-assisted MAs were recently proposed to alleviate the need for function evaluations in inequality-constrained optimization problems by distinguishing regions of feasible solutions from infeasible ones, based on past solutions, so that search effort can be focused on potential regions only. For problems with equality constraints, however, the feasible space is obviously extremely small, and it is thus extremely difficult for the global search component of the MA to produce feasible solutions; the classification of feasible and infeasible space then becomes ineffective. In this paper, a novel strategy to overcome this limitation is proposed, particularly for problems having one and only one equality constraint: the raw constraint value of an individual, instead of its feasibility class, is utilized.

  13. Thermoelectric Properties of Nanograined Si-Ge-Au Thin Films Grown by Molecular Beam Deposition

    NASA Astrophysics Data System (ADS)

    Nishino, Shunsuke; Ekino, Satoshi; Inukai, Manabu; Omprakash, Muthusamy; Adachi, Masahiro; Kiyama, Makoto; Yamamoto, Yoshiyuki; Takeuchi, Tsunehiro

    2018-06-01

    Conditions to achieve an extremely large Seebeck coefficient and extremely small thermal conductivity in Si-Ge-Au thin films formed of nanosized grains precipitated in an amorphous matrix have been investigated. We employed molecular beam deposition to prepare Si1-xGexAuy thin films on sapphire substrates. The deposited films were annealed under a nitrogen gas atmosphere at 300°C to 500°C for 15 min to 30 min. Nanocrystals dispersed in the amorphous matrix were clearly observed by transmission electron microscopy. We did not observe an anomalously large Seebeck coefficient, but a very low thermal conductivity of nearly 1.0 W K-1 m-1 was found at around 0.2 < x < 0.6. The compositional dependence of the thermal conductivity was well accounted for by the compositional dependence of the mixing entropy. Some of these values agree exactly with the amorphous limit predicted by theoretical calculations. The smallest lattice thermal conductivity found for the present samples is lower than that of the nanostructured Si-Ge bulk material for which a dimensionless figure of merit of ZT ≈ 1 was reported at high temperature.

  14. A Generalized Framework for Non-Stationary Extreme Value Analysis

    NASA Astrophysics Data System (ADS)

    Ragno, E.; Cheng, L.; Sadegh, M.; AghaKouchak, A.

    2017-12-01

    Empirical trends in climate variables including precipitation, temperature, and snow-water equivalent at regional to continental scales are evidence of changes in climate over time. The evolving climate conditions and human activity-related factors such as urbanization and population growth can exert further changes in weather and climate extremes. As a result, the scientific community faces an increasing demand for updated appraisals of time-varying climate extremes. The purpose of this study is to offer a robust and flexible statistical tool for non-stationary extreme value analysis which can better characterize the severity and likelihood of extreme climatic variables. This is critical to ensure a more resilient environment in a changing climate. Following the positive feedback on the first version of the Non-Stationary Extreme Value Analysis (NEVA) toolbox by Cheng et al. (2014), we present an improved version, NEVA2.0. The upgraded version builds upon a newly developed hybrid-evolution Markov Chain Monte Carlo (MCMC) approach for numerical parameter estimation and uncertainty assessment. This addition leads to more robust uncertainty estimates of return levels, return periods, and risks of climatic extremes under both stationary and non-stationary assumptions. Moreover, NEVA2.0 is flexible in incorporating any user-specified covariate other than the default time covariate (e.g., CO2 emissions, large-scale climatic oscillation patterns). The new feature will allow users to examine non-stationarity of extremes induced by physical conditions that underlie the extreme events (e.g., antecedent soil moisture deficit, large-scale climatic teleconnections, urbanization). In addition, the new version offers an option to generate stationary and/or non-stationary rainfall Intensity-Duration-Frequency (IDF) curves that are widely used for risk assessment and infrastructure design. 
Finally, a Graphical User Interface (GUI) of the package is provided, making NEVA accessible to a broader audience.
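    The core of such a non-stationary analysis, a GEV whose location parameter depends linearly on a covariate and is fitted by maximum likelihood, can be sketched as follows. NEVA2.0 itself uses a hybrid-evolution MCMC; this simplified point-estimate version only illustrates the likelihood:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import genextreme

def nonstat_gev_nll(params, x, t):
    """Negative log-likelihood of a GEV whose location drifts linearly
    with a covariate t: mu(t) = mu0 + mu1*t (scale and shape constant)."""
    mu0, mu1, log_sigma, xi = params
    sigma = np.exp(log_sigma)          # keeps the scale positive
    # scipy's genextreme shape c equals -xi in the common GEV convention
    return -np.sum(genextreme.logpdf(x, c=-xi, loc=mu0 + mu1 * t, scale=sigma))

# Synthetic annual maxima whose location parameter trends upward in time
rng = np.random.default_rng(1)
t = np.arange(60.0)
x = genextreme.rvs(c=-0.1, loc=10.0 + 0.05 * t, scale=2.0, random_state=rng)

res = minimize(nonstat_gev_nll,
               x0=[np.mean(x), 0.0, np.log(np.std(x)), 0.1],
               args=(x, t), method="Nelder-Mead")
mu0, mu1, log_sigma, xi = res.x
print(round(mu1, 3))  # fitted trend in the location parameter (true value 0.05)
```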

  15. Extreme Value Analysis of hydro meteorological extremes in the ClimEx Large-Ensemble

    NASA Astrophysics Data System (ADS)

    Wood, R. R.; Martel, J. L.; Willkofer, F.; von Trentini, F.; Schmid, F. J.; Leduc, M.; Frigon, A.; Ludwig, R.

    2017-12-01

    Many studies show an increase in the magnitude and frequency of hydrological extreme events in the course of climate change. However, the contribution of natural variability to the magnitude and frequency of hydrological extreme events is not yet settled. A reliable estimate of extreme events is of great interest for water management and public safety. In the course of the ClimEx Project (www.climex-project.org), a new single-model large ensemble was created by dynamically downscaling the CanESM2 large ensemble with the Canadian Regional Climate Model version 5 (CRCM5) for a European domain and a northeastern North American domain. The ClimEx 50-member large ensemble (CRCM5 driven by the CanESM2 large ensemble) makes a thorough analysis of natural variability in extreme events possible. Are current extreme value statistical methods able to account for natural variability? How large is the natural variability for, e.g., a 1/100-year return period derived from a 50-member large ensemble for Europe and northeastern North America? These questions are addressed by applying various generalized extreme value (GEV) distributions to the ClimEx large ensemble. Various return levels (5-, 10-, 20-, 30-, 60- and 100-years) based on various lengths of time series (20-, 30-, 50-, 100- and 1500-years) are analyzed for the maximum one-day precipitation (RX1d), the maximum three-hourly precipitation (RX3h), and the streamflow for selected catchments in Europe. The long time series of the ClimEx ensemble (7500 years) allows us to give a first reliable estimate of the magnitude and frequency of certain extreme events.

  16. Large scale variability, long-term trends and extreme events in total ozone over the northern mid-latitudes based on satellite time series

    NASA Astrophysics Data System (ADS)

    Rieder, H. E.; Staehelin, J.; Maeder, J. A.; Ribatet, M.; Davison, A. C.

    2009-04-01

    Various generations of satellites (e.g. TOMS, GOME, OMI) have made spatial datasets of column ozone available to the scientific community. This study has a special focus on column ozone over the northern mid-latitudes. Tools from geostatistics and extreme value theory are applied to analyze variability, long-term trends and frequency distributions of extreme events in total ozone. In a recent case study (Rieder et al., 2009), new tools from extreme value theory (Coles, 2001; Ribatet, 2007) were applied to the world's longest total ozone record from Arosa, Switzerland (e.g. Staehelin 1998a,b), in order to describe extreme events in low and high total ozone. Within the current study this analysis is extended to satellite datasets for the northern mid-latitudes. Further, special emphasis is given to patterns and spatial correlations and the influence of changes in atmospheric dynamics (e.g. tropospheric and lower stratospheric pressure systems) on column ozone. References: Coles, S.: An Introduction to Statistical Modeling of Extreme Values, Springer Series in Statistics, ISBN:1852334592, Springer, Berlin, 2001. Ribatet, M.: POT: Modelling peaks over a threshold, R News, 7, 34-36, 2007. Rieder, H.E., Staehelin, J., Maeder, J.A., Ribatet, M., Stübi, R., Weihs, P., Holawe, F., Peter, T., and Davison, A.C.: From ozone mini holes and mini highs towards extreme value theory: New insights from extreme events and non stationarity, submitted to J. Geophys. Res., 2009. Staehelin, J., Kegel, R., and Harris, N. R.: Trend analysis of the homogenized total ozone series of Arosa (Switzerland), 1929-1996, J. Geophys. Res., 103(D7), 8389-8400, doi:10.1029/97JD03650, 1998a. Staehelin, J., Renaud, A., Bader, J., McPeters, R., Viatte, P., Hoegger, B., Bugnion, V., Giroud, M., and Schill, H.: Total ozone series at Arosa (Switzerland): Homogenization and data comparison, J. Geophys. Res., 103(D5), 5827-5842, doi:10.1029/97JD02402, 1998b.

  17. Minimum important differences for the patient-specific functional scale, 4 region-specific outcome measures, and the numeric pain rating scale.

    PubMed

    Abbott, J Haxby; Schmitt, John

    2014-08-01

    Multicenter, prospective, longitudinal cohort study. To investigate the minimum important difference (MID) of the Patient-Specific Functional Scale (PSFS), 4 region-specific outcome measures, and the numeric pain rating scale (NPRS) across 3 levels of patient-perceived global rating of change in a clinical setting. The MID varies depending on the external anchor defining patient-perceived "importance." The MID for the PSFS has not been established across all body regions. One thousand seven hundred eight consecutive patients with musculoskeletal disorders were recruited from 5 physical therapy clinics. The PSFS, NPRS, and 4 region-specific outcome measures-the Oswestry Disability Index, Neck Disability Index, Upper Extremity Functional Index, and Lower Extremity Functional Scale-were assessed at the initial and final physical therapy visits. Global rating of change was assessed at the final visit. MID was calculated for the PSFS and NPRS (overall and for each body region), and for each region-specific outcome measure, across 3 levels of change defined by the global rating of change (small, medium, large change) using receiver operating characteristic curve methodology. The MID for the PSFS (on a scale from 0 to 10) ranged from 1.3 (small change) to 2.3 (medium change) to 2.7 (large change), and was relatively stable across body regions. MIDs for the NPRS (-1.5 to -3.5), Oswestry Disability Index (-12), Neck Disability Index (-14), Upper Extremity Functional Index (6 to 11), and Lower Extremity Functional Scale (9 to 16) are also reported. We reported the MID for small, medium, and large patient-perceived change on the PSFS, NPRS, Oswestry Disability Index, Neck Disability Index, Upper Extremity Functional Index, and Lower Extremity Functional Scale for use in clinical practice and research.

  18. Matter-Radiation Interactions in Extremes

    Science.gov Websites

    An experimental explosive is shown igniting during small-scale impact testing.

  19. Probabilistic forecasting of extreme weather events based on extreme value theory

    NASA Astrophysics Data System (ADS)

    Van De Vyver, Hans; Van Schaeybroeck, Bert

    2016-04-01

    Extreme events in weather and climate such as high wind gusts, heavy precipitation or extreme temperatures are commonly associated with high impacts on both environment and society. Forecasting extreme weather events is difficult, and very high-resolution models are needed to describe extreme weather phenomena explicitly. A prediction system for such events should therefore preferably be probabilistic in nature. Probabilistic forecasts and state estimations are nowadays common in the numerical weather prediction community. In this work, we develop a new probabilistic framework based on extreme value theory that aims to provide early warnings up to several days in advance. We consider pairs (X,Y) of extreme events, where X represents a deterministic forecast and Y the observation variable (for instance wind speed). More specifically, two problems are addressed: (1) Given a high forecast X = x_0, what is the probability that Y > y? In other words, provide inference on the conditional probability Pr{Y > y | X = x_0}. (2) Given a probabilistic model for Problem 1, what is the impact on the verification analysis of extreme events? These problems can be solved with bivariate extremes (Coles, 2001) and the verification analysis of Ferro (2007). We apply the Ramos and Ledford (2009) parametric model for bivariate tail estimation of the pair (X,Y). The model accommodates different types of extremal dependence and asymmetry within a parsimonious representation. Results are presented using the ensemble reforecast system of the European Centre for Medium-Range Weather Forecasts (Hagedorn, 2008). References: Coles, S. (2001) An Introduction to Statistical Modelling of Extreme Values. Springer-Verlag. Ferro, C.A.T. (2007) A probability model for verifying deterministic forecasts of extreme events. Wea. Forecasting, 22, 1089-1100. Hagedorn, R. (2008) Using the ECMWF reforecast dataset to calibrate EPS forecasts. ECMWF Newsletter, 117, 8-13. Ramos, A., and Ledford, A. (2009) A new class of models for bivariate joint tails. J. R. Statist. Soc. B, 71, 219-241.
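    The first problem has a simple empirical baseline that a bivariate tail model refines and extrapolates: among past occasions when the deterministic forecast exceeded its threshold, count the fraction of observed exceedances. A sketch with synthetic data (the Ramos-Ledford model itself is not reproduced here):

```python
import numpy as np

def cond_exceedance(forecast, obs, x_thresh, y_thresh):
    """Empirical estimate of Pr(Y > y | X > x): among occasions when the
    deterministic forecast exceeded its threshold, the fraction of
    observed exceedances. A plug-in baseline only; a bivariate tail
    model is needed to extrapolate beyond the data."""
    sel = forecast > x_thresh
    if not np.any(sel):
        return np.nan
    return np.mean(obs[sel] > y_thresh)

# Correlated synthetic forecast/observation pairs (illustrative only)
rng = np.random.default_rng(2)
x = rng.gumbel(size=5000)                       # deterministic forecast
y = 0.7 * x + 0.3 * rng.gumbel(size=5000)       # imperfectly forecast observable
p = cond_exceedance(x, y, x_thresh=2.0, y_thresh=2.0)
print(p)  # a probability strictly between 0 and 1, higher with forecast skill
```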

  20. Extreme secular excitation of eccentricity inside mean motion resonance. Small bodies driven into star-grazing orbits by planetary perturbations

    NASA Astrophysics Data System (ADS)

    Pichierri, Gabriele; Morbidelli, Alessandro; Lai, Dong

    2017-09-01

    Context. It is well known that asteroids and comets fall into the Sun. Metal pollution of white dwarfs and transient spectroscopic signatures of young stars like β-Pic provide growing evidence that extrasolar planetesimals can attain extreme orbital eccentricities and fall into their parent stars. Aims: We aim to develop a general, implementable, semi-analytical theory of secular eccentricity excitation of small bodies (planetesimals) in mean motion resonances with an eccentric planet, valid for arbitrary values of the eccentricities and including the short-range force due to General Relativity. Methods: Our semi-analytic model for the restricted planar three-body problem does not make use of series expansion and therefore is valid for any eccentricity value and semi-major axis ratio. The model is based on the application of the adiabatic principle, which is valid when the precession period of the longitude of pericentre of the planetesimal is much longer than the libration period in the mean motion resonance. In resonances of order larger than 1 this is true except for vanishingly small eccentricities. We provide prospective users with a Mathematica notebook with an implementation of the model allowing direct use. Results: We confirm that the 4:1 mean motion resonance with a moderately eccentric (e' ≲ 0.1) planet is the most powerful one to lift the eccentricity of planetesimals from nearly circular orbits to star-grazing ones. However, if the planet is too eccentric, we find that this resonance is unable to pump the planetesimal's eccentricity to a very high value. The inclusion of the General Relativity effect imposes a condition on the mass of the planet to drive the planetesimals into star-grazing orbits. For a planetesimal at 1 AU around a solar-mass star (or white dwarf), we find a threshold planetary mass of about 17 Earth masses. We finally derive an analytical formula for this critical mass. 
Conclusions: Planetesimals can easily fall into the central star even in the presence of a single moderately eccentric planet, but only from the vicinity of the 4:1 mean motion resonance. For sufficiently high planetary masses the General Relativity effect does not prevent the achievement of star-grazing orbits. The Mathematica notebook is only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (http://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/605/A23

  1. Hot bats: extreme thermal tolerance in a desert heat wave

    NASA Astrophysics Data System (ADS)

    Bondarenco, Artiom; Körtner, Gerhard; Geiser, Fritz

    2014-08-01

    Climate change is predicted to increase temperature extremes and thus thermal stress on organisms. Animals living in hot deserts are already exposed to high ambient temperatures (Ta), making them especially vulnerable to further warming. However, little is known about the effect of extreme heat events on small desert mammals, especially tree-roosting microbats that are not strongly protected from environmental temperature fluctuations. During a heat wave with record Ta at Sturt National Park, we quantified the thermal physiology and behaviour of a single free-ranging little broad-nosed bat (Scotorepens greyii, henceforth Scotorepens) and two inland freetail bats (Mormopterus species 3, henceforth Mormopterus) using temperature telemetry over 3 days. On 11 and 13 January, maximum Ta was ~45.0 °C, and all monitored bats were thermoconforming. On 12 January 2013, when Ta exceeded 48.0 °C, Scotorepens abandoned its poorly insulated roost during the daytime, whereas both Mormopterus remained in their better insulated roosts and were mostly thermoconforming. Maximum skin temperatures (Tskin) ranged from 44.0 to 44.3 °C in Scotorepens and from 40.0 to 45.8 °C in Mormopterus; these are the highest Tskin values reported for any free-ranging bat. Our study provides the first evidence of extensive heat tolerance in free-ranging desert microbats. It shows that these bats can tolerate the most extreme Tskin range known for mammals (3.3 to 45.8 °C) and delay regulation of Tskin by thermoconforming over a wide temperature range, thus decreasing the risks of dehydration and death.

  2. New Insights into the Estimation of Extreme Geomagnetic Storm Occurrences

    NASA Astrophysics Data System (ADS)

    Ruffenach, Alexis; Winter, Hugo; Lavraud, Benoit; Bernardara, Pietro

    2017-04-01

    Space weather events such as intense geomagnetic storms are major disturbances of the near-Earth environment that may lead to serious impacts on our modern society. As such, it is of great importance to estimate their probability, and in particular that of extreme events. One approach widely used in the statistical sciences for estimating extreme event probabilities is Extreme Value Analysis (EVA). Using this rigorous statistical framework, estimations of the occurrence of extreme geomagnetic storms are performed here based on the most relevant global parameters related to geomagnetic storms, such as ground parameters (e.g. the geomagnetic Dst and aa indices) and space parameters related to the characteristics of Coronal Mass Ejections (CMEs) (velocity, southward magnetic field component, electric field). Using our fitted model, we estimate the annual probability of a Carrington-type event (Dst = -850 nT) to be on the order of 10^-3, with a lower limit of the uncertainties on the return period of ~500 years. Our estimate is significantly higher than that of most past studies, which typically had a return period of a few hundred years at most. Thus, precautions are required when extrapolating to intense values. Currently, the complexity of the processes and the length of available data inevitably lead to significant uncertainties in return period estimates for the occurrence of extreme geomagnetic storms. However, our application of extreme value models for extrapolating into the tail of the distribution provides a mathematically justified framework for the estimation of extreme return periods, thereby enabling the determination of more accurate estimates and reduced associated uncertainties.

  3. The Robust Relationship Between Extreme Precipitation and Convective Organization in Idealized Numerical Modeling Simulations

    NASA Astrophysics Data System (ADS)

    Bao, Jiawei; Sherwood, Steven C.; Colin, Maxime; Dixit, Vishal

    2017-10-01

    The behavior of tropical extreme precipitation under changes in sea surface temperatures (SSTs) is investigated with the Weather Research and Forecasting Model (WRF) in three sets of idealized simulations: small-domain tropical radiative-convective equilibrium (RCE), a quasi-global "aquapatch", and RCE with mean ascent prescribed from the tropical band of the aquapatch. We find that, across the variations introduced including SST, large-scale circulation, domain size, horizontal resolution, and convective parameterization, the change in the degree of convective organization emerges as a robust mechanism affecting extreme precipitation. Higher ratios of change in extreme precipitation to change in mean surface water vapor are associated with increases in the degree of organization, while lower ratios correspond to decreases in the degree of organization. The spread of such changes is much larger in RCE than in the aquapatch tropics, suggesting that small RCE domains may be unreliable for assessing the temperature dependence of extreme precipitation or convective organization. When the degree of organization does not change, simulated extreme precipitation scales with surface water vapor. This slightly exceeds Clausius-Clapeyron (CC) scaling because the near-surface air warms 10-25% faster than the SST in all experiments. In the simulations analyzed here that use convective parameterizations, the degree of organization also increases with SST.

  4. Visual Analysis among Novices: Training and Trend Lines as Graphic Aids

    ERIC Educational Resources Information Center

    Nelson, Peter M.; Van Norman, Ethan R.; Christ, Theodore J.

    2017-01-01

    The current study evaluated the degree to which novice visual analysts could discern trends in simulated time-series data across differing levels of variability and extreme values. Forty-five novice visual analysts were trained in general principles of visual analysis. One group received brief training on how to identify and omit extreme values.…

  5. Implementing Extreme Value Analysis in a Geospatial Workflow for Storm Surge Hazard Assessment

    NASA Astrophysics Data System (ADS)

    Catelli, J.; Nong, S.

    2014-12-01

    Gridded data of 100-yr (1%) and 500-yr (0.2%) storm surge flood elevations for the U.S. Gulf of Mexico and East Coast are critical to understanding this natural hazard. Storm surge heights were calculated across the study area utilizing SLOSH (Sea, Lake, and Overland Surges from Hurricanes) model data for thousands of synthetic U.S. landfalling hurricanes. Based on the results derived from SLOSH, a series of interpolations were performed using spatial analysis in a geographic information system (GIS) at both the SLOSH basin and the synthetic event levels. The result was a single grid of maximum flood elevations for each synthetic event. This project addresses the need to apply extreme value theory in a geospatial environment to analyze coincident cells across multiple synthetic events. The results are 100-yr (1%) and 500-yr (0.2%) values for each grid cell in the study area. This talk details a geospatial approach that moves raster data into NumPy arrays using the Python programming language. The data are then passed through a Python bridge to an external statistical package such as R to fit cell values to extreme value distributions and return values for specified recurrence intervals. While this is not a new process, the value of this work is the ability to keep the process in a single geospatial environment and to replicate it easily for other natural hazard applications and extreme event modeling.
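    The cell-wise step, fitting an extreme value distribution to coincident cells across events and reading off return levels, can be sketched directly in NumPy/SciPy. This is a simplified stand-in for the talk's GIS-to-R workflow; array sizes and parameters are illustrative:

```python
import numpy as np
from scipy.stats import genextreme

# Stack of per-event maximum surge grids: shape (n_events, ny, nx).
# In the real workflow each grid would come from a raster
# (e.g. arcpy.RasterToNumPyArray); here we fabricate a small stack.
rng = np.random.default_rng(3)
stack = genextreme.rvs(c=-0.1, loc=2.0, scale=0.5,
                       size=(200, 4, 5), random_state=rng)

def return_level_grid(stack, period):
    """Fit a GEV to the coincident cells across events and return the
    `period`-year return level for every grid cell."""
    n_events, ny, nx = stack.shape
    out = np.empty((ny, nx))
    q = 1.0 - 1.0 / period          # non-exceedance probability
    for j in range(ny):
        for i in range(nx):
            c, loc, scale = genextreme.fit(stack[:, j, i])
            out[j, i] = genextreme.ppf(q, c, loc=loc, scale=scale)
    return out

rl100 = return_level_grid(stack, 100)   # 100-yr (1%) elevation per cell
print(rl100.shape)  # (4, 5)
```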

  6. Predicting the cosmological constant with the scale-factor cutoff measure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    De Simone, Andrea; Guth, Alan H.; Salem, Michael P.

    2008-09-15

    It is well known that anthropic selection from a landscape with a flat prior distribution of cosmological constant λ gives a reasonable fit to observation. However, a realistic model of the multiverse has a physical volume that diverges with time, and the predicted distribution of λ depends on how the spacetime volume is regulated. A very promising method of regulation uses a scale-factor cutoff, which avoids a number of serious problems that arise in other approaches. In particular, the scale-factor cutoff avoids the 'youngness problem' (high probability of living in a much younger universe) and the 'Q and G catastrophes' (high probability for the primordial density contrast Q and gravitational constant G to have extremely large or small values). We apply the scale-factor cutoff measure to the probability distribution of λ, considering both positive and negative values. The results are in good agreement with observation. In particular, the scale-factor cutoff strongly suppresses the probability for values of λ that are more than about 10 times the observed value. We also discuss qualitatively the prediction for the density parameter Ω, indicating that with this measure there is a possibility of detectable negative curvature.

  7. NSRD-15:Computational Capability to Substantiate DOE-HDBK-3010 Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Louie, David; Bignell, John; Dingreville, Remi Philippe Michel

    Safety basis analysts throughout the U.S. Department of Energy (DOE) complex rely heavily on the information provided in the DOE Handbook, DOE-HDBK-3010, Airborne Release Fractions/Rates and Respirable Fractions for Nonreactor Nuclear Facilities, to determine radionuclide source terms from postulated accident scenarios. In calculating source terms, analysts tend to use the DOE Handbook's bounding values on airborne release fractions (ARFs) and respirable fractions (RFs) for various categories of insults (representing potential accident release categories). This is typically due to both time constraints and the avoidance of regulatory critique. Unfortunately, these bounding ARFs/RFs represent extremely conservative values. Moreover, they were derived from very limited small-scale bench/laboratory experiments and/or from engineering judgment. Thus, the basis for the data may not be representative of the actual unique accident conditions and configurations being evaluated. The goal of this research is to develop a more accurate and defensible method to determine bounding values for the DOE Handbook using state-of-the-art multi-physics-based computer codes.

  8. Monitoring physical and chemical parameters of Delaware Bay waters with an ERTS-1 data collection platform

    NASA Technical Reports Server (NTRS)

    Klemas, V. (Principal Investigator); Wethe, C.

    1975-01-01

    The author has identified the following significant results. Results of the analysis of data collected during the summer of 1974 demonstrate that the ERTS Data Collection Platform (DCP) is quite responsive to changing water parameters and that this information can be successfully transmitted under all weather conditions. The monitoring of on-site probe outputs reveals a rapid response to changing water temperature, salinity, and turbidity conditions on incoming tides as the tidal salt wedge passes the probe location. The changes in water properties were corroborated by simultaneously sampling the water for subsequent laboratory analysis. Fluctuations observed in the values of salinity, conductivity, temperature and water depth over short time intervals were extremely small. Due to the nature of the probe, 10% to 20% fluctuations were observed in the turbidity values. The use of the average of the values observed during an overpass provided acceptable results. Good quality data was obtained from the satellite on each overpass regardless of weather conditions. Continued use of the DCP will help provide an indication of the accuracy of the probes and transmission system during long term use.

  9. The Small Nuclear Genomes of Selaginella Are Associated with a Low Rate of Genome Size Evolution.

    PubMed

    Baniaga, Anthony E; Arrigo, Nils; Barker, Michael S

    2016-06-03

    The haploid nuclear genome size (1C DNA) of vascular land plants varies over several orders of magnitude. Much of this observed diversity in genome size is due to the proliferation and deletion of transposable elements. To date, all vascular land plant lineages with extremely small nuclear genomes represent recently derived states, having ancestors with much larger genome sizes. The Selaginellaceae represent an ancient lineage with extremely small genomes. It is unclear how small nuclear genomes evolved in Selaginella. We compared the rates of nuclear genome size evolution in Selaginella and major vascular plant clades in a comparative phylogenetic framework. For the analyses, we collected 29 new flow cytometry estimates of haploid genome size in Selaginella to augment publicly available data. Selaginella possess some of the smallest known haploid nuclear genome sizes, as well as the lowest rate of genome size evolution observed across all vascular land plants included in our analyses. Additionally, our analyses provide strong support for a history of haploid nuclear genome size stasis in Selaginella. Our results indicate that Selaginella, similar to other early diverging lineages of vascular land plants, has relatively low rates of genome size evolution. Further, our analyses highlight that a rapid transition to a small genome size is only one route to an extremely small genome. © The Author 2016. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.

  10. Value-at-risk estimation with wavelet-based extreme value theory: Evidence from emerging markets

    NASA Astrophysics Data System (ADS)

    Cifter, Atilla

    2011-06-01

    This paper introduces wavelet-based extreme value theory (EVT) for univariate value-at-risk estimation. Wavelets and EVT are combined for volatility forecasting to estimate a hybrid model. In the first stage, wavelets are used as a threshold in the generalized Pareto distribution, and in the second stage, EVT is applied with the wavelet-based threshold. This new model is applied to two major emerging stock markets: the Istanbul Stock Exchange (ISE) and the Budapest Stock Exchange (BUX). The relative performance of wavelet-based EVT is benchmarked against the Riskmetrics-EWMA, ARMA-GARCH, generalized Pareto distribution, and conditional generalized Pareto distribution models. The empirical results show that wavelet-based extreme value theory increases the predictive performance of financial forecasting according to the number of violations and tail-loss tests. The superior forecasting performance of the wavelet-based EVT model is also consistent with Basel II requirements, and this new model can be used by financial institutions as well.
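    The two-stage idea, a data-driven threshold followed by a generalized Pareto fit to the exceedances, can be sketched as below. The Haar-detail threshold here is a loose stand-in for the paper's wavelet stage, and all parameters are illustrative assumptions:

```python
import numpy as np
from scipy.stats import genpareto

def wavelet_style_threshold(returns):
    """Stage 1 (sketch): one-level Haar detail coefficients of |returns|;
    the universal threshold sigma*sqrt(2 ln n) on that scale stands in
    for the paper's wavelet-based threshold choice."""
    x = np.abs(returns)
    n = len(x) // 2 * 2
    detail = (x[0:n:2] - x[1:n:2]) / np.sqrt(2.0)   # Haar detail coefficients
    sigma = np.median(np.abs(detail)) / 0.6745      # robust scale (MAD)
    return sigma * np.sqrt(2.0 * np.log(len(x)))

def evt_var(returns, alpha=0.99):
    """Stage 2: fit a generalized Pareto to losses above the threshold
    and read off the alpha-quantile (value-at-risk)."""
    losses = -np.asarray(returns)
    u = wavelet_style_threshold(returns)
    exc = losses[losses > u] - u
    xi, _, beta = genpareto.fit(exc, floc=0.0)
    n, n_u = len(losses), len(exc)
    # GPD tail quantile: VaR = u + beta/xi * ((n/n_u * (1-alpha))**(-xi) - 1)
    return u + beta / xi * ((n / n_u * (1.0 - alpha)) ** (-xi) - 1.0)

rng = np.random.default_rng(4)
r = rng.standard_t(df=4, size=4000) * 0.01     # fat-tailed synthetic "returns"
print(evt_var(r) > np.quantile(-r, 0.95))      # VaR99 should exceed the 95% loss
```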

  11. PERSISTENCE MAPPING USING EUV SOLAR IMAGER DATA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thompson, B. J.; Young, C. A., E-mail: barbara.j.thompson@nasa.gov

    We describe a simple image processing technique that is useful for the visualization and depiction of gradually evolving or intermittent structures in solar physics extreme-ultraviolet imagery. The technique is an application of image segmentation, which we call “Persistence Mapping,” to isolate extreme values in a data set, and is particularly useful for the problem of capturing phenomena that are evolving in both space and time. While integration or “time-lapse” imaging uses the full sample (of size N), Persistence Mapping rejects (N − 1)/N of the data set and identifies the most relevant 1/N values using the following rule: if a pixel reaches an extreme value, it retains that value until that value is exceeded. The simplest examples isolate minima and maxima, but any quantile or statistic can be used. This paper demonstrates how the technique has been used to extract the dynamics in long-term evolution of comet tails, erupting material, and EUV dimming regions.
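    The stated rule maps directly onto a running extreme over the time axis of an image stack; a minimal sketch (not the authors' code):

```python
import numpy as np

def persistence_map(frames, kind="max"):
    """Persistence Mapping: each pixel keeps its most extreme value so far;
    a value persists until it is exceeded (maxima) or undercut (minima)."""
    op = np.maximum if kind == "max" else np.minimum
    return op.accumulate(frames, axis=0)   # running extreme over time

# Three tiny 2x2 "EUV frames": a bright transient in one pixel persists.
frames = np.array([[[1, 2], [3, 4]],
                   [[5, 1], [1, 1]],
                   [[2, 2], [2, 2]]], dtype=float)
final = persistence_map(frames)[-1]        # map after the last frame
print(final)  # [[5. 2.] [3. 4.]]
```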

  12. Trees, soils, and food security

    PubMed Central

    Sanchez, P. A.; Buresh, R. J.; Leakey, R. R. B.

    1997-01-01

    Trees have a different impact on soil properties than annual crops, because of their longer residence time, larger biomass accumulation, and longer-lasting, more extensive root systems. In natural forests nutrients are efficiently cycled with very small inputs and outputs from the system. In most agricultural systems the opposite happens. Agroforestry encompasses the continuum between these extremes, and emerging hard data are showing that successful agroforestry systems increase nutrient inputs, enhance internal flows, decrease nutrient losses, and provide environmental benefits when the competition for growth resources between the tree and the crop component is well managed. The three main determinants for overcoming rural poverty in Africa are (i) reversing soil fertility depletion, (ii) intensifying and diversifying land use with high-value products, and (iii) providing an enabling policy environment for the smallholder farming sector. Agroforestry practices can improve food production in a sustainable way through their contribution to soil fertility replenishment. The use of organic inputs as a source of biologically fixed nitrogen, together with deep nitrate that is captured by trees, plays a major role in nitrogen replenishment. The combination of commercial phosphorus fertilizers with available organic resources may be the key to increasing and sustaining phosphorus capital. High-value trees, 'Cinderella' species, can fit in specific niches on farms, thereby making the system ecologically stable and more rewarding economically, in addition to diversifying and increasing rural incomes and improving food security. In the most heavily populated areas of East Africa, where farm size is extremely small, the number of trees on farms is increasing as farmers seek to reduce labour demands, compatible with the drift of some members of the family into the towns to earn off-farm income. 
Contrary to the concept that population pressure promotes deforestation, there is evidence that demonstrates that there are conditions under which increasing tree planting is occurring on farms in the tropics through successful agroforestry as human population density increases.

  13. Ultra-low current biosensor output detection using portable electronic reader

    NASA Astrophysics Data System (ADS)

    Yahaya, N. A. N.; Rajapaksha, R. D. A. A.; Uda, M. N. Afnan; Hashim, U.

    2017-09-01

    Generally, electrical biosensors show extremely low current signal outputs, in the picoampere to microampere range. In this research, an electronic reader with an amplifier is demonstrated for detecting ultra-low currents through a biosensor. The operational amplifier Burr-Brown OPA 128 and an Arduino Uno board were used to construct the portable electronic reader. Two cascaded inverting amplifiers were used to detect ultra-low currents through the biosensor in the picoampere (pA) to nanoampere (nA) range. A small known input current was formed by applying a variable voltage between 0.1 V and 5.0 V across a 5 GΩ resistor to check the amplifier circuit. The amplifier operation was measured with this high-impedance current source and compared with the theoretical values. The Arduino Uno was used to convert the analog signal to a digital signal and process the data for display on the reader screen. Proteus software was used to design and test the circuit, which was then implemented together with the Arduino Uno board. The Arduino was programmed in C so that the whole circuit communicates as one system. The measured current showed only a small difference from the theoretical values, approximately 14 pA.

  14. Performance verification of the Gravity and Extreme Magnetism Small explorer (GEMS) x-ray polarimeter

    NASA Astrophysics Data System (ADS)

    Enoto, Teruaki; Black, J. Kevin; Kitaguchi, Takao; Hayato, Asami; Hill, Joanne E.; Jahoda, Keith; Tamagawa, Toru; Kaneko, Kenta; Takeuchi, Yoko; Yoshikawa, Akifumi; Marlowe, Hannah; Griffiths, Scott; Kaaret, Philip E.; Kenward, David; Khalid, Syed

    2014-07-01

    Polarimetry is a powerful tool for astrophysical observations that has yet to be exploited in the X-ray band. For satellite-borne and sounding rocket experiments, we have developed a photoelectric gas polarimeter to measure X-ray polarization in the 2-10 keV range utilizing a time projection chamber (TPC) and advanced micro-pattern gas electron multiplier (GEM) techniques. We carried out performance verification of a flight equivalent unit (1/4 model) which was planned to be launched on the NASA Gravity and Extreme Magnetism Small Explorer (GEMS) satellite. The test was performed at Brookhaven National Laboratory, National Synchrotron Light Source (NSLS) facility in April 2013. The polarimeter was irradiated with linearly-polarized monochromatic X-rays between 2.3 and 10.0 keV and scanned with a collimated beam at 5 different detector positions. After a systematic investigation of the detector response, a modulation factor >=35% above 4 keV was obtained with the expected polarization angle. At energies below 4 keV where the photoelectron track becomes short, diffusion in the region between the GEM and readout strips leaves an asymmetric photoelectron image. A correction method retrieves an expected modulation angle, and the expected modulation factor, ~20% at 2.7 keV. Folding the measured values of modulation through an instrument model gives sensitivity, parameterized by minimum detectable polarization (MDP), nearly identical to that assumed at the preliminary design review (PDR).
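
    The MDP figure mentioned at the end is conventionally computed from the modulation factor and the count rates; a sketch using the standard 99%-confidence expression (the source rate, background rate and exposure below are assumed for illustration, not taken from the GEMS tests):

```python
import math

def mdp99(mu, rate_src, rate_bkg, t_obs):
    """Minimum detectable polarization at 99% confidence (standard formula):
    MDP = 4.29 / (mu * R_s) * sqrt((R_s + R_b) / T)."""
    return 4.29 / (mu * rate_src) * math.sqrt((rate_src + rate_bkg) / t_obs)

# Illustrative numbers: modulation factor 0.35 (the value measured above
# 4 keV), with an assumed 1 ct/s source, 0.1 ct/s background, 100 ks exposure.
mdp = mdp99(0.35, 1.0, 0.1, 1e5)
```

    Because of the square root, quadrupling the exposure halves the MDP.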

  15. Reliability of the mangled extremity severity score in combat-related upper and lower extremity injuries.

    PubMed

    Ege, Tolga; Unlu, Aytekin; Tas, Huseyin; Bek, Dogan; Turkan, Selim; Cetinkaya, Aytac

    2015-01-01

    The decision between limb salvage and amputation is generally aided by trauma scoring systems such as the mangled extremity severity score (MESS). However, the reliability of these injury scores in the setting of open fractures due to explosives and missiles is challenging. Mortality and morbidity of extremity trauma due to firearms are generally associated with the time delay to revascularization, injury mechanism, anatomy of the injured site, associated injuries, age and environmental circumstances. The purpose of this retrospective study was to evaluate the extent of extremity injuries due to ballistic missiles and to assess the reliability of the MESS in both upper and lower extremities. Between 2004 and 2014, 139 Gustilo-Anderson Type III open fractures of the upper and lower extremities were enrolled in the study. Data on patient age, firearm type, transport time from the field to the hospital (and the method of transport), injury severity scores, MESS scores, fracture types, amputation levels, bone fixation methods, and postoperative infections and complications were retrieved from the two level-2 trauma centers' databases. Sensitivity, specificity, and positive and negative predictive values of the MESS were calculated to assess its ability to indicate amputation in the mangled limb. Amputation was performed in 39 extremities and limb salvage attempted in 100 extremities. The mean follow-up time was 14.6 months (range 6-32 months). In the amputated group, the mean MESS scores for the upper and lower extremity were 8.8 (range 6-11) and 9.24 (range 6-11), respectively. In the limb salvage group, the mean MESS scores for the upper and lower extremities were 5.29 (range 4-7) and 5.19 (range 3-8), respectively. Sensitivity of the MESS in the upper and lower extremities was calculated as 80% and 79.4%, and positive predictive values as 55.55% and 83.3%, respectively. 
Specificity of the MESS for the upper and lower extremities was 84% and 86.6%; negative predictive values were calculated as 95.45% and 90.2%, respectively. The MESS is not predictive in combat-related extremity injuries, especially for scores between 6 and 8. Limb ischemia and the presence or absence of shock can be used in initial decision-making for amputation.
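
    The four reported figures all come from a standard 2×2 confusion table; a minimal sketch (the counts below are hypothetical, not the study's data):

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 confusion table."""
    return {
        "sensitivity": tp / (tp + fn),  # amputations correctly predicted
        "specificity": tn / (tn + fp),  # salvages correctly predicted
        "ppv": tp / (tp + fp),          # predicted amputations that were real
        "npv": tn / (tn + fn),          # predicted salvages that were real
    }

# Hypothetical counts, scoring "MESS above threshold" against the actual
# surgical outcome (illustrative only; not taken from the study):
m = diagnostic_metrics(tp=27, fp=6, tn=84, fn=7)
```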

  17. Statistical Modeling of Extreme Values and Evidence of Presence of Dragon King (DK) in Solar Wind

    NASA Astrophysics Data System (ADS)

    Gomes, T.; Ramos, F.; Rempel, E. L.; Silva, S.; C-L Chian, A.

    2017-12-01

    The solar wind constitutes a nonlinear dynamical system, presenting intermittent turbulence, multifractality and chaotic dynamics. One characteristic shared by many such complex systems is the presence of extreme events, which play an important role in several geophysical phenomena; their statistical characterization is a problem of great practical relevance. This work investigates the presence of extreme events in time series of the modulus of the interplanetary magnetic field measured by the Cluster spacecraft on February 2, 2002. One of the main results is that the solar wind near the Earth's bow shock can be modeled by the Generalized Pareto (GP) and Generalized Extreme Value (GEV) distributions. Both models present a statistically significant positive shape parameter, which implies a heavy tail in the probability distribution functions and an unbounded growth in return values as return periods grow long. There is evidence that current sheets are mainly responsible for positive values of the shape parameter. It is also shown that magnetic reconnection at the interface between two interplanetary magnetic flux ropes in the solar wind can be considered a Dragon King (DK), a class of extreme events whose formation mechanisms are fundamentally different from the others. As long as magnetic reconnection can be classified as a Dragon King, there is the possibility of its identification and even its prediction. Dragon Kings had previously been identified in time series of financial crashes, nuclear power generation accidents and stock market data. It is believed that they are associated with the occurrence of extreme events in dynamical systems at phase transitions, bifurcations, crises or tipping points.
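
    Fitting a Generalized Pareto model and reading off the sign of the shape parameter can be sketched as follows; the data here are a synthetic stand-in (the Cluster measurements are not reproduced):

```python
import numpy as np
from scipy.stats import genpareto

# Synthetic stand-in for threshold exceedances of |B|: draw from a
# heavy-tailed GPD with a known positive shape parameter.
true_shape = 0.3  # xi > 0 => heavy tail, unbounded return-value growth
data = genpareto.rvs(true_shape, size=5000, random_state=42)

# Fit with the threshold (location) fixed at zero, as is usual for
# peaks-over-threshold exceedances.
shape_hat, loc_hat, scale_hat = genpareto.fit(data, floc=0)
```

    A fitted shape significantly above zero is the heavy-tail signature the abstract refers to.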

  18. The Total Ozone Series of Arosa: History, Homogenization and new results using statistical extreme value theory

    NASA Astrophysics Data System (ADS)

    Staehelin, J.; Rieder, H. E.; Maeder, J. A.; Ribatet, M.; Davison, A. C.; Stübi, R.

    2009-04-01

    Atmospheric ozone protects the biota living at the Earth's surface from harmful solar UV-B and UV-C radiation. The global ozone shield is expected to gradually recover from the anthropogenic disturbance of ozone-depleting substances (ODS) in the coming decades. The extratropical stratospheric ozone layer might significantly exceed the thickness of the chemically undisturbed atmosphere, which could enhance ozone concentrations at tropopause altitudes, where ozone is an important greenhouse gas. At Arosa, a resort village in the Swiss Alps, total ozone measurements started in 1926, yielding the longest total ozone series in the world. One Fery spectrograph and seven Dobson spectrophotometers were operated at Arosa, and the method used to homogenize the series will be presented. Due to its unique length the series allows studying total ozone in the chemically undisturbed as well as in the ODS-loaded stratosphere. The series is particularly valuable for studying natural variability in the period prior to 1970, when ODS started to affect stratospheric ozone. Concepts developed in extreme value statistics allow objective definitions of "ozone extreme high" and "ozone extreme low" values by fitting the (daily mean) time series using the Generalized Pareto Distribution (GPD). Extreme high ozone events can be attributed to effects of El Niño and/or the NAO, whereas in the chemically disturbed stratosphere high frequencies of extreme low total ozone values occur simultaneously with periods of strong polar ozone depletion (identified by statistical modeling with Equivalent Stratospheric Chlorine times the Volume of Polar Stratospheric Clouds) and volcanic eruptions (such as El Chichón and Pinatubo).

  19. Determination of boundaries between ranges of high and low gradient of beam profile.

    PubMed

    Wendykier, Jacek; Bieniasiewicz, Marcin; Grządziel, Aleksandra; Jedynak, Tadeusz; Kośniewski, Wiktor; Reudelsdorf, Marta; Wendykier, Piotr

    2016-01-01

    This work addresses the problem of treatment planning system commissioning by introducing a new method for determining the boundaries between the high- and low-gradient regions of a beam profile. The commissioning of a treatment planning system is a very important task in radiation therapy. One of the main goals of this task is to compare two field profiles: measured and calculated. Applying points at 80% and 120% of the nominal field size can lead to incorrect determination of the boundaries, especially for small field sizes. The method, which is based on the beam profile gradient, allows proper assignment of the boundaries between high- and low-gradient regions even for small fields. TRS 430 recommendations for commissioning were used. The described method achieves a separation between high and low gradient because it directly uses the value of the gradient of the profile. For small fields, the boundaries determined by the new method allow commissioning of a treatment planning system according to TRS 430, whereas the point at 80% of the nominal field size is already in the high-gradient region. The method of determining the boundaries using the beam profile gradient can be extremely helpful during commissioning of the treatment planning system for Intensity Modulated Radiation Therapy or for other techniques which require very small field sizes.
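
    The gradient-threshold idea can be sketched numerically; the synthetic profile, edge shape and threshold value below are all illustrative assumptions (the paper's exact TRS 430 prescriptions are not reproduced):

```python
import numpy as np
from scipy.special import erf

def gradient_boundaries(positions, profile, threshold):
    """Indices where |d(profile)/d(position)| crosses `threshold`,
    separating high- and low-gradient regions of a beam profile."""
    grad = np.abs(np.gradient(profile, positions))
    high = grad > threshold
    # A boundary is any index where the high/low classification flips.
    return np.nonzero(np.diff(high.astype(int)))[0]

# Synthetic profile: flat plateau with steep error-function penumbrae,
# roughly a 40 mm field (assumed shape, for illustration only).
x = np.linspace(-30, 30, 601)                      # position, mm
p = 0.5 * (erf((x + 20) / 2) - erf((x - 20) / 2))  # normalized dose profile
edges = gradient_boundaries(x, p, threshold=0.05)  # enters/leaves each penumbra
```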

  20. Incidence of elbow injuries in adolescent baseball players: screening by a low field magnetic resonance imaging system specialized for small joints.

    PubMed

    Okamoto, Yoshikazu; Maehara, Kiyoshi; Kanahori, Tetsuya; Hiyama, Takashi; Kawamura, Takashi; Minami, Manabu

    2016-04-01

    The aim of this preliminary study was to examine the capability of screening for elbow injuries induced by baseball using a low field small joint MRI system. Sixty-two players in the 4th-6th elementary school grades, with ages ranging from 9 to 12 years, participated in this study. Screening for elbow injuries was performed using a low-magnetic-field (0.2-T) magnetic resonance imaging (MRI) system designed for examinations of small joints of the extremities. Gradient-echo coronal, sagittal, and short-tau inversion recovery (STIR) coronal images of the dominant arm used for pitching were obtained to identify medial collateral ligament (MCL) injuries with or without avulsion fracture and osteochondritis dissecans. All 62 examinations were performed successfully, with 26 players (41.9%) showing positive findings, all being confined to the MCL. No child showed bone damage. All criteria in the MRI evaluation of injuries showed high agreement rates and kappa values between two radiologists. Screening for early detection of elbow injuries in junior Japanese baseball players can be successfully performed using a low-field MRI system specialized for small joints. The percentage of MCL injury without avulsion fracture was unexpectedly high (41.9%).

  1. Evidence for multidecadal variability in US extreme sea level records

    NASA Astrophysics Data System (ADS)

    Wahl, Thomas; Chambers, Don P.

    2015-03-01

    We analyze a set of 20 tide gauge records covering the contiguous United States (US) coastline and the period from 1929 to 2013 to identify long-term trends and multidecadal variations in extreme sea levels (ESLs) relative to changes in mean sea level (MSL). Different data sampling and analysis techniques are applied to test the robustness of the results against the selected methodology. Significant but small long-term trends in ESLs above/below MSL are found at individual sites along most coastline stretches, but are mostly confined to the southeast coast and the winter season when storm surges are primarily driven by extratropical cyclones. We identify six regions with broadly coherent and considerable multidecadal ESL variations unrelated to MSL changes. Using a quasi-nonstationary extreme value analysis, we show that the latter would have caused variations in design relevant return water levels (50-200 year return periods) ranging from ~10 cm to as much as 110 cm across the six regions. The results raise questions as to the applicability of the "MSL offset method," assuming that ESL changes are primarily driven by changes in MSL without allowing for distinct long-term trends or low-frequency variations. Identifying the coherent multidecadal ESL variability is crucial in order to understand the physical driving factors. Ultimately, this information must be included into coastal design and adaptation processes.
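
    Return water levels such as the 50-200 year values above are read off an extreme value fit; a minimal sketch with a GEV model for annual maxima (all parameter values below are assumed, purely for illustration):

```python
from scipy.stats import genextreme

# Hypothetical GEV parameters for annual maximum surge height in metres.
# Note scipy's shape c corresponds to -xi in the usual EVT convention,
# so c = -0.1 means a mildly heavy (Frechet-type) upper tail.
c, loc, scale = -0.1, 1.0, 0.25

def return_level(T):
    """Level exceeded on average once every T years under the fitted GEV."""
    return genextreme.isf(1.0 / T, c, loc=loc, scale=scale)

levels = {T: return_level(T) for T in (50, 100, 200)}
```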

  2. Variability in winter climate and winter extremes reduces population growth of an alpine butterfly.

    PubMed

    Roland, Jens; Matter, Stephen F

    2013-01-01

    We examined the long-term, 15-year pattern of population change in a network of 21 Rocky Mountain populations of Parnassius smintheus butterflies in response to climatic variation. We found that winter values of the broadscale climate variable, the Pacific Decadal Oscillation (PDO) index, were a strong predictor of annual population growth, much more so than were endogenous biotic factors related to population density. The relationship between PDO and population growth was nonlinear. Populations declined in years with extreme winter PDO values, when there were either extremely warm or extremely cold sea surface temperatures in the eastern Pacific relative to that in the western Pacific. Results suggest that more variable winters, and more frequent extremely cold or warm winters, will result in more frequent decline of these populations, a pattern exacerbated by the trend for increasingly variable winters seen over the past century.

  3. Validation of extremes within the Perfect-Predictor Experiment of the COST Action VALUE

    NASA Astrophysics Data System (ADS)

    Hertig, Elke; Maraun, Douglas; Wibig, Joanna; Vrac, Mathieu; Soares, Pedro; Bartholy, Judith; Pongracz, Rita; Mares, Ileana; Gutierrez, Jose Manuel; Casanueva, Ana; Alzbutas, Robertas

    2016-04-01

    Extreme events are of widespread concern due to their damaging consequences for natural and anthropogenic systems. From science to applications, the statistical attributes of rare occurrence and low probability become connected with the socio-economic aspect of strong impact. Specific end-user needs regarding information about extreme events depend on the type of application, but a unifying element is the request for easily accessible climate change information with a clear description of its uncertainties and limitations. Within the Perfect-Predictor Experiment of the COST Action VALUE, extreme indices modelled with a wide range of downscaling methods are compared to reference indices calculated from observational data. The experiment uses reference data from a selection of 86 weather stations representative of the different climates of Europe. Results are presented for temperature and precipitation extremes and include aspects of the marginal distribution as well as spell-length related aspects.

  4. Extremes in ecology: Avoiding the misleading effects of sampling variation in summary analyses

    USGS Publications Warehouse

    Link, W.A.; Sauer, J.R.

    1996-01-01

    Surveys such as the North American Breeding Bird Survey (BBS) produce large collections of parameter estimates. One's natural inclination when confronted with lists of parameter estimates is to look for the extreme values: in the BBS, these correspond to the species that appear to have the greatest changes in population size through time. Unfortunately, extreme estimates are liable to correspond to the most poorly estimated parameters. Consequently, the most extreme parameters may not match up with the most extreme parameter estimates. Ranking parameter values on the basis of their estimates is a difficult statistical problem. We use data from the BBS and simulations to illustrate the potentially misleading effects of sampling variation on rankings of parameters. We describe empirical Bayes and constrained empirical Bayes procedures which provide partial solutions to the problem of ranking in the presence of sampling variation.
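
    The empirical Bayes idea can be sketched in its simplest Gaussian form: shrink each estimate toward the grand mean in proportion to how noisy it is, so a poorly estimated "extreme" stops dominating the ranking. The data and the assumed-known between-parameter variance below are purely illustrative:

```python
import numpy as np

def eb_shrink(estimates, se, tau2):
    """Shrink noisy estimates toward the grand mean. The factor
    tau2 / (tau2 + se^2) pulls the most poorly estimated values hardest.
    (Illustrative Gaussian-Gaussian version with tau2 assumed known.)"""
    estimates = np.asarray(estimates, float)
    se = np.asarray(se, float)
    mu = estimates.mean()
    b = tau2 / (tau2 + se ** 2)
    return mu + b * (estimates - mu)

trends = [5.0, -4.0, 1.0, 0.5]   # apparent population trends (hypothetical)
ses    = [3.0,  0.3, 0.5, 0.4]   # the extreme +5.0 is also the noisiest
shrunk = eb_shrink(trends, ses, tau2=1.0)
```

    After shrinkage, the precisely estimated -4.0 trend, not the noisy +5.0, is the most extreme, which is exactly the reranking effect the abstract describes.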

  5. Avian thermoregulation in the heat: metabolism, evaporative cooling and gular flutter in two small owls.

    PubMed

    Talbot, William A; Gerson, Alexander R; Smith, Eric Krabbe; McKechnie, Andrew E; Wolf, Blair O

    2018-06-20

    The thermoregulatory responses of owls to heat stress have been the subject of few studies. Although nocturnality buffers desert-dwelling owls from significant heat stress during activity, roost sites in tree and cactus cavities or in deep shade provide only limited refuge from high environmental temperatures during the day. We measured thermoregulatory responses to acute heat stress in two species of small owls, the elf owl (Micrathene whitneyi) and the western screech-owl (Megascops kennicottii), which occupy the Sonoran Desert of southwestern North America, an area of extreme heat and aridity. We exposed wild-caught birds to progressively increasing air temperatures (Ta) and measured resting metabolic rate (RMR), evaporative water loss (EWL), body temperature (Tb) and heat tolerance limits (HTL; the maximum Ta reached). Comparatively low RMR values were observed in both species, Tb approximated Ta at 40°C, and mild hyperthermia occurred as Ta was increased toward the HTL. Elf owls and screech-owls reached HTLs of 48 and 52°C, respectively, and RMR increased to 1.5 and 1.9 times thermoneutral values. Rates of EWL at the HTL allowed for the dissipation of 167-198% of metabolic heat production (MHP). Gular flutter was used as the primary means of evaporative heat dissipation and produced large increases in evaporative heat loss (44-100%), accompanied by only small increases (<5%) in RMR. These small, cavity-nesting owls have thermoregulatory capacities that are intermediate between those of the open-ground nesting nightjars and the passerines that occupy the same ecosystem. © 2018. Published by The Company of Biologists Ltd.

  6. Power laws and extreme values in antibody repertoires

    NASA Astrophysics Data System (ADS)

    Boyer, Sebastien; Biswas, Dipanwita; Scaramozzino, Natale; Kumar, Ananda Soshee; Nizak, Clément; Rivoire, Olivier

    2015-03-01

    Evolution by natural selection involves the succession of three steps: mutation, selection and proliferation. We are interested in describing and characterizing the result of selection over a population of many variants. After selection, this population will be dominated by the few best variants, with the highest propensity to be selected, or highest "selectivity". We ask the following question: how is the selectivity of the best variants distributed in the population? Extreme value theory, which characterizes the extreme tail of probability distributions in terms of a few universality classes, has been proposed to describe it. To test this proposition and identify the relevant universality class, we performed quantitative in vitro experimental selections of libraries of >10^5 antibodies using the technique of phage display. Data obtained by high-throughput sequencing allow us to fit the selectivity distribution over more than two decades. In most experiments, the results show a striking power law for the selectivity distribution of the top antibodies, consistent with extreme value theory.
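
    A power-law tail index can be estimated from the largest order statistics with the Hill estimator; a sketch on synthetic stand-in data (the sequencing counts themselves are not reproduced, and the tail index here is assumed):

```python
import numpy as np

def hill_estimator(sample, k):
    """Hill estimate of the power-law tail index from the k largest values:
    alpha_hat = k / sum(log(x_(i) / x_(k+1)))."""
    x = np.sort(np.asarray(sample, float))[::-1]  # descending order statistics
    return k / np.sum(np.log(x[:k] / x[k]))

# Synthetic "selectivities" with a Pareto tail of index alpha = 2:
# if U ~ Uniform(0, 1], then U**(-1/alpha) has survival function x**(-alpha).
rng = np.random.default_rng(0)
u = 1.0 - rng.random(100_000)        # uniform in (0, 1]
sample = u ** (-1.0 / 2.0)           # true tail index alpha = 2
alpha_hat = hill_estimator(sample, k=1000)
```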

  7. Evaluation of the best fit distribution for partial duration series of daily rainfall in Madinah, western Saudi Arabia

    NASA Astrophysics Data System (ADS)

    Alahmadi, F.; Rahman, N. A.; Abdulrazzak, M.

    2014-09-01

    Rainfall frequency analysis is an essential tool for the design of water-related infrastructure. It can be used to predict future flood magnitudes from the magnitude and frequency of extreme rainfall events. This study analyses the application of a rainfall partial duration series (PDS) in the fast-growing city of Madinah, located in western Saudi Arabia. Different statistical distributions were applied (i.e. Normal, Log Normal, Extreme Value type I, Generalized Extreme Value, Pearson Type III, Log Pearson Type III) and their parameters were estimated using L-moment methods. Different model selection criteria were also applied, e.g. the Akaike Information Criterion (AIC), Corrected Akaike Information Criterion (AICc), Bayesian Information Criterion (BIC) and Anderson-Darling Criterion (ADC). The analysis indicated the advantage of the Generalized Extreme Value distribution as the best fit for the Madinah partial duration daily rainfall series. The outcome of such an evaluation can contribute toward better design criteria for flood management, especially flood protection measures.
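
    Criterion-based selection among candidate distributions can be sketched as below (the study used L-moments for parameter estimation; this sketch uses maximum likelihood instead, and the data and parameters are synthetic assumptions):

```python
import numpy as np
from scipy import stats

def aic(dist, data):
    """Fit `dist` by maximum likelihood and return AIC = 2k - 2*logL."""
    params = dist.fit(data)
    loglik = np.sum(dist.logpdf(data, *params))
    return 2 * len(params) - 2 * loglik

# Synthetic daily-maximum "rainfall" drawn from a GEV (assumed parameters);
# the criterion should then prefer the GEV over the other candidates.
data = stats.genextreme.rvs(-0.2, loc=20, scale=8, size=2000, random_state=1)
scores = {name: aic(d, data)
          for name, d in [("normal", stats.norm),
                          ("gumbel", stats.gumbel_r),
                          ("GEV", stats.genextreme)]}
best = min(scores, key=scores.get)  # lowest AIC wins
```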

  8. Variance analysis of forecasted streamflow maxima in a wet temperate climate

    NASA Astrophysics Data System (ADS)

    Al Aamery, Nabil; Fox, James F.; Snyder, Mark; Chandramouli, Chandra V.

    2018-05-01

    Coupling global climate models, hydrologic models and extreme value analysis provides a method to forecast streamflow maxima; however, the elusive variance structure of the results hinders confidence in application. Directly correcting the bias of forecasts using the relative change between forecast and control simulations has been shown to marginalize hydrologic uncertainty, reduce model bias, and remove systematic variance when predicting mean monthly and mean annual streamflow, prompting our investigation for streamflow maxima. We assess the variance structure of streamflow maxima using realizations of emission scenario, global climate model type and project phase, downscaling methods, bias correction, extreme value methods, and hydrologic model inputs and parameterization. Results show that the relative change of streamflow maxima was not dependent on systematic variance from the annual maxima versus peaks-over-threshold method applied, although we stress that researchers strictly adhere to rules from extreme value theory when applying the peaks-over-threshold method. Regardless of which method is applied, extreme value model fitting does add variance to the projection, and the variance is an increasing function of the return period. Unlike the relative change of mean streamflow, results show that the variance of the maxima's relative change was dependent on all climate model factors tested as well as hydrologic model inputs and calibration. Ensemble projections forecast an increase of streamflow maxima for 2050 with pronounced forecast standard error, including increases of +30(±21), +38(±34) and +51(±85)% for 2, 20 and 100 year streamflow events for the wet temperate region studied. The variance of maxima projections was dominated by climate model factors and extreme value analyses.

  9. Relationship between diffusion tensor fractional anisotropy and motor outcome in patients with hemiparesis after corona radiata infarct.

    PubMed

    Koyama, Tetsuo; Marumoto, Kohei; Miyake, Hiroji; Domen, Kazuhisa

    2013-11-01

    This study examined the relationship between fractional anisotropy (FA) values of magnetic resonance-diffusion tensor imaging (DTI) and motor outcome (1 month after onset) in 15 patients with hemiparesis after ischemic stroke of corona radiata lesions. DTI data were obtained on days 14-18. FA values within the cerebral peduncle were analyzed using a computer-automated method. Motor outcome of hemiparesis was evaluated according to Brunnstrom stage (BRS; 6-point scale: severe to normal) for separate shoulder/elbow/forearm, wrist/hand, and lower extremity functions. The ratio of FA values in the affected hemisphere to those in the unaffected hemisphere (rFA) was assessed in relation to the BRS data (Spearman rank correlation test, P<.05). rFA values ranged from .715 to 1.002 (median=.924). BRS ranged from 1 to 6 (median=4) for shoulder/elbow/forearm, from 1 to 6 (median=5) for wrist/hand, and from 2 to 6 (median=4) for the lower extremities. Analysis revealed statistically significant relationships between rFA and upper extremity functions (correlation coefficient=.679 for shoulder/elbow/forearm and .706 for wrist/hand). Although slightly less evident, the relationship between rFA and lower extremity function was also statistically significant (correlation coefficient=.641). FA values within the cerebral peduncle are moderately associated with the outcome of both upper and lower extremity functions, suggesting that DTI may be applicable for outcome prediction in stroke patients with corona radiata infarct. Copyright © 2013 National Stroke Association. Published by Elsevier Inc. All rights reserved.
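
    The Spearman rank correlation used here handles the ordinal BRS scale and ties naturally; a minimal sketch (the paired values below are hypothetical, not the study's measurements):

```python
from scipy.stats import spearmanr

# Hypothetical paired data: rFA ratios (affected/unaffected peduncle) and
# ordinal Brunnstrom stages for a handful of patients (illustrative only).
rfa = [0.72, 0.80, 0.85, 0.90, 0.93, 0.97, 1.00]
brs = [1, 2, 3, 4, 4, 5, 6]          # note the tie at stage 4

rho, p = spearmanr(rfa, brs)         # rank correlation with tie handling
```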

  10. Nearly extremal apparent horizons in simulations of merging black holes

    NASA Astrophysics Data System (ADS)

    Lovelace, Geoffrey; Scheel, Mark A.; Owen, Robert; Giesler, Matthew; Katebi, Reza; Szilágyi, Béla; Chu, Tony; Demos, Nicholas; Hemberger, Daniel A.; Kidder, Lawrence E.; Pfeiffer, Harald P.; Afshari, Nousha

    2015-03-01

    The spin angular momentum S of an isolated Kerr black hole is bounded by the surface area A of its apparent horizon: 8πS ≤ A, with equality for extremal black holes. In this paper, we explore the extremality of individual and common apparent horizons for merging, rapidly spinning binary black holes. We consider simulations of merging black holes with equal masses M and initial spin angular momenta aligned with the orbital angular momentum, including new simulations with spin magnitudes up to S/M² = 0.994. We measure the area and (using approximate Killing vectors) the spin on the individual and common apparent horizons, finding that the inequality 8πS < A is satisfied in all cases but is very close to equality on the common apparent horizon at the instant it first appears. We also evaluate the Booth-Fairhurst extremality, whose value for a given apparent horizon depends on the scaling of the horizon's null normal vectors. In particular, we introduce a gauge-invariant lower bound on the extremality by computing the smallest value that Booth and Fairhurst's extremality parameter can take for any scaling. Using this lower bound, we conclude that the common horizons are at least moderately close to extremal just after they appear. Finally, following Lovelace et al (2008 Phys. Rev. D 78 084017), we construct quasiequilibrium binary-black hole initial data with 'overspun' marginally trapped surfaces with 8πS > A. We show that the overspun surfaces are indeed superextremal: our lower bound on their Booth-Fairhurst extremality exceeds unity. However, we confirm that these superextremal surfaces are always surrounded by marginally outer trapped surfaces (i.e., by apparent horizons) with 8πS < A. The extremality lower bound on the enclosing apparent horizon is always less than unity but can exceed the value for an extremal Kerr black hole.
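
    For an isolated Kerr black hole the bound 8πS ≤ A can be checked directly against the closed-form horizon area; a minimal sketch in geometric units (the formulas are standard Kerr results; the spin value 0.994 is from the abstract):

```python
import math

def kerr_horizon_area(m, chi):
    """Kerr horizon area in G = c = 1 units, with chi = a/m:
    A = 8*pi*m*(m + sqrt(m^2 - a^2))."""
    a = chi * m
    return 8.0 * math.pi * m * (m + math.sqrt(m * m - a * a))

def kerr_spin(m, chi):
    """Spin angular momentum S = a*m = chi*m^2."""
    return chi * m * m

# 8*pi*S / A approaches 1 only in the extremal limit chi -> 1;
# at chi = 0.994 (the highest spin simulated) the bound still has some room.
ratio = 8.0 * math.pi * kerr_spin(1.0, 0.994) / kerr_horizon_area(1.0, 0.994)
```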

  11. The balance and harmony of control power for a combat aircraft in tactical maneuvering

    NASA Technical Reports Server (NTRS)

    Bocvarov, Spiro; Cliff, Eugene M.; Lutze, Frederick H.

    1992-01-01

    An analysis is presented for a family of regular extremal attitude maneuvers for the High Angle-of-Attack Research Vehicle, which has thrust-vectoring capability. Different levels of dynamic coupling are identified in the combat aircraft attitude model, and the characteristic extremal-family motion is explained. It is shown why the extremal-family trajectories develop small sideslip angles, a highly desirable feature from a practical viewpoint.

  12. Risk-based consequences of extreme natural hazard processes in mountain regions - Multi-hazard analysis in Tyrol (Austria)

    NASA Astrophysics Data System (ADS)

    Huttenlau, Matthias; Stötter, Johann

    2010-05-01

    Reinsurance companies report a strong increase in natural-hazard-related losses, both insured and economic, over the last decades on a global scale. This ongoing trend can be described as a product of the dynamics in both the natural sphere and the anthroposphere. To analyze the potential impact of natural hazard processes on a certain insurance portfolio, or on society in general, reinsurance companies and risk management consultants have developed loss models. However, those models generally do not fit the scale-dependent demands of regional analyses, as would be appropriate (i) for analyses at the scale of a specific province or (ii) for portfolio analyses of regional insurance companies. Moreover, the scientific basis of most of the models is not transparently documented, so scientific evaluation of their methodological concepts is not possible (black box). This is contrary to the scientific principles of transparency and traceability. Analyses must consider local circumstances adequately, especially in mountain regions like the European Alps, with their inherent (i) specific small-scale characteristics, (ii) relatively high process dynamics in general, (iii) occurrence of gravitational mass movements, which are related to high relief energy and thus exist only in mountain regions, (iv) small proportion of permanent settlement area within the overall area, (v) high value concentration on the valley floors, and (vi) exposure of important infrastructure and lifelines. Risk-based analyses therefore estimate the potential consequences of hazard processes on the built environment in a standardized way using the risk components (i) hazard, (ii) elements at risk, and (iii) vulnerability. However, most research and progress has been made in the field of hazard analysis, whereas the other two components are not developed accordingly. 
Since these three components enter the risk concept as factors without any weighting, this imbalance has significant implications for the results of risk analyses. An equal and scale-appropriate balance of the risk components is thus a fundamental key factor for effective natural hazard risk analyses. The results of such analyses inform decision makers in the insurance industry, administration, and politics about potential consequences, and they are the basis for appropriate risk management strategies. Thereby, results (i) with an annual or probabilistic risk comprehension have to be distinguished from (ii) scenario-based analyses. The former are based on statistics of periodically or episodically occurring events, whereas the latter approach is applied especially to extreme, non-linear, stochastic events. Focusing on the needs of insurance companies in particular, the former approach is appropriate for premium pricing and reinsurance strategies with an annual perspective, whereas the latter focuses on events with extreme loss burdens under worst-case criteria to guarantee adequate reinsurance coverage. Moreover, the demand for adequate loss-model approaches and methods is strengthened by the risk-based requirements of the upcoming capital requirement directive Solvency II. The present study estimates the potential elements at risk, their corresponding damage potentials, and the Probable Maximum Losses (PMLs) of extreme natural hazard events in Tyrol (Austria), considering adequately the scale dependency and the balanced application of the introduced risk components. Besides this analysis, an additional portfolio analysis for a regional insurance company was carried out. 
The geocoded insurance contracts of this portfolio analysis were the basis for estimating spatially, socio-economically, and functionally differentiated mean insurance values for the risk categories of (i) buildings, (ii) contents or inventory, (iii) vehicles, and (iv) persons in the study area. The estimated mean insurance values were combined with additional GIS and statistical data into a comprehensive property-by-property geodatabase of the existing elements and values. This stock-of-elements-and-values geodatabase is furthermore the consistent basis for all natural hazard analyses and enables comparison of the results. The study follows the generally accepted modules of (i) hazard analysis, (ii) exposure analysis, and (iii) consequence analysis, where the exposure analysis estimates the elements at risk with their corresponding damage potentials and the consequence analysis estimates the PMLs. This multi-hazard analysis focuses on process types with a high to extreme potential for negative consequences on a regional scale. In this context, (i) floods, (ii) rockslides with potential consequent effects (backwater ponding and outburst floods), (iii) earthquakes, (iv) hail events, and (v) winter storms were considered as hazard processes. Based on general hazard analyses (hazard maps), concrete scenarios and their spatial extents were determined. For the different hazard processes, different vulnerability approaches were considered to demonstrate their sensitivity and their implications for the results. Thus, no absolute loss values but probable loss ranges were estimated. It can be shown that the most serious losses would arise from extreme earthquake events, with loss burdens of more than €7 bn on buildings and inventory alone. Possible extreme flood events could lead to losses between €2 and 2.5 bn, whereas a severe hail swath affecting the central Inn valley could result in losses of ca. €455 mill. (thereof €285 mill. on vehicles). The potentially most serious rockslide with additional consequent effects would result in losses of up to ca. €185 mill., and extreme winter storms can induce losses between €100 mill. and €150 mill.

  13. An Application of Extreme Value Theory to Learning Analytics: Predicting Collaboration Outcome from Eye-Tracking Data

    ERIC Educational Resources Information Center

    Sharma, Kshitij; Chavez-Demoulin, Valérie; Dillenbourg, Pierre

    2017-01-01

    The statistics used in education research are based on central trends such as the mean or standard deviation, discarding outliers. This paper adopts another viewpoint that has emerged in statistics, called extreme value theory (EVT). EVT claims that the bulk of the normal distribution consists mainly of uninteresting variations, while the most…

  14. Empirical Bayes estimation of proportions with application to cowbird parasitism rates

    USGS Publications Warehouse

    Link, W.A.; Hahn, D.C.

    1996-01-01

    Bayesian models provide a structure for studying collections of parameters such as are considered in the investigation of communities, ecosystems, and landscapes. This structure allows for improved estimation of individual parameters, by considering them in the context of a group of related parameters. Individual estimates are differentially adjusted toward an overall mean, with the magnitude of their adjustment based on their precision. Consequently, Bayesian estimation allows for a more credible identification of extreme values in a collection of estimates. Bayesian models regard individual parameters as values sampled from a specified probability distribution, called a prior. The requirement that the prior be known is often regarded as an unattractive feature of Bayesian analysis and may be the reason why Bayesian analyses are not frequently applied in ecological studies. Empirical Bayes methods provide an alternative approach that incorporates the structural advantages of Bayesian models while requiring a less stringent specification of prior knowledge. Rather than requiring that the prior distribution be known, empirical Bayes methods require only that it be in a certain family of distributions, indexed by hyperparameters that can be estimated from the available data. This structure is of interest per se, in addition to its value in allowing for improved estimation of individual parameters; for example, hypotheses regarding the existence of distinct subgroups in a collection of parameters can be considered under the empirical Bayes framework by allowing the hyperparameters to vary among subgroups. Though empirical Bayes methods have been applied in a variety of contexts, they have received little attention in the ecological literature. We describe the empirical Bayes approach in application to estimation of proportions, using data obtained in a community-wide study of cowbird parasitism rates for illustration. 
Since observed proportions based on small sample sizes are heavily adjusted toward the mean, extreme values among empirical Bayes estimates identify those species for which there is the greatest evidence of extreme parasitism rates. Applying a subgroup analysis to our data on cowbird parasitism rates, we conclude that parasitism rates for Neotropical Migrants as a group are no greater than those of Resident/Short-distance Migrant species in this forest community. Our data and analyses demonstrate that the parasitism rates for certain Neotropical Migrant species are remarkably low (Wood Thrush and Rose-breasted Grosbeak) while those for others are remarkably high (Ovenbird and Red-eyed Vireo).
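
    The shrinkage step described above can be sketched in a few lines. Assuming a beta-binomial model with hyperparameters fitted by a crude method of moments (the helper name and sample data are hypothetical; the paper's actual estimator may differ):

```python
import statistics

def eb_shrink(counts, trials):
    """Shrink observed proportions toward the group mean via a Beta(a, b)
    prior fitted by method of moments (a sketch, not the paper's estimator)."""
    p = [c / n for c, n in zip(counts, trials)]
    m = statistics.mean(p)
    v = statistics.variance(p)
    # Beta moments: mean = a/(a+b), variance ~ m(1-m)/(a+b+1)
    common = m * (1.0 - m) / v - 1.0
    a, b = m * common, (1.0 - m) * common
    # Posterior means: small-sample proportions are pulled hardest toward m
    return [(c + a) / (n + a + b) for c, n in zip(counts, trials)]

# Hypothetical parasitism data: three species with n = 10, one with n = 100
shrunk = eb_shrink([2, 3, 4, 30], [10, 10, 10, 100])
```

    The n = 10 proportions move markedly toward the overall mean of 0.3, while the n = 100 estimate barely moves, illustrating why extreme values among the shrunken estimates are stronger evidence of genuinely extreme rates.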

  15. Susceptibility to Mortality in Weather Extremes: Effect Modification by Personal and Small Area Characteristics In a Multi-City Case-Only Analysis

    PubMed Central

    Zanobetti, Antonella; O’Neill, Marie S.; Gronlund, Carina J.; Schwartz, Joel D

    2015-01-01

    Background Extremes of temperature have been associated with short-term increases in daily mortality. We identified subpopulations with increased susceptibility to dying during temperature extremes, based on personal demographics, small-area characteristics and preexisting medical conditions. Methods We examined Medicare participants in 135 U.S. cities and identified preexisting conditions based on hospitalization records prior to their deaths, from 1985–2006. Personal characteristics were obtained from the Medicare records, and area characteristics were assigned based on zip-code of residence. We conducted a case-only analysis of over 11 million deaths, and evaluated modification of the risk of dying associated with extremely hot days and extremely cold days, continuous temperatures, and water-vapor pressure. Modifiers included preexisting conditions, personal characteristics, zip-code-level population characteristics, and land-cover characteristics. For each effect modifier, a city-specific logistic regression model was fitted and then an overall national estimate was calculated using meta-analysis. Results People with certain preexisting conditions were more susceptible to extreme heat, with an additional 6% (95% confidence interval= 4% – 8%) increase in the risk of dying on an extremely hot day in subjects with previous admission for atrial fibrillation, an additional 8% (4%–12%) in subjects with Alzheimer disease, and an additional 6% (3%–9%) in subjects with dementia. Zip-code level and personal characteristics were also associated with increased susceptibility to temperature. Conclusions We identified several subgroups of the population who are particularly susceptible to temperature extremes, including persons with atrial fibrillation. PMID:24045717
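
    The final pooling step, fitting a model per city and then combining the city-specific estimates, can be sketched with a standard inverse-variance (fixed-effect) meta-analysis; the function and the numbers below are illustrative, not taken from the study:

```python
import math

def pool_fixed_effect(estimates, ses):
    """Inverse-variance (fixed-effect) meta-analysis of city-specific
    log-odds-ratio estimates: a generic sketch of the pooling step."""
    weights = [1.0 / se**2 for se in ses]
    pooled = sum(w * b for w, b in zip(weights, estimates)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    ci = (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)
    return pooled, ci

# Hypothetical log-odds interaction estimates (and SEs) from three cities:
pooled, (lo, hi) = pool_fixed_effect([0.06, 0.08, 0.05], [0.02, 0.03, 0.04])
excess_risk = math.exp(pooled) - 1.0   # e.g. ~6% excess risk on extreme days
```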

  16. A comparison of observed extreme water levels at the German Bight elaborated through an extreme value analysis (EVA) with extremes derived from a regionally coupled ocean-atmospheric climate model (MPI-OM)

    NASA Astrophysics Data System (ADS)

    Möller, Jens; Heinrich, Hartmut

    2017-04-01

    As a consequence of climate change, atmospheric and oceanographic extremes and their potential impacts on coastal regions are of growing concern for governmental authorities responsible for transportation infrastructure. The highest risks for shipping, as well as for rail and road traffic, originate from the combined effects of extreme storm surges and heavy rainfall, which sometimes lead to insufficient dewatering of inland waterways. The German Federal Ministry of Transport and Digital Infrastructure has therefore tasked its Network of Experts with investigating the possible evolution of extreme threats for low-lying land, and especially for the Kiel Canal, an important shortcut for shipping between the North and Baltic Seas. In this study we present results of a comparison of an extreme value analysis (EVA) carried out on gauge observations with values derived from a coupled regional ocean-atmosphere climate model (MPI-OM). High water levels at the coasts of the North and Baltic Seas are among the most important hazards: they increase the risk of flooding of the low-lying land and prevent adequate dewatering of such areas. In this study, changes in the intensity (magnitude of the extremes) and duration of extreme water levels (above a selected threshold) are investigated for several gauge stations, with data partly reaching back to 1843. Different methods are used for the extreme value statistics: (1) a stationary generalized Pareto distribution (GPD) model as well as (2) a non-stationary statistical model for better reproduction of the impact of climate change. Most gauge stations show an increase of the mean water level of about 1-2 mm/year, with a stronger increase of the highest water levels and a decrease (or smaller increase) of the lowest water levels. The duration of possible dewatering time intervals for the Kiel Canal was also analysed. 
The results for the historical gauge station observations are compared to the statistics of modelled water levels from the coupled atmosphere-ocean climate model MPI-OM for the time interval from 1951 to 2000. We demonstrate that for high water levels the observations and MPI-OM results are in good agreement, and we provide an estimate on the decreasing dewatering potential for Kiel Canal until the end of the 21st century.
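
    The stationary GPD option (1) can be illustrated with a method-of-moments fit to threshold excesses and the corresponding return-level formula. A rough sketch with invented names, not the study's implementation:

```python
import math
import statistics

def fit_gpd_moments(excesses):
    """Method-of-moments GPD fit to excesses over a threshold, using
    mean m = sigma/(1 - xi) and variance v = m^2/(1 - 2*xi)."""
    m = statistics.mean(excesses)
    v = statistics.variance(excesses)
    xi = 0.5 * (1.0 - m * m / v)        # shape parameter
    sigma = m * (1.0 - xi)              # scale parameter
    return xi, sigma

def return_level(u, xi, sigma, rate, years):
    """Water level exceeded on average once per `years`, given `rate`
    exceedances of threshold u per year."""
    if abs(xi) < 1e-9:                  # exponential-tail (Gumbel) limit
        return u + sigma * math.log(rate * years)
    return u + sigma / xi * ((rate * years) ** xi - 1.0)
```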

  17. Weak linkage between the heaviest rainfall and tallest storms.

    PubMed

    Hamada, Atsushi; Takayabu, Yukari N; Liu, Chuntao; Zipser, Edward J

    2015-02-24

    Conventionally, the heaviest rainfall has been linked to the tallest, most intense convective storms. However, the global picture of the linkage between extreme rainfall and convection remains unclear. Here we analyse an 11-year record of spaceborne precipitation radar observations and establish that a relatively small fraction of extreme convective events produces extreme rainfall rates in any region of the tropics and subtropics. Robust differences between extreme rainfall and convective events are found in the rainfall characteristics and environmental conditions, irrespective of region; most extreme rainfall events are characterized by less intense convection with intense radar echoes not extending to extremely high altitudes. Rainfall characteristics and environmental conditions both indicate the importance of warm-rain processes in producing extreme rainfall rates. Our results demonstrate that, even in regions where severe convective storms are representative extreme weather events, the heaviest rainfall events are mostly associated with less intense convection.

  18. Trends in 1970-2010 southern California surface maximum temperatures: extremes and heat waves

    NASA Astrophysics Data System (ADS)

    Ghebreegziabher, Amanuel T.

    Daily maximum temperatures from 1970-2010 were obtained from the National Climatic Data Center (NCDC) for 28 South Coast Air Basin (SoCAB) Cooperative Network (COOP) sites. Analyses were carried out on the entire data set, as well as on the 1970-1974 and 2006-2010 sub-periods, including construction of spatial distributions and time-series trends of both summer-average and annual-maximum values and of the frequency of two- and four-day consecutive "daytime" heat wave events. Spatial patterns of average and extreme values showed three areas consistent with climatological SoCAB flow patterns: cold coastal, warm inland low-elevation, and cool further-inland mountain top. Difference (2006-2010 minus 1970-1974) distributions of both average and extreme-value trends were consistent with a previous study of the shorter 1970-2005 period, as they showed the expected inland regional warming and a "reverse-reaction" cooling in low-elevation coastal and inland areas open to increasing sea-breeze flows. Annual-extreme trends generally showed cooling at sites below 600 m and warming at higher elevations. As the warming trends of the extremes were larger than those of the averages, regional warming thus impacts extremes more than averages. Spatial distributions of hot-day frequencies showed the expected maximum at inland low-elevation sites. Regional warming again induced increases at elevated coastal areas, while low-elevation areas showed reverse-reaction decreases.

  19. The magnitude and colour of noise in genetic negative feedback systems.

    PubMed

    Voliotis, Margaritis; Bowsher, Clive G

    2012-08-01

    The comparative ability of transcriptional and small RNA-mediated negative feedback to control fluctuations or 'noise' in gene expression remains unexplored. Both autoregulatory mechanisms usually suppress the average (mean) of the protein level and its variability across cells. The variance of the number of proteins per molecule of mean expression is also typically reduced compared with the unregulated system, but is almost never below the value of one. This relative variance often substantially exceeds a recently obtained theoretical lower limit for biochemical feedback systems. Adding transcriptional or small RNA-mediated control has different effects. Transcriptional autorepression robustly reduces both the relative variance and the persistence (lifetime) of fluctuations. Both benefits combine to reduce noise in downstream gene expression. Autorepression via small RNA can achieve more extreme noise reduction and typically has less effect on the mean expression level. However, it is often more costly to implement and is more sensitive to rate parameters. Theoretical lower limits on the relative variance are known to decrease slowly as a measure of the cost per molecule of mean expression increases. However, the proportional increase in cost needed to achieve substantial noise suppression can be different away from the optimal frontier: for transcriptional autorepression, it is frequently negligible.

  20. Metal-Organic Framework Thin Film Coated Optical Fiber Sensors: A Novel Waveguide-Based Chemical Sensing Platform.

    PubMed

    Kim, Ki-Joong; Lu, Ping; Culp, Jeffrey T; Ohodnicki, Paul R

    2018-02-23

    Integration of optical fibers with sensitive thin films offers great potential for the realization of novel chemical sensing platforms. In this study, we present a simple design strategy and the high performance of nanoporous metal-organic framework (MOF) based optical gas sensors, which enable detection of a wide range of concentrations of small molecules based upon extremely small differences in refractive index as a function of analyte adsorption within the MOF framework. Thin and compact MOF films can be uniformly formed and tightly bound on the surface of an etched optical fiber through a simple solution method, which is critical for the manufacturability of MOF-based sensor devices. The resulting sensors show high sensitivity/selectivity to CO2 gas relative to other small gases (H2, N2, O2, and CO) with rapid (

  1. Spatiotemporal changes in precipitation extremes over Yangtze River basin, China, considering the rainfall shift in the late 1970s

    NASA Astrophysics Data System (ADS)

    Gao, Tao; Xie, Lian

    2016-12-01

    Precipitation extremes are the dominant cause of severe flood disasters at regional and local scales under global climate change. In the present study, five annual extreme precipitation indices, comprising the 1, 7, and 30 day annual maximum rainfall and the 95th and 97.5th percentile threshold levels, are analyzed for the reference period 1960-2011 at 140 meteorological stations over the Yangtze River basin (YRB). A generalized extreme value (GEV) distribution is fitted to the annual and percentile extreme precipitation events at each station, with return periods up to 200 years. The entire time period is divided into a preclimatic (preceding climatic) period, 1960-1980, and an aftclimatic (after climatic) period, 1981-2011, reflecting the distinctly abrupt shift of the precipitation regime in the late 1970s across the YRB. The Mann-Kendall trend test is applied to the pre- and aftclimatic periods separately in order to explore possible increasing or decreasing patterns in precipitation extremes. The results indicate that the increasing trends in return values during the aftclimatic period vary significantly in time and space for different magnitudes of extreme precipitation, while the stations with significantly positive trends are mainly distributed in the vicinity of the mainstream, the major tributaries, and large lakes; this could result in more severe flood disasters in the mid-lower reaches of the YRB, especially in the southeast coastal regions. The increasing/decreasing linear trends of annual maximum precipitation are also investigated for the pre- and aftclimatic periods; those changes do not closely follow the variations of the return values during either subperiod. Moreover, the spatiotemporal patterns of precipitation extremes become more uneven and unstable in the second half of the period over the YRB.
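
    The return-level computation from a fitted distribution can be sketched as follows. For brevity this uses the zero-shape (Gumbel) member of the GEV family fitted by the method of moments, whereas the study fits the full three-parameter GEV; the data are invented:

```python
import math
import statistics

EULER_GAMMA = 0.5772156649015329

def gumbel_return_level(annual_maxima, period_years):
    """Return level for a given return period from a Gumbel fit to
    annual maxima via the method of moments (zero-shape GEV sketch)."""
    m = statistics.mean(annual_maxima)
    s = statistics.stdev(annual_maxima)
    scale = s * math.sqrt(6.0) / math.pi       # Gumbel scale from std dev
    loc = m - EULER_GAMMA * scale              # Gumbel location from mean
    # Quantile at non-exceedance probability 1 - 1/T
    p = 1.0 - 1.0 / period_years
    return loc - scale * math.log(-math.log(p))

# Hypothetical annual maximum daily rainfall (mm) at one station:
maxima = [100.0, 110.0, 120.0, 130.0, 140.0]
rl_100 = gumbel_return_level(maxima, 100)      # 100-year return level
```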

  2. How fit are children and adolescents with haemophilia in Germany? Results of a prospective study assessing the sport-specific motor performance by means of modern test procedures of sports science.

    PubMed

    Seuser, A; Boehm, P; Ochs, S; Trunz-Carlisi, E; Halimeh, S; Klamroth, R

    2015-07-01

    There are many publications on the physical fitness of patients with haemophilia (PWH); however, most studies reflect only individual sport-specific motor capacities or focus on a single fitness ability, and they involve small patient populations. The principal objective of this study was therefore to compare the overall physical fitness and body composition of young PWH with those of healthy peers, based on the most valid data available. Twenty-one German haemophilia treatment centres were visited from 2002 to 2009. PWH between 8 and 25 years of age were included. They performed a five-stage fitness test covering the sport-specific motor capacities of coordination (measured by the one-leg stand), strength, aerobic fitness, and mobility, as well as body composition. The patients' results were compared with age- and gender-specific reference values for healthy subjects. Two hundred and eighty-five PWH (mean age 13.2 ± 4.5 years, 164 with severe disease) were included prospectively in the study. PWH scored significantly below the reference values of healthy subjects in the one-leg stand test, the mobility of the lower extremity, the strength ratio of chest and back muscles, and the endurance test. In body composition, back strength, and mobility of the upper extremity, PWH scored significantly above the reference values. There were no significant differences in abdominal strength. In conclusion, we found specific differences in different fitness abilities between PWH and healthy subjects. Knowing this, we are able to design exercise programmes that compensate for the diminished fitness abilities of our PWH. © 2015 John Wiley & Sons Ltd.

  3. The transformed-stationary approach: a generic and simplified methodology for non-stationary extreme value analysis

    NASA Astrophysics Data System (ADS)

    Mentaschi, Lorenzo; Vousdoukas, Michalis; Voukouvalas, Evangelos; Sartini, Ludovica; Feyen, Luc; Besio, Giovanni; Alfieri, Lorenzo

    2016-09-01

    Statistical approaches to study extreme events require, by definition, long time series of data. In many scientific disciplines, these series are often subject to variations at different temporal scales that affect the frequency and intensity of their extremes. Therefore, the assumption of stationarity is violated and alternative methods to conventional stationary extreme value analysis (EVA) must be adopted. Using the example of environmental variables subject to climate change, in this study we introduce the transformed-stationary (TS) methodology for non-stationary EVA. This approach consists of (i) transforming a non-stationary time series into a stationary one, to which the stationary EVA theory can be applied, and (ii) reverse transforming the result into a non-stationary extreme value distribution. As a transformation, we propose and discuss a simple time-varying normalization of the signal and show that it enables a comprehensive formulation of non-stationary generalized extreme value (GEV) and generalized Pareto distribution (GPD) models with a constant shape parameter. A validation of the methodology is carried out on time series of significant wave height, residual water level, and river discharge, which show varying degrees of long-term and seasonal variability. The results from the proposed approach are comparable with the results from (a) a stationary EVA on quasi-stationary slices of non-stationary series and (b) the established method for non-stationary EVA. However, the proposed technique comes with advantages in both cases. For example, in contrast to (a), the proposed technique uses the whole time horizon of the series for the estimation of the extremes, allowing for a more accurate estimation of large return levels. Furthermore, with respect to (b), it decouples the detection of non-stationary patterns from the fitting of the extreme value distribution. 
As a result, the steps of the analysis are simplified and intermediate diagnostics are possible. In particular, the transformation can be carried out by means of simple statistical techniques such as low-pass filters based on the running mean and the standard deviation, and the fitting procedure is a stationary one with a few degrees of freedom and is easy to implement and control. An open-source MATLAB toolbox has been developed to cover this methodology, which is available at https://github.com/menta78/tsEva/ (Mentaschi et al., 2016).
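
    Step (i) of the TS approach can be sketched with a boxcar running mean and running standard deviation standing in for the low-pass filters (a simplification; the toolbox's actual filters and window choices differ):

```python
import statistics

def running_stat(x, window):
    """Centered boxcar running mean and running std of a series
    (a crude stand-in for the paper's low-pass filters; assumes the
    windowed values are not all identical)."""
    half = window // 2
    means, stds = [], []
    for i in range(len(x)):
        chunk = x[max(0, i - half): i + half + 1]
        means.append(statistics.mean(chunk))
        stds.append(statistics.stdev(chunk) if len(chunk) > 1 else 1.0)
    return means, stds

def transform_stationary(x, window=121):
    """Step (i): normalize the series so stationary EVA applies; the
    returned trend/amplitude are needed for the reverse transform (ii)."""
    mu, sd = running_stat(x, window)
    y = [(xi - m) / s for xi, m, s in zip(x, mu, sd)]
    return y, mu, sd
```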

  4. Analyzing phenological extreme events over the past five decades in Germany

    NASA Astrophysics Data System (ADS)

    Schleip, Christoph; Menzel, Annette; Estrella, Nicole; Graeser, Philipp

    2010-05-01

    As climate change may alter the frequency and intensity of extreme temperatures, we analysed whether the warming of the last five decades has already changed the statistics of phenological extreme events. In this context, two extreme value statistical concepts are discussed and applied to existing phenological datasets of the German Weather Service (DWD) in order to derive probabilities of occurrence for extremely early or late phenological events. We analyse four phenological groups, "begin of flowering", "leaf foliation", "fruit ripening", and "leaf colouring", as well as DWD indicator phases of the "phenological year". Additionally, we put an emphasis on a between-species analysis, comparing differences in extreme onsets between three common northern conifers, and we conducted a within-species analysis with different phases of horse chestnut throughout a year. The first statistical approach fits the data to a Gaussian model using traditional statistical techniques and then analyses the extreme quantiles. The key point of this approach is the fitting of an appropriate probability density function (PDF) to the observed data and the assessment of the change of the PDF parameters in time. The full analytical description in terms of the estimated PDF for defined time steps of the observation period allows probability assessments of extreme values for, e.g., annual or decadal time steps. Related to this approach is the possibility of counting the onsets that fall into our defined extreme percentiles. Estimating the probability of extreme events on the basis of the whole data set contrasts with analyses using the generalized extreme value (GEV) distribution. The second approach deals with the extreme PDFs themselves and fits the GEV distribution to the annual minima of the phenological series to provide useful estimates of return levels. 
For flowering and leaf unfolding phases, exceptionally early extremes are seen since the mid 1980s, especially in the single years 1961, 1990, and 2007, whereas exceptionally late events are seen in the year 1970. Summer phases such as fruit ripening exhibit stronger shifts to early extremes than spring phases. Leaf colouring phases reveal an increasing probability of late extremes. The GEV-estimated 100-year events for Picea, Pinus, and Larix amount to extremely early events of about -27, -31.48, and -32.79 days, respectively. If we assume non-stationary minimum data, we get a more extreme 100-year event of about -35.40 days for Picea, but associated with wider confidence intervals. The GEV is simply another probability distribution, but for purposes of extreme analysis in phenology it should be considered as equally important as (if not more important than) the Gaussian PDF approach.
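
    The first (Gaussian) approach amounts to fitting a normal PDF per time slice and reading off tail probabilities; a minimal sketch with hypothetical onset dates (day of year) and an arbitrary "extreme early" threshold:

```python
import statistics
from statistics import NormalDist

def extreme_early_probability(onsets, threshold):
    """Fit a Gaussian to onset dates and return the probability of an
    onset earlier than `threshold` (values below are hypothetical)."""
    dist = NormalDist(statistics.mean(onsets), statistics.stdev(onsets))
    return dist.cdf(threshold)

# Two (invented) decades of flowering onsets, threshold = day 110:
p_1960s = extreme_early_probability(
    [120, 118, 125, 122, 119, 124, 121, 123, 120, 122], 110)
p_2000s = extreme_early_probability(
    [112, 108, 118, 110, 107, 115, 109, 113, 111, 114], 110)
# An earlier, more variable decade raises the extreme-early probability.
```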

  5. Nanosatellites : A paradigm change for space weather studies.

    NASA Astrophysics Data System (ADS)

    Barthelemy, Mathieu

    2016-04-01

    Nanosatellites are changing the paradigm of space exploration and engineering. The past 15 years have seen growing activity in this field, with a marked acceleration in the last 3 years. Whereas the educational value of nanosatellites is well recognized, their scientific and technological use is potentially extremely rich but not fully explored. Conventional attitudes towards space engineering need to be reviewed in light of the capabilities and characteristics of these miniature devices, which enable approaches and applications not possible with traditional satellite platforms. After an evaluation of past and near-future nanosatellite missions in the domain of space weather, and using the example of the Zegrensat/ATISE mission, we will give some perspectives on the possibilities opened up by these small satellites.

  6. Extreme sea storm in the Mediterranean Sea. Trends during the 2nd half of the 20th century.

    NASA Astrophysics Data System (ADS)

    Pino, C.; Lionello, P.; Galati, M. B.

    2009-04-01

    The analysis of extreme significant wave height (SWH) values and their trend is crucial for planning and managing coastal defences and off-shore activities. The analysis provided by this study covers a 44-year period (1958-2001). First, the WW3 (WAVEWATCH III) model, forced with the REMO-Hipocas regional model wind fields, was used for the hindcast of extreme SWH values over the Mediterranean basin with a 0.25 deg lat-lon resolution. Subsequently, the model results were processed with ad hoc software to detect storms. A GEV analysis was performed and a set of indicators for extreme SWH was computed, using the Mann-Kendall test to assess the statistical significance of trends for parameters such as the number of extreme events, their duration, and their intensity. Results suggest a transition towards weaker extremes and a milder climate over most of the Mediterranean Sea.

  7. Research in Stochastic Processes

    DTIC Science & Technology

    1988-08-31

    stationary sequence, Stochastic Proc. Appl. 29, 1988, 155-169 T. Hsing, J. Husler and M.R. Leadbetter, On the exceedance point process for a stationary...Nandagopalan, On exceedance point processes for "regular" sample functions, Proc. Volume, Oberwolfach Conf. on Extreme Value Theory, J. Husler and R. Reiss...exceedance point processes for stationary sequences under mild oscillation restrictions, Apr. 88. Oberwolfach Conf. on Extreme Value Theory. Ed. J. Husler

  8. A direct method for computing extreme value (Gumbel) parameters for gapped biological sequence alignments.

    PubMed

    Quinn, Terrance; Sinkala, Zachariah

    2014-01-01

    We develop a general method for computing extreme value distribution (Gumbel, 1958) parameters for gapped alignments. Our approach uses mixture distribution theory to obtain associated BLOSUM matrices for gapped alignments, which in turn are used for determining significance of gapped alignment scores for pairs of biological sequences. We compare our results with parameters already obtained in the literature.
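
    Once Gumbel parameters have been estimated for a scoring scheme, alignment significance follows from the standard Karlin-Altschul tail formula for local alignment scores. A sketch with illustrative, not fitted, values of lambda and K (the method above derives such parameters; it does not prescribe these numbers):

```python
import math

def gumbel_pvalue(score, lam, K, m, n):
    """P(S >= score) under the Karlin-Altschul / Gumbel model for
    local alignment scores of sequences of lengths m and n."""
    e_value = K * m * n * math.exp(-lam * score)  # expected number of hits
    return 1.0 - math.exp(-e_value)               # P(at least one such hit)

# Illustrative parameters only: lambda = 0.32, K = 0.13 (hypothetical).
p = gumbel_pvalue(score=60.0, lam=0.32, K=0.13, m=250, n=300)
```

    Higher scores give smaller p-values, which is why accurate estimation of lambda and K for gapped scoring schemes matters for significance assessment.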

  9. Ensemble-based evaluation of extreme water levels for the eastern Baltic Sea

    NASA Astrophysics Data System (ADS)

    Eelsalu, Maris; Soomere, Tarmo

    2016-04-01

    The risks and damages associated with coastal flooding, which naturally increase with the magnitude of extreme storm surges, are among the largest concerns of countries with extensive low-lying nearshore areas. The relevant risks are even more pronounced for semi-enclosed water bodies such as the Baltic Sea, where subtidal (weekly-scale) variations in the water volume of the sea substantially contribute to the water level and lead to a large spread in projections of future extreme water levels. We explore the options for using large ensembles of projections to evaluate return periods of extreme water levels more reliably. Single projections of the ensemble are constructed by fitting several sets of block maxima with various extreme value distributions. The ensemble is based on two simulated data sets produced at the Swedish Meteorological and Hydrological Institute: a hindcast by the Rossby Centre Ocean model sampled with a resolution of 6 h, and a similar hindcast by the circulation model NEMO with a resolution of 1 h. As the annual maxima of water levels in the Baltic Sea are not always uncorrelated, we employ maxima for both calendar years and stormy seasons. As the shape parameter of the Generalised Extreme Value distribution changes its sign and varies substantially in magnitude along the eastern coast of the Baltic Sea, the use of a single distribution for the entire coast is inappropriate. The ensemble involves projections based on the Generalised Extreme Value, Gumbel and Weibull distributions. The parameters of these distributions are evaluated in three different ways: the maximum likelihood method and the method of moments based on both biased and unbiased estimates. The total number of projections in the ensemble is 40. As some of the resulting estimates contain limited additional information, the members of pairs of projections that are highly correlated are assigned weights of 0.6. 
A comparison of the ensemble-based projections of extreme water levels and their return periods with similar estimates derived from local observations reveals an interesting pattern of match and mismatch. The match is almost perfect at measurement sites where local effects (e.g., wave-induced set-up or local surge in very shallow areas that are not resolved by circulation models) do not contribute to the observed water levels. There is, however, substantial mismatch between projected and observed extreme values for most of the Estonian coast. The mismatch is largest for sections that are open to high waves and for several bays that are deeply cut into the mainland but open to the predominant strong wind directions. Detailed quantification of this mismatch eventually makes it possible to develop substantially improved estimates of extreme water levels in sections where local effects contribute considerably to the total water level.
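
    One of the method-of-moments fits mentioned above, for the Gumbel member of such an ensemble, can be sketched as follows. The block maxima here are illustrative numbers, not the Baltic hindcast data:

```python
import math

EULER_GAMMA = 0.5772156649015329

def gumbel_fit_moments(maxima):
    """Method-of-moments Gumbel fit for block maxima:
    scale = sqrt(6)*s/pi, loc = mean - gamma*scale (unbiased variance)."""
    n = len(maxima)
    mean = sum(maxima) / n
    var = sum((x - mean) ** 2 for x in maxima) / (n - 1)
    scale = math.sqrt(6.0 * var) / math.pi
    loc = mean - EULER_GAMMA * scale
    return loc, scale

def gumbel_return_level(loc, scale, T):
    """Level exceeded on average once every T blocks (e.g. years)."""
    return loc - scale * math.log(-math.log(1.0 - 1.0 / T))

# Illustrative annual maxima (cm), not observed or hindcast values.
maxima = [92, 105, 88, 120, 97, 110, 84, 131, 99, 103]
loc, scale = gumbel_fit_moments(maxima)
level_50 = gumbel_return_level(loc, scale, 50.0)
```

    In the ensemble approach above, analogous fits with GEV and Weibull distributions and with maximum likelihood would produce the other ensemble members, whose return-level estimates are then combined with weights.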

  10. Extreme ultraviolet index due to broken clouds at a midlatitude site, Granada (southeastern Spain)

    NASA Astrophysics Data System (ADS)

    Antón, M.; Piedehierro, A. A.; Alados-Arboledas, L.; Wolfran, E.; Olmo, F. J.

    2012-11-01

    Cloud cover usually attenuates ultraviolet (UV) solar radiation but, under certain sky conditions, clouds may produce an enhancement effect, increasing UV levels at the surface. The main objective of this paper is to analyze an extreme UV enhancement episode recorded on 16 June 2009 at Granada (southeastern Spain). This phenomenon was characterized by a quick and intense increase in surface UV radiation under broken cloud fields (5-7 oktas) in which the Sun was surrounded by cumulus clouds (confirmed with sky images). Thus, the UV index (UVI) showed an enhancement by a factor of 4 in the course of only 30 min around midday, varying from 2.6 to 10.4 (higher than the corresponding clear-sky UVI value). Additionally, the UVI presented values higher than 10 (extreme erythemal risk) for about 20 min running, with a maximum value around 11.5. The use of an empirical model and the total ozone column (TOC) derived from the Global Ozone Monitoring Experiment (GOME) for the period 1995-2011 showed that the value of UVI ~ 11.5 is substantially larger than the highest index that could arise from natural TOC variations over Granada. Finally, the UV erythemal dose accumulated during the 20 min period with extreme UVI values under broken cloud fields was 350 J/m2, which surpasses the energy required to produce sunburn in most human skin types.

  11. Optimal analytic method for the nonlinear Hasegawa-Mima equation

    NASA Astrophysics Data System (ADS)

    Baxter, Mathew; Van Gorder, Robert A.; Vajravelu, Kuppalapalle

    2014-05-01

    The Hasegawa-Mima equation is a nonlinear partial differential equation that describes the electric potential due to a drift wave in a plasma. In the present paper, we apply the method of homotopy analysis to a slightly more general Hasegawa-Mima equation, which accounts for hyper-viscous damping or viscous dissipation. First, we outline the method for the general initial/boundary value problem over a compact rectangular spatial domain. We use a two-stage method, where both the convergence control parameter and the auxiliary linear operator are optimally selected to minimize the residual error due to the approximation. To do the latter, we consider a family of operators parameterized by a constant which gives the decay rate of the solutions. After outlining the general method, we consider a number of concrete examples in order to demonstrate the utility of this approach. The results enable us to study properties of the initial/boundary value problem for the generalized Hasegawa-Mima equation. In several cases considered, we are able to obtain solutions with extremely small residual errors after relatively few iterations are computed (residual errors on the order of 10^-15 are found in multiple cases after only three iterations). The results demonstrate that selecting a parameterized auxiliary linear operator can be extremely useful for minimizing residual errors when used concurrently with the optimal homotopy analysis method, suggesting that this approach can prove useful for a number of nonlinear partial differential equations arising in physics and nonlinear mechanics.

  12. Isokinetic profile of elbow flexion and extension strength in elite junior tennis players.

    PubMed

    Ellenbecker, Todd S; Roetert, E Paul

    2003-02-01

    Descriptive study. To determine whether bilateral differences exist in concentric elbow flexion and extension strength in elite junior tennis players. The repetitive nature of tennis frequently produces upper extremity overuse injuries. Prior research has identified tennis-specific strength adaptation in the dominant shoulder and distal upper extremity musculature of elite players. No previous study has addressed elbow flexion and extension strength. Thirty-eight elite junior tennis players were bilaterally tested for concentric elbow flexion and extension muscle performance on a Cybex 6000 isokinetic dynamometer at 90 degrees/s, 210 degrees/s, and 300 degrees/s. Repeated-measures ANOVAs were used to test for differences between extremities, muscle groups, and speed. Significantly greater (P<0.002) dominant-arm elbow extension peak torque values were measured at 90 degrees/s, 210 degrees/s, and 300 degrees/s for males. Significantly greater (P<0.002) dominant-arm single-repetition work values were also measured at 90 degrees/s and 210 degrees/s for males. No significant difference was measured between extremities in elbow flexion muscular performance in males and for elbow flexion or extension peak torque and single-repetition work values in females. No significant difference between extremities was measured in elbow flexion/extension strength ratios in females and significant differences between extremities in this ratio were only present at 210 degrees/s in males (P<0.002). These data indicate muscular adaptations around the dominant elbow in male elite junior tennis players but not females. These data have ramifications for clinicians rehabilitating upper extremity injuries in patients from this population.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ghosh, Subimal; Das, Debasish; Kao, Shih-Chieh

    Recent studies disagree on how rainfall extremes over India have changed in space and time over the past half century, as well as on whether the changes observed are due to global warming or regional urbanization. Although a uniform and consistent decrease in moderate rainfall has been reported, a lack of agreement about trends in heavy rainfall may be due in part to differences in the characterization and spatial averaging of extremes. Here we use extreme value theory to examine trends in Indian rainfall over the past half century in the context of long-term, low-frequency variability. We show that when generalized extreme value theory is applied to annual maximum rainfall over India, no statistically significant spatially uniform trends are observed, in agreement with previous studies using different approaches. Furthermore, our space-time regression analysis of the return levels points to increasing spatial variability of rainfall extremes over India. Our findings highlight the need for systematic examination of global versus regional drivers of trends in Indian rainfall extremes, and may help to inform flood hazard preparedness and water resource management in the region.
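
    The return levels referred to above follow a closed form once GEV parameters are fitted to block (e.g. annual) maxima. A sketch with illustrative parameter values, not values fitted to Indian rainfall data:

```python
import math

def gev_return_level(mu, sigma, xi, T):
    """T-block return level of a GEV(mu, sigma, xi) distribution.

    xi > 0: heavy (Frechet-type) tail; xi < 0: bounded (Weibull-type)
    tail; xi -> 0 recovers the Gumbel case.
    """
    y = -math.log(1.0 - 1.0 / T)   # -log of the non-exceedance probability
    if abs(xi) < 1e-9:             # Gumbel limit
        return mu - sigma * math.log(y)
    return mu + (sigma / xi) * (y ** (-xi) - 1.0)

# Illustrative parameters (hypothetical, mm of daily rainfall):
lvl_heavy = gev_return_level(mu=100.0, sigma=25.0, xi=0.2, T=100)
lvl_bounded = gev_return_level(mu=100.0, sigma=25.0, xi=-0.2, T=100)
```

    The sign of the shape parameter strongly influences long-period return levels, which is one reason trend analyses of return levels are sensitive to how extremes are characterized.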

  14. Study designs for identification of rare disease variants in complex diseases: the utility of family-based designs.

    PubMed

    Ionita-Laza, Iuliana; Ottman, Ruth

    2011-11-01

    The recent progress in sequencing technologies makes possible large-scale medical sequencing efforts to assess the importance of rare variants in complex diseases. The results of such efforts depend heavily on the use of efficient study designs and analytical methods. We introduce here a unified framework for association testing of rare variants in family-based designs or designs based on unselected affected individuals. This framework allows us to quantify the enrichment in rare disease variants in families containing multiple affected individuals and to investigate the optimal design of studies aiming to identify rare disease variants in complex traits. We show that for many complex diseases with small values for the overall sibling recurrence risk ratio, such as Alzheimer's disease and most cancers, sequencing affected individuals with a positive family history of the disease can be extremely advantageous for identifying rare disease variants. In contrast, for complex diseases with large values of the sibling recurrence risk ratio, sequencing unselected affected individuals may be preferable.

  15. Diagnostic and prognostic value of history-taking and physical examination in undifferentiated peripheral inflammatory arthritis: a systematic review.

    PubMed

    Kuriya, Bindee; Villeneuve, Edith; Bombardier, Claire

    2011-03-01

    To review the diagnostic and prognostic value of history/physical examination among patients with undifferentiated peripheral inflammatory arthritis (UPIA). We conducted a systematic review evaluating the association between history/physical examination features and a diagnostic or prognostic outcome. Nineteen publications were included. Advanced age, female sex, and morning stiffness were predictive of a diagnosis of rheumatoid arthritis (RA) from UPIA. A higher number of tender and swollen joints, small/large joint involvement in the upper/lower extremities, and symmetrical involvement were associated with progression to RA. Similar features were associated with persistent disease and erosions, while disability at baseline and extraarticular features were predictive of future disability. History/physical examination features are heterogeneously reported. Several features predict progression from UPIA to RA or a poor prognosis. Continued measurements in the UPIA population are needed to determine if these features are valid and reliable predictors of outcomes, especially as new definitions for RA and disease states emerge.

  16. Pulling adsorbed polymers at an angle: A low temperature theory

    NASA Astrophysics Data System (ADS)

    Iliev, Gerasim; Whittington, Stuart

    2012-02-01

    We consider several partially directed walk models in two and three dimensions to study the problem of a homopolymer interacting with a surface while subject to a force at the terminal monomer. The force is applied with a component parallel to the surface as well as a component perpendicular to the surface. Depending on the relative values of the force in each direction, the force can either enhance the adsorption transition or lead to desorption of an adsorbed polymer. For each model, we determine the associated generating function and extract the phase diagram, identifying states where the polymer is thermally desorbed, adsorbed, and under the influence of the force. We note the different regimes that appear in the problem and provide a low temperature approximation to describe them. The approximation is exact at T=0 and models the exact results extremely well for small values of T. This work is an extension of a model considered by S. Whittington and E. Orlandini.

  17. Sequences of extremal radially excited rotating black holes.

    PubMed

    Blázquez-Salcedo, Jose Luis; Kunz, Jutta; Navarro-Lérida, Francisco; Radu, Eugen

    2014-01-10

    In the Einstein-Maxwell-Chern-Simons theory the extremal Reissner-Nordström solution is no longer the single extremal solution with vanishing angular momentum, when the Chern-Simons coupling constant reaches a critical value. Instead a whole sequence of rotating extremal J=0 solutions arises, labeled by the node number of the magnetic U(1) potential. Associated with the same near horizon solution, the mass of these radially excited extremal solutions converges to the mass of the extremal Reissner-Nordström solution. On the other hand, not all near horizon solutions are also realized as global solutions.

  18. Suitability of the isolated chicken eye test for classification of extreme pH detergents and cleaning products.

    PubMed

    Cazelle, Elodie; Eskes, Chantra; Hermann, Martina; Jones, Penny; McNamee, Pauline; Prinsen, Menk; Taylor, Hannah; Wijnands, Marcel V W

    2015-04-01

    A.I.S.E. investigated the suitability of the ICE in vitro test method adopted for regulatory use (OECD TG 438), with or without histopathology, to identify detergent and cleaning formulations having extreme pH that require classification as EU CLP/UN GHS Category 1. To this aim, 18 extreme pH detergent and cleaning formulations were tested, covering both alkaline and acidic extreme pHs. The ICE standard test method following OECD Test Guideline 438 showed good concordance with in vivo classification (83%) and good and balanced specificity and sensitivity values (83%), which are in line with the performance of currently adopted in vitro test guidelines, confirming its suitability to identify Category 1 extreme pH detergent and cleaning products. In contrast to previous findings obtained with non-extreme pH formulations, the use of histopathology did not improve the sensitivity of the assay whilst it strongly decreased its specificity for the extreme pH formulations. Furthermore, use of non-testing prediction rules for classification showed poor concordance values (33% for the extreme pH rule and 61% for the EU CLP additivity approach) with high rates of over-prediction (100% for the extreme pH rule and 50% for the additivity approach), indicating that these non-testing prediction rules are not suitable for predicting Category 1 hazards of extreme pH detergent and cleaning formulations. Copyright © 2015 Elsevier Ltd. All rights reserved.

  19. Long-term statistics of extreme tsunami height at Crescent City

    NASA Astrophysics Data System (ADS)

    Dong, Sheng; Zhai, Jinjin; Tao, Shanshan

    2017-06-01

    Historically, Crescent City is one of the most vulnerable communities impacted by tsunamis along the west coast of the United States, largely owing to its offshore geography. Trans-ocean tsunamis usually produce large wave runup at Crescent Harbor, resulting in catastrophic damage, property loss and loss of life. Determining the return values of tsunami height from relatively short-term observation data is of great significance for assessing tsunami hazards and improving engineering design along the coast of Crescent City. In the present study, the extreme tsunami heights observed along the coast of Crescent City from 1938 to 2015 are fitted using six different probabilistic distributions, namely the Gumbel distribution, the Weibull distribution, the maximum entropy distribution, the lognormal distribution, the generalized extreme value distribution and the generalized Pareto distribution. The maximum likelihood method is applied to estimate the parameters of all the above distributions. Both the Kolmogorov-Smirnov test and the root mean square error method are utilized for goodness-of-fit testing, and the best-fitting distribution is selected. Assuming that the number of tsunamis occurring each year follows a Poisson distribution, the Poisson compound extreme value distribution can be used to fit the annual maximum tsunami amplitude, and then point and interval estimates of return tsunami heights are calculated for structural design. The results show that the Poisson compound extreme value distribution fits tsunami heights very well and is suitable for determining return tsunami heights for coastal disaster prevention.
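
    The Poisson compound extreme value distribution combines a Poisson count of events per year with a per-event height distribution F: the annual maximum then has CDF H(x) = exp(-lam*(1 - F(x))). A sketch with an illustrative per-event Gumbel CDF; the parameter values are hypothetical, not the Crescent City fit:

```python
import math

def compound_annual_max_cdf(x, lam, event_cdf):
    """CDF of the annual maximum when events arrive as Poisson(lam)
    per year, each event height having CDF event_cdf."""
    return math.exp(-lam * (1.0 - event_cdf(x)))

def compound_return_level(lam, event_cdf, T, lo, hi, tol=1e-6):
    """Height whose annual non-exceedance probability is 1 - 1/T,
    found by bisection on the monotone compound CDF."""
    target = 1.0 - 1.0 / T
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if compound_annual_max_cdf(mid, lam, event_cdf) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Illustrative per-event Gumbel CDF (loc = 0.8 m, scale = 0.5 m) and an
# assumed rate of 2 recorded tsunami events per year (both hypothetical).
F = lambda x: math.exp(-math.exp(-(x - 0.8) / 0.5))
h100 = compound_return_level(lam=2.0, event_cdf=F, T=100, lo=0.0, hi=20.0)
```

    Interval estimates around such a point value would come from the sampling uncertainty of the fitted parameters, as done in the study.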

  20. Extreme geomagnetically induced currents

    NASA Astrophysics Data System (ADS)

    Kataoka, Ryuho; Ngwira, Chigomezyo

    2016-12-01

    We propose an emergency alert framework for geomagnetically induced currents (GICs), based on empirical extreme values and theoretical upper limits of the solar wind parameters and of dB/dt, the time derivative of magnetic field variations at the ground. We expect this framework to be useful for preparing against extreme events. Our analysis is based on a review of various papers, including those presented during the Extreme Space Weather Workshops held in Japan in 2011, 2012, 2013, and 2014. Large-amplitude dB/dt values are the major cause of hazards associated with three different types of GICs: (1) slow dB/dt with ring current evolution (RC-type), (2) fast dB/dt associated with auroral electrojet activity (AE-type), and (3) transient dB/dt of sudden commencements (SC-type). We set "caution," "warning," and "emergency" alert levels during the main phase of superstorms with a peak Dst index of less than -300 nT (once per 10 years), -600 nT (once per 60 years), or -900 nT (once per 100 years), respectively. The extreme dB/dt values of the AE-type GICs are 2000, 4000, and 6000 nT/min at the caution, warning, and emergency levels, respectively. For the SC-type GICs, a "transient alert" is also proposed for dB/dt values of 40 nT/s at low latitudes and 110 nT/s at high latitudes, especially when the solar energetic particle flux is unusually high.
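
    The alert thresholds proposed above map directly to a lookup. A minimal sketch using the Dst and AE-type dB/dt values from the text (boundary handling, e.g. whether exactly -300 nT triggers "caution", is an assumption of this sketch):

```python
def dst_alert_level(peak_dst_nt):
    """Alert level from the peak Dst index (nT) of a superstorm."""
    if peak_dst_nt <= -900:
        return "emergency"   # roughly once per 100 years
    if peak_dst_nt <= -600:
        return "warning"     # roughly once per 60 years
    if peak_dst_nt <= -300:
        return "caution"     # roughly once per 10 years
    return "none"

def ae_gic_alert_level(dbdt_nt_per_min):
    """Alert level for AE-type GICs from dB/dt (nT/min)."""
    if dbdt_nt_per_min >= 6000:
        return "emergency"
    if dbdt_nt_per_min >= 4000:
        return "warning"
    if dbdt_nt_per_min >= 2000:
        return "caution"
    return "none"
```

    The SC-type "transient alert" (40 nT/s at low latitudes, 110 nT/s at high latitudes) would be an additional latitude-dependent check of the same form.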

  1. Total ozone patterns over the southern mid-latitudes: spatial correlations, extreme events and dynamical contributions

    NASA Astrophysics Data System (ADS)

    Rieder, Harald E.; di Rocco, Stefania; Staehelin, Johannes; Maeder, Jörg A.; Ribatet, Mathieu; Peter, Thomas; Davison, Anthony C.

    2010-05-01

    Tools from geostatistics and extreme value theory are applied to analyze spatial correlations in total ozone for the southern mid-latitudes. The dataset used in this study is the NIWA-assimilated total ozone dataset (Bodeker et al., 2001; Müller et al., 2008). Recently, new tools from extreme value theory (Coles, 2001; Ribatet, 2007) have been applied to the world's longest total ozone record from Arosa, Switzerland (e.g. Staehelin 1998a,b) and 5 other long-term ground-based stations to describe extreme events in low and high total ozone (Rieder et al., 2010a,b,c). Excursions in the frequency of extreme events reveal "fingerprints" of dynamical factors such as ENSO or NAO, and chemical factors, such as cold Arctic vortex ozone losses, as well as major volcanic eruptions of the 20th century (e.g. Gunung Agung, El Chichón, Mt. Pinatubo). Furthermore, atmospheric loading of ozone-depleting substances has led to a continuous modification of column ozone in the northern hemisphere, also with respect to extreme values (partly again in connection with polar vortex contributions). It is shown that application of extreme value theory allows the identification of many more such fingerprints than conventional time series analysis on the basis of annual and seasonal mean values. In particular, the analysis shows the strong influence of dynamics, revealing that even moderate ENSO and NAO events have a discernible effect on total ozone (Rieder et al., 2010b,c). Within the current study, patterns in spatial correlation and frequency distributions of extreme events (e.g. ELOs and EHOs) are studied for the southern mid-latitudes. We analyze whether "fingerprints" found for features in the northern hemisphere also occur in the southern mid-latitudes. New insights into spatial patterns of total ozone for the southern mid-latitudes are presented. Within this study the influence of changes in atmospheric dynamics (e.g. 
tropospheric and lower stratospheric pressure systems, ENSO) as well as the influence of major volcanic eruptions (e.g. Mt. Pinatubo) and ozone depleting substances (ODS) on column ozone over the southern mid-latitudes is analyzed for the time period 1979-2007. References: Bodeker, G.E., J.C. Scott, K. Kreher, and R.L. McKenzie, Global ozone trends in potential vorticity coordinates using TOMS and GOME intercompared against the Dobson network: 1978-1998, J. Geophys. Res., 106 (D19), 23029-23042, 2001. Coles, S.: An Introduction to Statistical Modeling of Extreme Values, Springer Series in Statistics, ISBN:1852334592, Springer, Berlin, 2001. Müller, R., Grooß, J.-U., Lemmen, C., Heinze, D., Dameris, M., and Bodeker, G.: Simple measures of ozone depletion in the polar stratosphere, Atmos. Chem. Phys., 8, 251-264, 2008. Ribatet, M.: POT: Modelling peaks over a threshold, R News, 7, 34-36, 2007. Rieder, H.E., Staehelin, J., Maeder, J.A., Ribatet, M., Stübi, R., Weihs, P., Holawe, F., Peter, T., and Davison, A.C. (2010): Extreme events in total ozone over Arosa - Part I: Application of extreme value theory, to be submitted to ACPD. Rieder, H.E., Staehelin, J., Maeder, J.A., Ribatet, M., Stübi, R., Weihs, P., Holawe, F., Peter, T., and Davison, A.C. (2010): Extreme events in total ozone over Arosa - Part II: Fingerprints of atmospheric dynamics and chemistry and effects on mean values and long-term changes, to be submitted to ACPD. Rieder, H.E., Jancso, L.M., Staehelin, J., Maeder, J.A., Ribatet, M., Peter, T., and Davison, A.C. (2010): Extreme events in total ozone over the northern mid-latitudes: A case study based on long-term data sets from 5 ground-based stations, in preparation. Staehelin, J., Kegel, R., and Harris, N. R.: Trend analysis of the homogenized total ozone series of Arosa (Switzerland), 1929-1996, J. Geophys. Res., 103(D7), 8389-8400, doi:10.1029/97JD03650, 1998a. 
Staehelin, J., Renaud, A., Bader, J., McPeters, R., Viatte, P., Hoegger, B., Bugnion, V., Giroud, M., and Schill, H.: Total ozone series at Arosa (Switzerland): Homogenization and data comparison, J. Geophys. Res., 103(D5), 5827-5842, doi:10.1029/97JD02402, 1998b.

  2. Decadal oscillations and extreme value distribution of river peak flows in the Meuse catchment

    NASA Astrophysics Data System (ADS)

    De Niel, Jan; Willems, Patrick

    2017-04-01

    In flood risk management, flood probabilities are often quantified through Generalized Pareto distributions of river peak flows. One of the main underlying assumptions is that all data points originate from one single underlying distribution (the i.i.d. assumption). However, this hypothesis, although generally assumed to be correct for variables such as river peak flows, remains somewhat questionable: flooding might indeed be caused by different hydrological and/or meteorological conditions. This study confirms findings from previous research by showing a clear indication of the link between atmospheric conditions and flooding for the Meuse river in The Netherlands: decadal oscillations of river peak flows can (at least partially) be attributed to the occurrence of westerly weather types. The study further proposes a method to take this correlation between atmospheric conditions and river peak flows into account when calibrating an extreme value distribution for river peak flows. Rather than calibrating one single distribution to the data and potentially violating the i.i.d. assumption, weather-type-dependent extreme value distributions are derived and composed. The study shows that, for the Meuse river in The Netherlands, such an approach results in a more accurate extreme value distribution, especially with regard to extrapolations. Comparison of the proposed method with a traditional extreme value analysis approach and an alternative model-based approach for the same case study shows strong differences in the peak flow extrapolation. The design flood for a 1,250-year return period is estimated at 4,800 m3/s for the proposed method, compared with 3,450 m3/s and 3,900 m3/s for the traditional method and a previous study. The methods were validated against instrumental and documentary flood information from the past 500 years.
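
    The design flood behind a Generalized Pareto (peaks-over-threshold) fit has a closed form once the threshold, GPD parameters and exceedance rate are known. A sketch with illustrative parameters, not the Meuse calibration:

```python
import math

def gpd_return_level(u, sigma, xi, rate, T):
    """T-year return level from a peaks-over-threshold fit.

    u     -- threshold (same units as the flow)
    sigma -- GPD scale, xi -- GPD shape
    rate  -- mean number of threshold exceedances per year
    """
    m = rate * T                   # expected number of exceedances in T years
    if abs(xi) < 1e-9:             # exponential-tail limit
        return u + sigma * math.log(m)
    return u + (sigma / xi) * (m ** xi - 1.0)

# Illustrative (hypothetical) values: threshold 1500 m3/s, sigma = 300,
# xi = 0.05, 3 exceedances/year on average; 1,250-year design flood.
q1250 = gpd_return_level(u=1500.0, sigma=300.0, xi=0.05, rate=3.0, T=1250.0)
```

    In the weather-type-dependent approach above, a separate (u, sigma, xi, rate) set per weather type would be fitted and the resulting distributions composed before extrapolating.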

  3. I know why you voted for Trump: (Over)inferring motives based on choice.

    PubMed

    Barasz, Kate; Kim, Tami; Evangelidis, Ioannis

    2018-05-10

    People often speculate about why others make the choices they do. This paper investigates how such inferences are formed as a function of what is chosen. Specifically, when observers encounter someone else's choice (e.g., of political candidate), they use the chosen option's attribute values (e.g., a candidate's specific stance on a policy issue) to infer the importance of that attribute (e.g., the policy issue) to the decision-maker. Consequently, when a chosen option has an attribute whose value is extreme (e.g., an extreme policy stance), observers infer, sometimes incorrectly, that this attribute disproportionately motivated the decision-maker's choice. Seven studies demonstrate how observers use an attribute's value to infer its weight (the value-weight heuristic) and identify the role of perceived diagnosticity: more extreme attribute values give observers the subjective sense that they know more about a decision-maker's preferences and, in turn, increase the attribute's perceived importance. The paper explores how this heuristic can produce erroneous inferences and influence broader beliefs about decision-makers. Copyright © 2018 Elsevier B.V. All rights reserved.

  4. A new method of small target detection based on neural network

    NASA Astrophysics Data System (ADS)

    Hu, Jing; Hu, Yongli; Lu, Xinxin

    2018-02-01

    The detection and tracking of moving dim targets in infrared images has been a research hotspot for many years. The target in each frame occupies only a few pixels, without any shape or structure information. Moreover, an infrared small target is often submerged in a complicated background with a low signal-to-clutter ratio, making detection very difficult. Different backgrounds exhibit different statistical properties, which makes detecting the target extremely complex. If the threshold segmentation is not reasonable, more noise points may remain in the final detection, which is unfavorable for recovering the trajectory of the target. Single-frame target detection may fail to find the desired target and cause a high false alarm rate. We believe the combination of spatial detection of suspicious targets in each frame and temporal association for target tracking will increase the reliability of tracking dim targets. The detection of dim targets is divided into two parts. In the first part, we adopt a bilateral filtering method for background suppression; after threshold segmentation, the suspicious targets in each frame are extracted. We then use an LSTM (long short-term memory) neural network to predict the coordinates of the target in the next frame. This is a new method based on the movement characteristics of the target in image sequences, which can capture the relationship between past and future values of the trajectory. Simulation results demonstrate that the proposed algorithm can effectively predict the trajectory of a moving small target and works efficiently and robustly with a low false alarm rate.

  5. Extreme Vertical Gusts in the Atmospheric Boundary Layer

    DTIC Science & Technology

    2015-07-01

    significant effect on the statistics of the rare, extreme gusts. In the lowest 5,000 ft, boundary layer effects make small to moderate vertical... 2.4 Effects of Gust Shape ... Definitions. Adiabatic Lapse Rate: the rate of change of temperature with altitude that would occur if a parcel of air was transported sufficiently

  6. Hot bats: extreme thermal tolerance in a desert heat wave.

    PubMed

    Bondarenco, Artiom; Körtner, Gerhard; Geiser, Fritz

    2014-08-01

    Climate change is predicted to increase temperature extremes and thus thermal stress on organisms. Animals living in hot deserts are already exposed to high ambient temperatures (Ta), making them especially vulnerable to further warming. However, little is known about the effect of extreme heat events on small desert mammals, especially tree-roosting microbats that are not strongly protected from environmental temperature fluctuations. During a heat wave with record Ta at Sturt National Park, we quantified the thermal physiology and behaviour of a single free-ranging little broad-nosed bat (Scotorepens greyii, henceforth Scotorepens) and two inland freetail bats (Mormopterus species 3, henceforth Mormopterus) using temperature telemetry over 3 days. On 11 and 13 January, maximum Ta was ∼45.0 °C, and all monitored bats were thermoconforming. On 12 January 2013, when Ta exceeded 48.0 °C, Scotorepens abandoned its poorly insulated roost during the daytime, whereas both Mormopterus remained in their better insulated roosts and were mostly thermoconforming. Maximum skin temperatures (Tskin) ranged from 44.0 to 44.3 °C in Scotorepens and from 40.0 to 45.8 °C in Mormopterus; these are the highest Tskin values reported for any free-ranging bat. Our study provides the first evidence of extensive heat tolerance in free-ranging desert microbats. It shows that these bats can tolerate the most extreme Tskin range known for mammals (3.3 to 45.8 °C) and delay regulation of Tskin by thermoconforming over a wide temperature range, thus decreasing the risks of dehydration and consequently death.

  7. Inter-model variability in hydrological extremes projections for Amazonian sub-basins

    NASA Astrophysics Data System (ADS)

    Andres Rodriguez, Daniel; Garofolo, Lucas; Lázaro de Siqueira Júnior, José; Samprogna Mohor, Guilherme; Tomasella, Javier

    2014-05-01

    Irreducible uncertainties arising from the limits of our knowledge, the chaotic nature of the climate system and the human decision-making process drive the uncertainties in climate change projections. These uncertainties affect impact studies, particularly those concerned with extreme events, and complicate the decision-making aimed at mitigation and adaptation. They also, however, open the possibility of exploratory analyses of a system's vulnerability under different scenarios. Using projections from several climate models allows these uncertainty issues to be addressed through multiple runs that explore a wide range of potential impacts and their implications for potential vulnerabilities. Statistical approaches to the analysis of extreme values are usually based on stationarity assumptions. However, nonstationarity is relevant at the time scales considered for extreme value analyses and can have large implications in complex dynamic systems, especially under climate change. It is therefore necessary to allow for nonstationarity in the statistical distribution parameters. We studied the dispersion in projections of hydrological extremes, using climate change projections from several climate models to drive the Distributed Hydrological Model of the National Institute for Space Research, MHD-INPE, applied to Amazonian sub-basins. MHD-INPE is a large-scale hydrological model that uses a TopModel approach to resolve runoff generation processes at the grid-cell scale. The model was calibrated for 1970-1990 using observed meteorological data, comparing observed and simulated discharges with several performance coefficients. Hydrological simulations were performed for the historical period (1970-1990) and for the future period (2010-2100). 
    Because climate models simulate the variability of the climate system in statistical terms rather than reproducing the historical behaviour of climate variables, the performance of the model runs over the historical period, when driven with climate model data, was assessed using descriptors of the flow duration curves. The analysis of projected extreme values was carried out allowing for nonstationarity in the GEV distribution parameters and compared with extreme events in the present climate. The results show that inter-model variability produces a broad dispersion in projected extreme values. Such dispersion implies different degrees of socio-economic impact associated with extreme hydrological events. Although no single optimum result exists, this variability enables the analysis of adaptation strategies and their potential vulnerabilities.
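
    The nonstationary-GEV idea used in this record can be sketched generically: a GEV whose location parameter drifts linearly in time, fitted by maximum likelihood. This is a minimal illustration with synthetic annual maxima and SciPy's optimizer, not the MHD-INPE analysis; the variable names and the linear-trend form are assumptions for the sketch.

```python
# Sketch: maximum-likelihood fit of a nonstationary GEV whose location
# parameter drifts linearly in time, mu(t) = mu0 + mu1 * t.
# Synthetic annual maxima only; this is NOT the MHD-INPE analysis.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import genextreme

rng = np.random.default_rng(7)
t = np.arange(50, dtype=float)                  # 50 "years"
x = rng.gumbel(loc=100.0 + 0.4 * t, scale=8.0)  # drifting annual maxima

def nll(params):
    """Negative log-likelihood of the nonstationary GEV."""
    mu0, mu1, log_scale, c = params
    return -genextreme.logpdf(x, c, loc=mu0 + mu1 * t,
                              scale=np.exp(log_scale)).sum()

res = minimize(nll, x0=[x.mean(), 0.0, np.log(x.std()), 0.1],
               method="Nelder-Mead", options={"maxfev": 5000})
mu0, mu1, log_scale, c = res.x
print(mu1)  # estimated trend in the location parameter
```

    Comparing this fit against a stationary model (mu1 fixed at 0) with a likelihood-ratio test is the usual way to decide whether the trend term is warranted.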

  8. A new framework for estimating return levels using regional frequency analysis

    NASA Astrophysics Data System (ADS)

    Winter, Hugo; Bernardara, Pietro; Clegg, Georgina

    2017-04-01

    We propose a new framework for incorporating more spatial and temporal information into the estimation of extreme return levels. Currently, most studies use extreme value models applied to data from a single site; an approach which is inefficient statistically and leads to return level estimates that are less physically realistic. We aim to highlight the benefits that could be obtained by using methodology based upon regional frequency analysis as opposed to classic single site extreme value analysis. This motivates a shift in thinking, which permits the evaluation of local and regional effects and makes use of the wide variety of data that are now available on high temporal and spatial resolutions. The recent winter storms over the UK during the winters of 2013-14 and 2015-16, which have caused wide-ranging disruption and damaged important infrastructure, provide the main motivation for the current work. One of the most impactful natural hazards is flooding, which is often initiated by extreme precipitation. In this presentation, we focus on extreme rainfall, but shall discuss other meteorological variables alongside potentially damaging hazard combinations. To understand the risks posed by extreme precipitation, we need reliable statistical models which can be used to estimate quantities such as the T-year return level, i.e. the level which is expected to be exceeded once every T-years. Extreme value theory provides the main collection of statistical models that can be used to estimate the risks posed by extreme precipitation events. Broadly, at a single site, a statistical model is fitted to exceedances of a high threshold and the model is used to extrapolate to levels beyond the range of the observed data. However, when we have data at many sites over a spatial domain, fitting a separate model for each separate site makes little sense and it would be better if we could incorporate all this information to improve the reliability of return level estimates. 
Here, we use the regional frequency analysis approach to define homogeneous regions which are affected by the same storms. Extreme value models are then fitted to the data pooled from across a region. We find that this approach leads to more spatially consistent return level estimates with reduced uncertainty bounds.
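
    The pooling step of regional frequency analysis can be illustrated with a minimal sketch: exceedances at each site are rescaled by a site-specific index (here, the site threshold) and pooled before a single regional GPD is fitted. The synthetic data, the choice of index, and the 97% threshold are illustrative assumptions, not details from the study.

```python
# Sketch of the pooling idea behind regional frequency analysis (RFA):
# scale each site's threshold exceedances by a site index (here the site
# threshold itself) and fit one regional GPD to the pooled sample.
# All data and choices are synthetic/illustrative.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(1)
excesses = []
for site in range(8):
    daily = rng.gamma(shape=2.0, scale=3.0 + site, size=2000)  # site rainfall
    u = np.quantile(daily, 0.97)                               # site threshold
    excesses.append((daily[daily > u] - u) / u)                # scaled excesses

pooled = np.concatenate(excesses)
shape, _, scale = genpareto.fit(pooled, floc=0.0)  # regional "growth curve"
print(pooled.size, shape, scale)
```

    Fitting one model to the pooled sample is what reduces the uncertainty bounds relative to eight separate single-site fits.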

  9. Contrasting the projected change in extreme extratropical cyclones in the two hemispheres

    NASA Astrophysics Data System (ADS)

    Chang, E. K. M.

    2017-12-01

    Extratropical cyclones form an important part of the global circulation. They are responsible for much of the high impact weather in the mid-latitudes, including heavy precipitation, strong winds, and coastal storm surges. They are also the surface manifestation of baroclinic waves that are responsible for much of the transport of momentum, heat, and moisture across the mid-latitudes. Thus how these storms will change in the future is of much general interest. In particular, how the frequency of extreme cyclones will change is of most concern, since these are the storms that cause the most damage. While the projection of a poleward shift of the Southern Hemisphere storm track and cyclone activity is widely accepted, together with a small decrease in the total number of extratropical cyclones, as discussed in the 5th Assessment Report of the Intergovernmental Panel on Climate Change (IPCC AR5), projected change in cyclone intensity is still rather uncertain. Several studies have suggested that cyclone intensity, in terms of the absolute value of sea level pressure (SLP) minima or SLP perturbations, is projected to increase under global warming. However, other studies found no increase in wind speed around extratropical cyclones. In this study, CMIP5 multi-model projections of how the frequency of extreme cyclones, defined in terms of near-surface wind intensity, may change under global warming are examined. Results suggest a significant increase in the occurrence of extreme cyclones in the Southern Hemisphere. In the Northern Hemisphere, CMIP5 models project a northeastward shift in extreme cyclone activity over the Pacific, and a significant decrease over the Atlantic. Substantial differences are also found between projected changes in near-surface wind intensity and wind intensity at 850 hPa, suggesting that wind change at 850 hPa is not a good proxy for change in surface wind intensity. 
Finally, projected changes in the large scale environment are examined to understand the dynamics behind these contrasting projected changes.

  10. Future changes of precipitation characteristics in China

    NASA Astrophysics Data System (ADS)

    Wu, S.; Wu, Y.; Wen, J.

    2017-12-01

    Global warming has the potential to alter the hydrological cycle, with significant impacts on human society, the environment and ecosystems. This study provides a detailed assessment of potential changes in precipitation characteristics in China using a suite of 12 high-resolution CMIP5 climate models under a medium and a high Representative Concentration Pathway: RCP4.5 and RCP8.5. We examine future changes over the entire distribution of precipitation, and identify any shift in the shape and/or scale of the distribution. In addition, we use extreme-value theory to evaluate the change in probability and magnitude of extreme precipitation events. Overall, China is projected to experience an increase in total precipitation (by 8% under RCP4.5 and 12% under RCP8.5). This increase is spatially uneven, with larger increases in the west and smaller increases in the east. Precipitation frequency is projected to increase in the west and decrease in the east. Under RCP4.5, the overall precipitation frequency for the whole of China remains largely unchanged (0.08%). However, RCP8.5 projects a more significant decrease in frequency for a large part of China, resulting in an overall decrease of 2.08%. Precipitation intensity is likely to increase more uniformly, with an overall increase of 11% under RCP4.5 and 19% under RCP8.5. Precipitation increases across the entire distribution, but the increase is larger for higher quantiles, i.e. strong events. The relative contribution of small quantiles is likely to decrease, whereas the contribution from heavy events is likely to increase. Extreme precipitation increases at much higher rates than average precipitation, with higher rates of increase expected for more extreme events: 1-year events are likely to increase by 15% and 20-year events by 21% under RCP4.5, and by 26% and 40% respectively under RCP8.5. The increase in extreme events is likely to be more spatially uniform.

  11. Forest bat population dynamics over 14 years at a climate refuge: Effects of timber harvesting and weather extremes.

    PubMed

    Law, Bradley S; Chidel, Mark; Law, Peter R

    2018-01-01

    Long-term data are needed to explore the interaction of weather extremes with habitat alteration; in particular, can 'refugia' buffer population dynamics against climate change and are they robust to disturbances such as timber harvesting. Because forest bats are good indicators of ecosystem health, we used 14 years (1999-2012) of mark-recapture data from a suite of small tree-hollow roosting bats to estimate survival, abundance and body condition in harvested and unharvested forest and over extreme El Niño and La Niña weather events in southeastern Australia. Trapping was replicated within an experimental forest, located in a climate refuge, with different timber harvesting treatments. We trapped foraging bats and banded 3043 with a 32% retrap rate. Mark-recapture analyses allowed for dependence of survival on time, species, sex, logging treatment and for transients. A large portion of the population remained resident, with a maximum time to recapture of nine years. The effect of logging history (unlogged vs 16-30 years post-logging regrowth) on apparent survival was minor and species specific, with no detectable effect for two species, a positive effect for one and negative for the other. There was no effect of logging history on abundance or body condition for any of these species. Apparent survival of residents was not strongly influenced by weather variation (except for the smallest species), unlike previous studies outside of refugia. Despite annual variation in abundance and body condition across the 14 years of the study, no relationship with extreme weather was evident. The location of our study area in a climate refuge potentially buffered bat population dynamics from extreme weather. These results support the value of climate refugia in mitigating climate change impacts, though the lack of an external control highlights the need for further studies on the functioning of climate refugia. 
Relatively stable population dynamics were not compromised by timber harvesting, suggesting ecologically sustainable harvesting may be compatible with climate refugia.

  12. Research in Stochastic Processes

    DTIC Science & Technology

    1988-10-10

    To appear in Proceedings Volume, Oberwolfach Conf. on Extreme Value Theory, Ed. J. Hüsler and R. Reiss, Springer. 4. M.R. Leadbetter. The exceedance...Hsing, J. Hüsler and M.R. Leadbetter, On the exceedance point process for a stationary sequence, Probability Theor. Rel. Fields, 20, 1988, 97-112 Z.J...Oberwolfach Conf. on Extreme Value Theory. J. Hüsler and R. Reiss, eds., Springer, to appear V. Mandrekar, On a limit theorem and invariance

  13. Microstructure-Sensitive Extreme Value Probabilities for High Cycle Fatigue of Ni-Base Superalloy IN100 (Preprint)

    DTIC Science & Technology

    2009-03-01

    transition fatigue regimes; however, microplasticity (i.e., heterogeneous plasticity at the scale of the microstructure) is relevant to understanding fatigue...and Socie [57] considered the effect of microplasticity...considers the local stress state as affected by intergranular interactions and microplasticity. For the calculations given below, the volumes over which

  14. Modeling annual extreme temperature using generalized extreme value distribution: A case study in Malaysia

    NASA Astrophysics Data System (ADS)

    Hasan, Husna; Salam, Norfatin; Kassim, Suraiya

    2013-04-01

    Extreme temperatures at several stations in Malaysia are modeled by fitting the annual maxima to the Generalized Extreme Value (GEV) distribution. The Augmented Dickey-Fuller (ADF) and Phillips-Perron (PP) tests are used to detect stochastic trends among the stations. The Mann-Kendall (MK) test suggests a non-stationary model. Three models are considered for stations with a trend, and the Likelihood Ratio test is used to determine the best-fitting model. The results show that the Subang and Bayan Lepas stations favour a model that is linear in the location parameter, while the Kota Kinabalu and Sibu stations are better fitted by a model that is linear in the logarithm of the scale parameter. The return level, i.e. the level of events (maximum temperature) expected to be exceeded once, on average, in a given number of years, is also obtained.
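
    A minimal sketch of the block-maxima approach described in this record, assuming SciPy is available: fit a GEV to (synthetic) annual maximum temperatures and read off the T-year return level as the (1 - 1/T) quantile. The data and parameter values are illustrative, not the Malaysian station records.

```python
# Sketch: fit a GEV to synthetic annual maximum temperatures and estimate
# the T-year return level; values are illustrative, not the station data.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(42)
annual_max = 35.0 + rng.gumbel(loc=0.0, scale=1.2, size=40)  # deg C

# SciPy's genextreme shape c corresponds to -xi in the usual GEV notation
c, loc, scale = genextreme.fit(annual_max)

def return_level(T):
    """Level exceeded once every T years on average: (1 - 1/T) quantile."""
    return genextreme.isf(1.0 / T, c, loc, scale)

print(return_level(10), return_level(50))
```

    A nonstationary variant, as favoured for the stations with trends, would replace the constant `loc` (or `log(scale)`) with a linear function of time inside a custom likelihood.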

  15. Spatial extreme value analysis to project extremes of large-scale indicators for severe weather

    PubMed Central

    Gilleland, Eric; Brown, Barbara G; Ammann, Caspar M

    2013-01-01

    Concurrently high values of the maximum potential wind speed of updrafts (Wmax) and 0–6 km wind shear (Shear) have been found to represent conducive environments for severe weather, which subsequently provides a way to study severe weather in future climates. Here, we employ a model for the product of these variables (WmSh) from the National Center for Atmospheric Research/United States National Center for Environmental Prediction reanalysis over North America conditioned on their having extreme energy in the spatial field in order to project the predominant spatial patterns of WmSh. The approach is based on the Heffernan and Tawn conditional extreme value model. Results suggest that this technique estimates the spatial behavior of WmSh well, which allows for exploring possible changes in the patterns over time. While the model enables a method for inferring the uncertainty in the patterns, such analysis is difficult with the currently available inference approach. A variation of the method is also explored to investigate how this type of model might be used to qualitatively understand how the spatial patterns of WmSh correspond to extreme river flow events. A case study for river flows from three rivers in northwestern Tennessee is studied, and it is found that advection of WmSh from the Gulf of Mexico prevails while elsewhere, WmSh is generally very low during such extreme events. © 2013 The Authors. Environmetrics published by John Wiley & Sons, Ltd. PMID:24223482

  16. Surface atmospheric extremes (Launch and transportation areas)

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The effects of extreme values of surface and low altitude atmospheric parameters on space vehicle design, tests, and operations are discussed. Atmospheric extremes from the surface to 150 meters for geographic locations of interest to NASA are given. Thermal parameters (temperature and solar radiation), humidity, pressure, and atmospheric electricity (lightning and static) are presented. Weather charts and tables are included.

  17. Extremely preterm infants who are small for gestational age have a high risk of early hypophosphatemia and hypokalemia.

    PubMed

    Boubred, F; Herlenius, E; Bartocci, M; Jonsson, B; Vanpée, M

    2015-11-01

    Electrolyte balances have not been sufficiently evaluated in extremely preterm infants after early parenteral nutrition. We investigated the risk of early hypophosphatemia and hypokalemia in extremely preterm infants born small for gestational age (SGA) who received nutrition as currently recommended. This prospective, observational cohort study included all consecutive extremely preterm infants born at 24-27 weeks who received high amino acids and lipid perfusion from birth. We evaluated the electrolyte levels of SGA infants and infants born appropriate for gestational age (AGA) during the first five days of life. The 12 SGA infants had lower plasma potassium levels from Day One compared to the 36 AGA infants and were more likely to have hypokalemia (58% vs 17%, p = 0.001) and hypophosphatemia (40% vs 9%, p < 0.01) during the five-day observation period. After adjusting for perinatal factors, SGA remained significantly associated with hypophosphatemia (odds ratio 1.39, confidence intervals 1.07-1.81, p = 0.01). Extremely preterm infants born SGA who were managed with currently recommended early parenteral nutrition had a high risk of early hypokalemia and hypophosphatemia. Potassium and phosphorus intakes should be set at sufficient levels from birth onwards, especially in SGA infants. ©2015 Foundation Acta Paediatrica. Published by John Wiley & Sons Ltd.

  18. Cultural definitions of elder maltreatment in Portugal.

    PubMed

    Mercurio, Andrea E; Nyborn, Justin

    2006-01-01

    A small convenience sample of 34 participants (17 males, 17 females) from the Portuguese islands of the Azores and Madeira were asked to provide examples of how extreme, moderate, and mild maltreatment towards an elder would be defined in their culture and society. Neglect, especially psychological neglect, physical maltreatment, and psychological maltreatment were the most frequently reported types of maltreatment. References to neglect and physical maltreatment appeared most often as examples of extreme maltreatment. In general, men were somewhat more likely than women to provide examples of physical aggression in their examples of maltreatment. As examples of extreme maltreatment, females provided significantly more examples of abandonment than males. Although interpretations of the findings must be cautious because of the small sample size and limited statistical power, the study illustrates a procedure for assessing constructs of elder mistreatment in a way that attends to respondents' own constructions of the phenomenon.

  19. An Overview of 2014 SBIR Phase I and Phase II Materials Structures for Extreme Environments

    NASA Technical Reports Server (NTRS)

    Nguyen, Hung D.; Steele, Gynelle C.; Morris, Jessica R.

    2015-01-01

    NASA's Small Business Innovation Research (SBIR) program focuses on technological innovation by investing in development of innovative concepts and technologies to help NASA mission directorates address critical research needs for Agency programs. This report highlights nine of the innovative SBIR 2014 Phase I and Phase II projects that emphasize one of NASA Glenn Research Center's six core competencies-Materials and Structures for Extreme Environments. The technologies cover a wide spectrum of applications such as high temperature environmental barrier coating systems, deployable space structures, solid oxide fuel cells, and self-lubricating hard coatings for extreme temperatures. Each featured technology describes an innovation, technical objective, and highlights NASA commercial and industrial applications. This report provides an opportunity for NASA engineers, researchers, and program managers to learn how NASA SBIR technologies could help their programs and projects, and lead to collaborations and partnerships between the small SBIR companies and NASA that would benefit both.

  20. Extreme Events: low and high total ozone over Arosa, Switzerland

    NASA Astrophysics Data System (ADS)

    Rieder, H. E.; Staehelin, J.; Maeder, J. A.; Ribatet, M.; Stübi, R.; Weihs, P.; Holawe, F.; Peter, T.; Davison, A. C.

    2009-04-01

    The frequency distribution of days with extreme low (termed ELOs) and high (termed EHOs) total ozone is analyzed for the world's longest total ozone record (Arosa, Switzerland - for details see Staehelin et al.,1998a,b), with new tools from extreme value theory (e.g. Coles, 2001; Ribatet, 2007). A heavy-tail focused approach is used through the fitting of the Generalized Pareto Distribution (GPD) to the Arosa time series. Asymptotic arguments (Pickands, 1975) justify the use of the GPD for modeling exceedances over a high (or below a low) enough threshold (Coles, 2001). The analysis shows that the GPD is appropriate for modeling the frequency distribution in total ozone above or below a mathematically well-defined threshold. While previous studies focused on so termed ozone mini-holes and mini-highs (e.g. Bojkov and Balis, 2001, Koch et al., 2005), this study is the first to present a mathematical description of extreme events in low and high total ozone for a northern mid-latitudes site (Rieder et al., 2009). The results show (a) an increase in days with extreme low (ELOs) and (b) a decrease in days with extreme high total ozone (EHOs) during the last decades, (c) that the general trend in total ozone is strongly determined by these extreme events and (d) that fitting the GPD is an appropriate method for the estimation of the frequency distribution of so-called ozone mini-holes. Furthermore, this concept allows one to separate the effect of Arctic ozone depletion from that of in situ mid-latitude ozone loss. As shown by this study, ELOs and EHOs have a strong influence on mean values in total ozone and the "extremes concept" could be further used also for validation of Chemistry-Climate-Models (CCMs) within the scientific community. References: Bojkov, R. D., and Balis, D.S.: Characteristics of episodes with extremely low ozone values in the northern middle latitudes 1975-2000, Ann. Geophys., 19, 797-807, 2001. 
Coles, S.: An Introduction to Statistical Modeling of Extreme Values, Springer Series in Statistics, ISBN:1852334592, Springer, Berlin, 2001. Koch, G., H. Wernli, C. Schwierz, J. Staehelin, and T. Peter (2005), A composite study on the structure and formation of ozone miniholes and minihighs over central Europe, Geophys. Res. Lett., 32, L12810, doi:10.1029/2004GL022062. Pickands, J.: Statistical-Inference using extreme order Statistics, Ann. Stat., 3, 1, 119-131, 1975. Ribatet, M.: POT: Modelling peaks over a threshold, R News, 7, 34-36, 2007. Rieder, H.E., Staehelin, J., Maeder, J.A., Ribatet, M., Stübi, R., Weihs, P., Holawe, F., Peter, T., and Davison, A.C.: From ozone mini holes and mini highs towards extreme value theory: New insights from extreme events and non stationarity, submitted to J. Geophys. Res., 2009. Staehelin, J., Kegel, R., and Harris, N. R.: Trend analysis of the homogenized total ozone series of Arosa (Switzerland), 1929-1996, J. Geophys. Res., 103(D7), 8389-8400, doi:10.1029/97JD03650, 1998a. Staehelin, J., Renaud, A., Bader, J., McPeters, R., Viatte, P., Hoegger, B., Bugnion, V., Giroud, M., and Schill, H.: Total ozone series at Arosa (Switzerland): Homogenization and data comparison, J. Geophys. Res., 103(D5), 5827-5842, doi:10.1029/97JD02402, 1998b.
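
    The GPD threshold-exceedance model used in this record can be sketched as follows, assuming SciPy; the daily values are synthetic stand-ins, not the Arosa series, and the 95% threshold is an illustrative choice.

```python
# Sketch: peaks-over-threshold fit of a Generalized Pareto Distribution
# (GPD) to exceedances over a high threshold. Synthetic stand-in data,
# not the Arosa total-ozone series; the 95% threshold is illustrative.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)
daily = rng.normal(loc=330.0, scale=25.0, size=5000)  # "total ozone" (DU)

u = np.quantile(daily, 0.95)           # high threshold
excess = daily[daily > u] - u          # exceedances over u
shape, _, scale = genpareto.fit(excess, floc=0.0)

# POT decomposition: P(X > z) = P(X > u) * P(X - u > z - u | X > u)
p_u = (daily > u).mean()
def exceed_prob(z):
    return p_u * genpareto.sf(z - u, shape, loc=0.0, scale=scale)

print(exceed_prob(u + 20.0))
```

    For extreme lows (ELOs), the same fit is applied to deficits below a low threshold, i.e. to `u_low - daily` where `daily < u_low`.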

  1. Extreme habitats as refuge from parasite infections? Evidence from an extremophile fish

    NASA Astrophysics Data System (ADS)

    Tobler, Michael; Schlupp, Ingo; García de León, Francisco J.; Glaubrecht, Matthias; Plath, Martin

    2007-05-01

    Living in extreme habitats typically requires costly adaptations of any organism tolerating these conditions, but very little is known about potential benefits that trade off these costs. We suggest that extreme habitats may function as refuge from parasite infections, since parasites can become locally extinct either directly, through selection by an extreme environmental parameter on free-living parasite stages, or indirectly, through selection on other host species involved in its life cycle. We tested this hypothesis in a small freshwater fish, the Atlantic molly ( Poecilia mexicana) that inhabits normal freshwaters as well as extreme habitats containing high concentrations of toxic hydrogen sulfide. Populations from such extreme habitats are significantly less parasitized by the trematode Uvulifer sp. than a population from a non-sulfidic habitat. We suggest that reduced parasite prevalence may be a benefit of living in sulfidic habitats.

  2. Fourier fringe analysis and its application to metrology of extreme physical phenomena: a review [Invited].

    PubMed

    Takeda, Mitsuo

    2013-01-01

    The paper reviews a technique for fringe analysis referred to as Fourier fringe analysis (FFA) or the Fourier transform method, with a particular focus on its application to metrology of extreme physical phenomena. Examples include the measurement of extremely small magnetic fields with subfluxon sensitivity by electron wave interferometry, subnanometer wavefront evaluation of projection optics for extreme UV lithography, the detection of sub-Ångstrom distortion of a crystal lattice, and the measurement of ultrashort optical pulses in the femtosecond to attosecond range, which show how the advantages of FFA are exploited in these cutting edge applications.

  3. ANOMALOUS RELATIVE AR/CA CORONAL ABUNDANCES OBSERVED BY THE HINODE/EUV IMAGING SPECTROMETER NEAR SUNSPOTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doschek, G. A.; Warren, H. P.; Feldman, U.

    2015-07-20

    In determining the element abundance of argon (a high first ionization potential (FIP) element) relative to calcium (a low-FIP element) in flares, unexpectedly high intensities of two Ar XIV lines (194.40, 187.96 Å) relative to the intensity of a Ca XIV line (193.87 Å) were found in small (a few arcseconds) regions near sunspots in flare spectra recorded by the Extreme-ultraviolet Imaging Spectrometer on the Hinode spacecraft. In the most extreme case the Ar XIV line intensity relative to the Ca XIV intensity was 7 times the value expected from the photospheric abundance ratio, which is about 30 times the abundance of argon relative to calcium in active regions, i.e., the measured Ar/Ca abundance ratio is about 10 instead of 0.37 as in active regions. The Ar XIV and Ca XIV lines are formed near 3.4 MK and have very similar contribution functions. This is the first observation of the inverse FIP effect in the Sun. Other regions show increases of 2–3 over photospheric abundances, or just photospheric abundances. This phenomenon appears to occur rarely and only over small areas of flares away from the regions containing multi-million degree plasma, but more work is needed to quantify the occurrences and their locations. In the bright hot regions of flares the Ar/Ca abundance ratio is coronal, i.e., the same as in active regions. In this Letter we show three examples of the inverse FIP effect.

  4. Head growth and neurocognitive outcomes.

    PubMed

    Wright, Charlotte M; Emond, Alan

    2015-06-01

    There is a lack of evidence on the value of head circumference (HC) as a screening measure. We aimed to describe the incidence of head centile shifting and the relationship between extremes of head size and later neurodevelopmental problems in the Avon Longitudinal Study of Parents and Children. HC was measured routinely at 2, 9, and 18 or 24 months and by researchers at ages 4, 8, 12, and 18 months. IQ according to the Wechsler Intelligence Scale for Children was measured in research clinics at age 8 for all. Neurocognitive disorders (NCDs) were identified from chart review. There were 10 851 children with ≥2 head measurements. At each age, 2% to 3% of children had HC SD scores below -2 or above +2, but for most children this was found at only 1 age. More than 15% of children showed centile shifts, but less than one-third of these were sustained at subsequent measurements. Only 0.5% showed a sustained shift beyond the normal range. Children with consistently small heads were up to 7 times more likely to have an NCD, but 85% of children with small heads had no NCDs, and 93% of children with NCDs had head SD scores within the normal range. Centile shifts within the normal range occur commonly and seem mainly to reflect measurement error. This finding makes robust assessment of the head trajectory difficult and may result in many children being investigated unnecessarily. Extreme head size is neither specific nor sensitive for detecting NCDs, suggesting that routine measurement of HC is unhelpful. Copyright © 2015 by the American Academy of Pediatrics.

  5. Retainment of r-process material in dwarf galaxies

    NASA Astrophysics Data System (ADS)

    Beniamini, Paz; Dvorkin, Irina; Silk, Joe

    2018-04-01

    The synthesis of r-process elements is known to involve extremely energetic explosions. At the same time, recent observations find significant r-process enrichment even in extremely small ultra-faint dwarf (UFD) galaxies. This raises the question of retainment of those elements within their hosts. We estimate the retainment fraction and find that it is large, ˜0.9, unless the r-process event is very energetic (≳10^52 erg) and/or the host has lost a large fraction of its gas prior to the event. We estimate the r-process mass per event and rate as implied by abundances in UFDs, taking into account imperfect retainment and different models of UFD evolution. The results are consistent with previous estimates (Beniamini et al. 2016b) and with the constraints from the recently detected macronova accompanying a neutron star merger (GW170817). We also estimate the distribution of abundances predicted by these models. We find that a fraction of ˜0.07 of UFDs should show r-process enrichment. The results are consistent with both the mean values and the fluctuations of [Eu/Fe] in galactic metal poor stars, supporting the possibility that UFDs are the main 'building blocks' of the galactic halo population.

  6. Fasting and nonfasting triglycerides in cardiovascular and other diseases.

    PubMed

    Piťha, J; Kovář, J; Blahová, T

    2015-01-01

    Moderately elevated plasma/serum triglycerides (2-10 mmol/l) signal an increased risk of cardiovascular disease or the presence of non-alcoholic steatohepatitis. Extremely elevated triglycerides (more than 10 mmol/l) signal an increased risk of pancreatitis and lipemia retinalis. The concentration of triglycerides is regulated by many genetic and nongenetic factors. Extremely elevated triglycerides not provoked by nutritional factors, especially inappropriate alcohol intake, are more likely to have a monogenic cause. By contrast, mildly to moderately elevated triglycerides are often caused by polygenic disorders; these can also be associated with central obesity, insulin resistance, and diabetes mellitus. The concentration of triglycerides is also closely interconnected with the presence of atherogenic remnant lipoproteins, impaired reverse cholesterol transport and the more atherogenic small LDL particles. In general, there is a tight association between triglycerides and many other metabolic factors, including intermediate products of lipoprotein metabolism, which are frequently atherogenic. Reliable evaluation of the independent role of triglycerides, especially in atherosclerosis and cardiovascular disease, is therefore difficult. In individual cases, the values of HDL cholesterol, non-HDL cholesterol (total minus HDL cholesterol), non-HDL/non-LDL cholesterol (total minus HDL minus LDL cholesterol, especially in the nonfasting state), the atherogenic index of plasma and/or apolipoprotein B can help in decisions regarding the aggressiveness of treatment.

  7. Quantifying uncertainties in wind energy assessment

    NASA Astrophysics Data System (ADS)

    Patlakas, Platon; Galanis, George; Kallos, George

    2015-04-01

    The constant rise of wind energy production and its subsequent penetration into global energy markets during recent decades has led to the selection of new sites that present various types of problems. Such problems arise from the variability and uncertainty of wind speed. Studying the lower and upper tails of the wind speed distribution can support the quantification of these uncertainties. Approaches focused on extreme wind conditions, or on periods below the energy production threshold, are necessary for better management of operations. Towards this direction, different methodologies are presented for the credible evaluation of potentially non-frequent/extreme values of these environmental conditions. The approaches used take into consideration the structural design of the wind turbines according to their lifespan, turbine failures, the time needed for repairs, and the energy production distribution. In this work, a multi-parametric approach for studying extreme wind speed values is discussed based on tools of Extreme Value Theory. In particular, the study focuses on extreme wind speed return periods and the persistence of no energy production, based on a 10-year hindcast dataset from a weather modeling system. More specifically, two methods (Annual Maxima and Peaks Over Threshold) were used for the estimation of extreme wind speeds and their recurrence intervals. Additionally, two different methodologies (intensity given duration and duration given intensity, both based on the Annual Maxima method) were applied to calculate the duration of extreme events, combined with their intensity as well as the event frequency. The results show that the proposed approaches converge, at least on the main findings, for each case. It is also remarkable that, despite the moderate wind speed climate of the area, several consecutive days of no energy production are observed.
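The Annual Maxima method mentioned above can be sketched with standard tools. The snippet below fits a GEV distribution to block maxima and derives a return level; the synthetic wind speeds, the 30-year sample length, and the 50-year return period are illustrative assumptions, not values from the study.

```python
# Sketch of the Annual Maxima approach using scipy's generalized
# extreme value (GEV) distribution. The data are synthetic.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)
# Hypothetical block maxima: one annual maximum wind speed (m/s) per year.
annual_maxima = 18.0 + 4.0 * rng.gumbel(size=30)

# Fit the GEV distribution to the block maxima.
shape, loc, scale = genextreme.fit(annual_maxima)

# Return level for a T-year return period: the quantile exceeded
# on average once every T years.
T = 50
return_level = genextreme.ppf(1.0 - 1.0 / T, shape, loc=loc, scale=scale)
print(f"50-year return level: {return_level:.1f} m/s")
```

The Peaks Over Threshold variant would instead fit a generalized Pareto distribution to exceedances over a high threshold.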

  8. Minimizing metastatic risk in radiotherapy fractionation schedules

    NASA Astrophysics Data System (ADS)

    Badri, Hamidreza; Ramakrishnan, Jagdish; Leder, Kevin

    2015-11-01

    Metastasis is the process by which cells from a primary tumor disperse and form new tumors at distant anatomical locations. The treatment and prevention of metastatic cancer remains an extremely challenging problem. This work introduces a novel biologically motivated objective function to the radiation optimization community that takes into account metastatic risk instead of the status of the primary tumor. In this work, we consider the problem of developing fractionated irradiation schedules that minimize production of metastatic cancer cells while keeping normal tissue damage below an acceptable level. A dynamic programming framework is utilized to determine the optimal fractionation scheme. We evaluated our approach on a breast cancer case using the heart and the lung as organs-at-risk (OAR). For small tumor α/β values, hypo-fractionated schedules were optimal, which is consistent with standard models. However, for relatively larger α/β values, we found the type of schedule depended on various parameters such as the time when metastatic risk was evaluated, the α/β values of the OARs, and the normal tissue sparing factors. Interestingly, in contrast to standard models, hypo-fractionated and semi-hypo-fractionated schedules (large initial doses with doses tapering off with time) were suggested even with large tumor α/β values. Numerical results indicate the potential for significant reduction in metastatic risk.
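The tumor and OAR α/β values discussed above come from the standard linear-quadratic model. As a hedged illustration (not the paper's dynamic-programming objective), the biologically effective dose BED = n·d·(1 + d/(α/β)) shows why fractionation matters for low-α/β tissues; the schedules and the α/β = 3 Gy value below are arbitrary examples.

```python
# Biologically effective dose (BED) under the standard linear-quadratic
# model: BED = n * d * (1 + d / (alpha/beta)). Textbook illustration only.
def bed(n_fractions: int, dose_per_fraction: float, alpha_beta: float) -> float:
    n, d = n_fractions, dose_per_fraction
    return n * d * (1.0 + d / alpha_beta)

# Two schedules delivering the same physical dose (40 Gy total):
conventional = bed(20, 2.0, alpha_beta=3.0)      # 20 fractions of 2 Gy
hypofractionated = bed(5, 8.0, alpha_beta=3.0)   # 5 fractions of 8 Gy

# For a low alpha/beta, the hypofractionated schedule has a much larger
# biological effect despite the identical physical dose.
print(conventional, hypofractionated)
```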

  9. Linking hydraulic properties of fire-affected soils to infiltration and water repellency

    USGS Publications Warehouse

    Moody, John A.; David Kinner,; Xavier Úbeda,

    2009-01-01

    Heat from wildfires can produce a two-layer system composed of extremely dry soil covered by a layer of ash, which, when subjected to rainfall, may produce extreme floods. To understand the soil physics controlling runoff under these initial conditions, we used a small, portable disk infiltrometer to measure two hydraulic properties: (1) near-saturated hydraulic conductivity, Kf, and (2) sorptivity, S(θi), as a function of initial soil moisture content, θi, ranging from extremely dry conditions (θi < 0.02 cm3 cm−3) to near saturation. In the field and in the laboratory, replicate measurements were made of ash, reference soils, soils unaffected by fire, and fire-affected soils. Each has a different degree of water repellency that influences Kf and S(θi). Values of Kf ranged from 4.5 × 10−3 to 53 × 10−3 cm s−1 for ash; from 0.93 × 10−3 to 130 × 10−3 cm s−1 for reference soils; and from 0.86 × 10−3 to 3.0 × 10−3 cm s−1 for soil unaffected by fire, which had the lowest values of Kf. Measurements indicated that S(θi) could be represented by an empirical non-linear function of θi with a sorptivity maximum of 0.18–0.20 cm s−0.5 between θi values of 0.03 and 0.08 cm3 cm−3. This functional form differs from the monotonically decreasing non-linear functions often used to represent S(θi) in rainfall–runoff modeling. The sorptivity maximum may represent the combined effects of gravity, capillarity, and adsorption in a transitional domain corresponding to extremely dry soil; moreover, it may explain the observed non-linear behavior and the critical soil-moisture threshold of water-repellent soils. The laboratory measurements of Kf and S(θi) are the first for ash and fire-affected soil, but additional measurements of these hydraulic properties are needed for in situ fire-affected soils. They provide insight into water repellency behavior and infiltration under extremely dry conditions. Most importantly, they indicate how existing rainfall–runoff models can be modified to accommodate a possible two-layer system in extremely dry conditions. These modified models can be used to predict floods from burned watersheds under these initial conditions.
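The measured sorptivity S and near-saturated conductivity Kf enter classic infiltration models. As a rough sketch (not the authors' model), Philip's two-term equation I(t) = S·√t + A·Kf·t illustrates how the two properties combine; the specific S, Kf, and A values below are illustrative assumptions drawn loosely from the ranges reported above.

```python
import numpy as np

# Philip's two-term infiltration equation: I(t) = S * sqrt(t) + A * K * t.
# S (cm s^-0.5) and K (cm/s) are chosen within the ranges quoted in the
# abstract; the coefficient A ~ 0.36 is a common textbook choice.
def cumulative_infiltration(t, S, K, A=0.36):
    t = np.asarray(t, dtype=float)
    return S * np.sqrt(t) + A * K * t

S = 0.18       # near the reported sorptivity maximum, cm s^-0.5
K = 3.0e-3     # within the reported Kf range for fire-affected soil, cm/s
t = np.linspace(0.0, 600.0, 7)   # ten minutes of rainfall, in seconds
I = cumulative_infiltration(t, S, K)
print(I)       # cumulative infiltration depth (cm) at each time
```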

  10. High-resolution stochastic generation of extreme rainfall intensity for urban drainage modelling applications

    NASA Astrophysics Data System (ADS)

    Peleg, Nadav; Blumensaat, Frank; Molnar, Peter; Fatichi, Simone; Burlando, Paolo

    2016-04-01

    Urban drainage response is highly dependent on the spatial and temporal structure of rainfall. Therefore, measuring and simulating rainfall at high spatial and temporal resolution is a fundamental step in fully assessing urban drainage system reliability and the related uncertainties. This is even more relevant when considering extreme rainfall events. However, current space-time rainfall models have limitations in capturing extreme rainfall intensity statistics for short durations. Here, we use the STREAP (Space-Time Realizations of Areal Precipitation) model, a novel stochastic rainfall generator for simulating high-resolution rainfall fields that preserve the spatio-temporal structure of rainfall and its statistical characteristics. The model enables the generation of rain fields at 100 m and 1 min scales in a fast and computationally efficient way, matching the requirements for hydrological analysis of urban drainage systems. The STREAP model was applied successfully in the past to generate high-resolution extreme rainfall intensities over a small domain. A sub-catchment in the city of Luzern (Switzerland) was chosen as a case study to: (i) evaluate the ability of STREAP to disaggregate extreme rainfall intensities for urban drainage applications; (ii) assess the role of stochastic climate variability of rainfall in the flow response; and (iii) evaluate the degree of non-linearity between extreme rainfall intensity and system response (i.e. flow) for a small urban catchment. The channel flow at the catchment outlet is simulated by means of a calibrated hydrodynamic sewer model.

  11. Steady inviscid transonic flows over planar airfoils: A search for a simplified procedure

    NASA Technical Reports Server (NTRS)

    Magnus, R.; Yoshihara, H.

    1973-01-01

    A finite difference procedure based upon a system of unsteady equations in proper conservation form, with either exact or small-disturbance steady terms, is used to calculate the steady flows over several classes of airfoils. The airfoil condition is fulfilled on a slab whose upstream extremity is a semicircle overlaying the airfoil leading-edge circle. The limitations of the small-disturbance equations are demonstrated in an extreme example of a blunt-nosed, aft-cambered airfoil. The necessity of using the equations in proper conservation form to capture the shock properly is stressed. The ability of steady relaxation procedures to capture the shock is briefly examined.

  12. Pituitary, gonadal and adrenal hormones after prolonged residence at extreme altitude in man.

    PubMed

    Basu, M; Pal, K; Prasad, R; Malhotra, A S; Rao, K S; Sawhney, R C

    1997-06-01

    High altitude-induced alterations in pituitary, gonadal and adrenal hormones were studied in (i) eugonadal men from the armed forces who were resident at sea level (SL), (ii) SL residents staying at an altitude of 3542 m for periods ranging from 3 to 12 months (acclimatized lowlanders, ALL), (iii) ALL who stayed at 6300 m for 6 months, (iv) ALL who trekked from 3542 to 5080 m and stayed at an altitude of more than 6300 m in the glacier region for 6 months, and (v) high-altitude natives (HAN) resident at an altitude of 3300-3700 m. Circulating levels of LH, FSH, prolactin, cortisol, testosterone, dihydrotestosterone (DHT) and progesterone in ALL at 3542 m and in HAN were not significantly different (p > 0.05) from the SL control values. When the ALL living at 3542 m trekked to an extreme altitude of 5080 m, their testosterone levels showed a significant decrease (p < 0.01) compared to the preceding altitude values but had returned to SL values when measured after 6 months' continuous stay at 6300 m. As with testosterone, the levels of DHT and oestradiol-17 beta (E2) after prolonged stay at extreme altitude were also not significantly different (p > 0.05) from the SL values. The LH levels after trekking to 5080 m were significantly higher (p < 0.01) than at an altitude of 3542 m, but decreased to levels found at 3542 m or SL after prolonged residence at extreme altitude. Plasma levels of ACTH, prolactin, FSH and cortisol on arrival at 5080 m, and after a 6-month stay at extreme altitude, were not significantly different (p > 0.05) from the SL values. Plasma progesterone levels tended to increase on arrival at 5080 m but a significant increase (p < 0.001) was evident only after a 6-month stay at extreme altitude. These observations suggest that prolonged residence at lower as well as at extreme altitude does not appreciably alter blood levels of pituitary, gonadal or adrenal hormones except for plasma levels of progesterone. 
The exact mechanism and significance of this increase remains unknown, but may be important in increasing the sensitivity of the hypoxic ventilatory response and activation of haemoglobin synthesis.

  13. Population-based screening for anemia using first-time blood donors

    PubMed Central

    Mast, Alan E.; Steele, Whitney R.; Johnson, Bryce; Wright, David J.; Cable, Ritchard G.; Carey, Patricia; Gottschall, Jerome L.; Kiss, Joseph E.; Simon, Toby L.; Murphy, Edward L.

    2012-01-01

    Background Anemia is an important public health concern. Data from population-based surveys such as the National Health and Nutrition Examination Survey (NHANES) are the gold standard, but are obtained infrequently and include only small samples from certain minority groups. Objectives We assessed whether readily available databases of blood donor hemoglobin values could be used as a surrogate for population hemoglobin values from NHANES. Design Blood donor venous and fingerstick hemoglobin values were compared to 10,254 NHANES 2005-2008 venous hemoglobin values using demographically stratified analyses and ANOVA. Fingerstick hemoglobins or hematocrits were converted to venous hemoglobin estimates using regression analysis. Results Venous hemoglobin values from 1,609 first time donors correlated extremely well with NHANES data across different age, gender and demographic groups. Cigarette smoking increased hemoglobin by 0.26 to 0.59 g/dL depending on intensity. Converted fingerstick hemoglobin from 36,793 first time donors agreed well with NHANES hemoglobin (weighted mean hemoglobin of 15.53 g/dL for donors and 15.73 g/dL for NHANES) with similar variation in mean hemoglobin by age. However, compared to NHANES, the larger donor dataset showed reduced differences in mean hemoglobin between Blacks and other races/ethnicities. Conclusions Overall, first-time donor fingerstick hemoglobins approximate U.S. population data and represent a readily available public health resource for ongoing anemia surveillance. PMID:22460662
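The conversion of fingerstick readings to venous hemoglobin estimates described above rests on a regression fit. A minimal least-squares sketch on synthetic data might look as follows; the bias and noise levels are illustrative assumptions, not the study's fitted coefficients.

```python
import numpy as np

# Hypothetical calibration of fingerstick hemoglobin against venous
# hemoglobin via ordinary least squares. All numbers are synthetic.
rng = np.random.default_rng(3)
venous = rng.normal(15.5, 1.2, size=1000)                 # venous Hb, g/dL
fingerstick = venous + rng.normal(0.3, 0.5, size=1000)    # biased, noisier proxy

# Fit venous ~ slope * fingerstick + intercept.
slope, intercept = np.polyfit(fingerstick, venous, deg=1)
venous_estimate = slope * fingerstick + intercept
print(slope, intercept)
```

Note the fitted slope is below 1 because measurement noise in the fingerstick values attenuates the regression coefficient, a standard errors-in-variables effect.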

  14. Strain energy release rate analysis of the end-notched flexure specimen using the finite-element method

    NASA Technical Reports Server (NTRS)

    Salpekar, S. A.; Raju, I. S.; O'Brien, T. K.

    1988-01-01

    Two-dimensional finite-element analysis of the end-notched flexure specimen was performed using 8-node isoparametric, parabolic elements to evaluate compliance and mode II strain energy release rates, G sub II. The G sub II values were computed using two different techniques: the virtual crack-closure technique (VCCT) and the rate of change of compliance with crack length (compliance derivative method). The analysis was performed for various crack-length-to-semi-span (a/L) ratios ranging from 0.2 to 0.9. Three material systems representing a wide range of material properties were analyzed. The compliance and strain energy release rates of the specimen calculated with the present finite-element analysis agree very well with beam theory equations including transverse shear. The G sub II values calculated using the compliance derivative method compared extremely well with those calculated using the VCCT. The G sub II values obtained by the compliance derivative method using the top or bottom beam deflections agreed closely with each other. The strain energy release rates from a plane-stress analysis were higher than the plane-strain values by only a small percentage, indicating that either assumption may be used in the analysis. The G sub II values for one material system calculated from the finite-element analysis agreed with one solution in the literature and disagreed with the other solution in the literature.
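The compliance derivative method referenced above follows from the standard relation G = (P²/2b)·dC/da between strain energy release rate G, load P, specimen width b, and the derivative of compliance C with respect to crack length a. A numerical sketch, with a hypothetical compliance curve standing in for the finite-element results:

```python
import numpy as np

# Compliance derivative method: G = (P**2 / (2*b)) * dC/da.
# The compliance curve C(a) below is a synthetic polynomial stand-in,
# not data from the paper's finite-element model.
def energy_release_rate(a, C, P, b):
    dC_da = np.gradient(C, a)            # numerical dC/da
    return (P**2 / (2.0 * b)) * dC_da

a = np.linspace(0.2, 0.9, 8)             # crack lengths (arbitrary units)
C = 1.0e-3 * (1.0 + a**3)                # hypothetical compliance, m/N
G = energy_release_rate(a, C, P=100.0, b=0.025)
print(G)                                 # strain energy release rate vs a
```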

  15. Strain-energy-release rate analysis of the end-notched flexure specimen using the finite-element method

    NASA Technical Reports Server (NTRS)

    Salpekar, S. A.; Raju, I. S.; Obrien, T. K.

    1987-01-01

    Two-dimensional finite-element analysis of the end-notched flexure specimen was performed using 8-node isoparametric, parabolic elements to evaluate compliance and mode II strain energy release rates, G sub II. The G sub II values were computed using two different techniques: the virtual crack-closure technique (VCCT) and the rate of change of compliance with crack length (compliance derivative method). The analysis was performed for various crack-length-to-semi-span (a/L) ratios ranging from 0.2 to 0.9. Three material systems representing a wide range of material properties were analyzed. The compliance and strain energy release rates of the specimen calculated with the present finite-element analysis agree very well with beam theory equations including transverse shear. The G sub II values calculated using the compliance derivative method compared extremely well with those calculated using the VCCT. The G sub II values obtained by the compliance derivative method using the top or bottom beam deflections agreed closely with each other. The strain energy release rates from a plane-stress analysis were higher than the plane-strain values by only a small percentage, indicating that either assumption may be used in the analysis. The G sub II values for one material system calculated from the finite-element analysis agreed with one solution in the literature and disagreed with the other solution in the literature.

  16. Feather roughness reduces flow separation during low Reynolds number glides of swifts.

    PubMed

    van Bokhorst, Evelien; de Kat, Roeland; Elsinga, Gerrit E; Lentink, David

    2015-10-01

    Swifts are aerodynamically sophisticated birds with a small arm and large hand wing that provides them with exquisite control over their glide performance. However, their hand wings have a seemingly unsophisticated surface roughness that is poised to disturb flow. This roughness of about 2% chord length is formed by the valleys and ridges of overlapping primary feathers with thick protruding rachides, which make the wing stiffer. An earlier flow study of laminar-turbulent boundary layer transition over prepared swift wings suggested that swifts can attain laminar flow at a low angle of attack. In contrast, aerodynamic design theory suggests that airfoils must be extremely smooth to attain such laminar flow. In hummingbirds, which have similarly rough wings, flow measurements on a 3D printed model suggest that the flow separates at the leading edge and becomes turbulent well above the rachis bumps in a detached shear layer. The aerodynamic function of wing roughness in small birds is, therefore, not fully understood. Here, we performed particle image velocimetry and force measurements to compare smooth versus rough 3D-printed models of the swift hand wing. The high-resolution boundary layer measurements show that the flow over rough wings is indeed laminar at a low angle of attack and a low Reynolds number, but becomes turbulent at higher values. In contrast, the boundary layer over the smooth wing forms open laminar separation bubbles that extend beyond the trailing edge. The boundary layer dynamics of the smooth surface varies non-linearly as a function of angle of attack and Reynolds number, whereas the rough surface boasts more consistent turbulent boundary layer dynamics. Comparison of the corresponding drag values, lift values and glide ratios suggests, however, that glide performance is equivalent. 
The increased structural performance, boundary layer robustness and equivalent aerodynamic performance of rough wings might have provided small (proto) birds with an evolutionary window to high glide performance. © 2015. Published by The Company of Biologists Ltd.

  17. Lattice Boltzmann simulation of immiscible displacement in the cavity with different channel configurations

    NASA Astrophysics Data System (ADS)

    Lou, Qin; Zang, Chenqiang; Yang, Mo; Xu, Hongtao

    In this work, the immiscible displacement in a cavity with different channel configurations is studied using an improved pseudo-potential lattice Boltzmann equation (LBE) model. This model overcomes a drawback of the original pseudo-potential LBE model, in which the fluid properties depend on the grid size. The approach is first validated against the Laplace law. Then, it is employed to study the immiscible displacement process. The influences of different factors, such as the surface wettability, the distance between the gas cavity and the liquid cavity, and the surface roughness of the channel, are investigated. Numerical results show that the displacement efficiency increases and the displacement time decreases as the surface contact angle increases. On the other hand, the displacement efficiency at first increases with increasing distance between the gas cavity and the liquid cavity and finally reaches a constant value. As for the surface roughness, two structures (a semicircular cavity and a semicircular bulge) are studied. The comprehensive results show that although the displacement processes for both structures depend on the surface wettability, they present quite different behaviors. Specifically, for the roughness structure constituted by the semicircular cavity, the displacement efficiency decreases and the displacement time increases markedly with the size of the semicircular cavity for small contact angles. This trend weakens as the contact angle increases, and once the contact angle exceeds a certain value, the size of the semicircular cavity has almost no influence on the displacement process. For the roughness structure of a semicircular bulge, the displacement efficiency first increases with the size of the bulge and then decreases for small contact angles, while for large contact angles it increases at first and finally reaches a constant. The results also show that the displacement time has an extreme value in these cases for small contact angles.

  18. Dust in the Small Magellanic Cloud

    NASA Technical Reports Server (NTRS)

    Magalhaes, A. M.

    1993-01-01

    Observations of reddened stars in the Small Magellanic Cloud (SMC) indicate that the interstellar grains in that galaxy may show optical properties distinct from those in the Galaxy. In a careful study of three SMC objects, Prevot showed that the UV extinction law in the SMC is almost linear with inverse wavelength and the 2200A feature is generally absent. The first results of a program to determine the wavelength dependence of the interstellar optical polarization in the SMC indicate that highly polarized objects are scarce. Our study has uncovered, however, several objects with optical polarization greater than around 1 percent: AZV 126, AZV 211, AZV 221, AZV 398, and AZV 456. The latter two have already had their UV extinction laws determined. Our aim was to obtain International Ultraviolet Explorer (IUE) data and determine the UV extinction law also for AZV 126, AZV 211, and AZV 221. AZV 456, which presents a 'galactic' extinction law, has a 'normal' value for its wavelength of maximum polarization, λmax, while AZV 398, which shows a 'typical' SMC extinction curve, shows a somewhat smaller value for that wavelength. AZV 126, AZV 211, and AZV 221 all present extremely small values of λmax but had not yet had their extinction curves in the UV determined. We therefore aimed at ultimately determining the extinction law in the direction of these three objects. Such results, in combination with the optical polarization data, have an important bearing on constraining the composition and size distribution of the interstellar dust in the SMC. In the last report, the images gathered with IUE, their processing, and the extinction curves derived from them were described. Such extinction curves and the theoretical models developed to interpret the SMC extinction and polarization data are discussed. Details are presented in an enclosed preprint. 
The activities in our ongoing polarimetric program of determining the magnetic field structure of the SMC and the images collected at Cerro Tololo Interamerican Observatory during the period are also briefly described. Other activities are also described.

  19. Extreme values and fat tails of multifractal fluctuations

    NASA Astrophysics Data System (ADS)

    Muzy, J. F.; Bacry, E.; Kozhemyak, A.

    2006-06-01

    In this paper we discuss the problem of estimating the occurrence probability of extreme events for data drawn from a multifractal process. We also study the heavy (power-law) tail behavior of the probability density function associated with such data. We show that, because of strong correlations, the standard extreme value approach is not valid and classical tail exponent estimators should be interpreted cautiously. Extreme statistics associated with multifractal random processes turn out to be characterized by non-self-averaging properties. Our considerations rely upon an analogy between random multiplicative cascades and the physics of disordered systems, and also on recent mathematical results about the so-called multifractal formalism. Applied to financial time series, our findings allow us to propose a unified framework that accounts for the observed multiscaling properties of return fluctuations, the volatility clustering phenomenon and the observed “inverse cubic law” of the return pdf tails.
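One of the classical tail exponent estimators that, per the abstract, must be interpreted cautiously under strong correlations is the Hill estimator. A minimal sketch on an i.i.d. synthetic Pareto sample (the well-behaved case, unlike the correlated multifractal data discussed above):

```python
import numpy as np

# Classical Hill estimator for the tail exponent of a power-law sample.
# The Pareto sample below is synthetic and i.i.d.; for correlated
# multifractal data this estimator can be badly behaved.
def hill_estimator(x, k):
    """Tail exponent estimate from the k largest order statistics."""
    x = np.sort(np.asarray(x, dtype=float))[::-1]    # descending order
    logs = np.log(x[:k]) - np.log(x[k])
    return 1.0 / np.mean(logs)

rng = np.random.default_rng(1)
alpha_true = 3.0                       # the "inverse cubic law" exponent
sample = rng.pareto(alpha_true, size=20000) + 1.0   # Pareto with x_min = 1
alpha_hat = hill_estimator(sample, k=500)
print(f"estimated tail exponent: {alpha_hat:.2f}")
```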

  20. Probability distribution of extreme share returns in Malaysia

    NASA Astrophysics Data System (ADS)

    Zin, Wan Zawiah Wan; Safari, Muhammad Aslam Mohd; Jaaman, Saiful Hafizah; Yie, Wendy Ling Shin

    2014-09-01

    The objective of this study is to investigate suitable probability distributions for modelling extreme share returns in Malaysia. To achieve this, weekly and monthly maxima of daily share returns are derived from share price data obtained from Bursa Malaysia over the period 2000 to 2012. The study starts with summary statistics of the data, which provide a clue to the likely candidates for the best-fitting distribution. Next, the suitability of six extreme value distributions, namely the Gumbel, Generalized Extreme Value (GEV), Generalized Logistic (GLO), Generalized Pareto (GPA), Lognormal (GNO) and Pearson type III (PE3) distributions, is evaluated. The method of L-moments is used for parameter estimation. Based on several goodness-of-fit tests and the L-moment diagram test, the Generalized Pareto distribution and the Pearson distribution are found to be the best-fitting distributions to represent, respectively, the weekly and monthly maximum share returns in the Malaysian stock market during the studied period.
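The method of L-moments used above for parameter estimation starts from sample L-moments. A minimal sketch computing the first two sample L-moments and the L-CV ratio used in L-moment diagrams, checked against a synthetic Uniform(0, 1) sample for which l1 = 1/2 and l2 = 1/6:

```python
import numpy as np

# Unbiased sample L-moments: l1 is the L-location (the mean), l2 the
# L-scale; t = l2 / l1 is the L-CV used in L-moment ratio diagrams.
def sample_l_moments(x):
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    b0 = x.mean()
    # b1 = (1/n) * sum over i of ((i-1)/(n-1)) * x_(i), x sorted ascending
    b1 = np.sum((np.arange(1, n + 1) - 1) / (n - 1) * x) / n
    l1 = b0
    l2 = 2.0 * b1 - b0
    return l1, l2

rng = np.random.default_rng(2)
l1, l2 = sample_l_moments(rng.uniform(0.0, 1.0, size=100000))
print(l1, l2, l2 / l1)   # should approach 1/2, 1/6, 1/3 for Uniform(0, 1)
```

Distribution parameters are then obtained by matching these sample L-moments to the distribution's theoretical L-moments.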

  1. [The informative value of the functional step test for the purpose of computed optical topography in the children presenting with the functional disorders of the musculoskeletal system].

    PubMed

    Trukhmanov, I M; Suslova, G A; Ponomarenko, G N

    This paper characterizes the informative value of the functional step test, performed with heel cushions, for the differential diagnosis of anatomic versus functional differences in the length of the lower extremities in children. A total of 85 schoolchildren with differing lengths of the lower extremities were examined. A comparative evaluation of the results of clinical and instrumental examinations was undertaken. The data obtained with the help of the functional step test give evidence of its very high sensitivity, specificity, and clinical significance as a tool for the examination of children with differing lengths of the lower extremities. It is concluded that the test is one of the most informative predictors of the effectiveness of rehabilitation in such children.

  2. Extreme value laws for fractal intensity functions in dynamical systems: Minkowski analysis

    NASA Astrophysics Data System (ADS)

    Mantica, Giorgio; Perotti, Luca

    2016-09-01

    Typically, in the dynamical theory of extremal events, the function that gauges the intensity of a phenomenon is assumed to be convex and maximal, or singular, at a single, or at most a finite collection of points in phase-space. In this paper we generalize this situation to fractal landscapes, i.e. intensity functions characterized by an uncountable set of singularities, located on a Cantor set. This reveals the dynamical rôle of classical quantities like the Minkowski dimension and content, whose definition we extend to account for singular continuous invariant measures. We also introduce the concept of extremely rare event, quantified by non-standard Minkowski constants and we study its consequences to extreme value statistics. Limit laws are derived from formal calculations and are verified by numerical experiments. Dedicated to the memory of Joseph Ford, on the twentieth anniversary of his departure.

  3. A dependence modelling study of extreme rainfall in Madeira Island

    NASA Astrophysics Data System (ADS)

    Gouveia-Reis, Délia; Guerreiro Lopes, Luiz; Mendonça, Sandra

    2016-08-01

    The dependence between variables plays a central role in multivariate extremes. In this paper, the spatial dependence of Madeira Island's rainfall data is addressed within an extreme value copula approach through an analysis of annual maximum data. The impacts of altitude, slope orientation, distance between rain gauge stations, and distance from the stations to the sea are investigated for two different periods of time. The results obtained highlight the influence of the island's complex topography on the spatial distribution of extreme rainfall in Madeira Island.

  4. Log D versus HPLC derived hydrophobicity: The development of predictive tools to aid in the rational design of bioactive peptoids

    DOE PAGES

    Bolt, H. L.; Williams, C. E. J.; Brooks, R. V.; ...

    2017-01-13

    Hydrophobicity has proven to be an extremely useful parameter in small molecule drug discovery programmes given that it can be used as a predictive tool to enable rational design. For larger molecules, including peptoids, where folding is possible, the situation is more complicated and the average hydrophobicity (as determined by RP-HPLC retention time) may not always provide an effective predictive tool for rational design. Herein, we report the first ever application of partitioning experiments to determine the log D values for a series of peptoids. By comparing log D and average hydrophobicities we highlight the potential advantage of employing the former as a predictive tool in the rational design of biologically active peptoids.

  5. Eight guidelines for developing a strategy for the '90s.

    PubMed

    Kaufman, N

    1994-03-20

    Regardless of the outcome of federal reform initiatives, health care is undergoing structural change of unprecedented magnitude. Structural change occurs when there is a fundamental, sustainable change in the values and purchasing behavior of buyers. During such times, market leaders are extremely vulnerable to competitive threats due to internal bureaucratic barriers. Witness the U.S. computer and automobile industries. As Robert Lutz, president of Chrysler, points out, "Being large doesn't mean being safe. The large won't eat the small. The swift will eat the slow." During this dynamic period in health care, it is critical that strategy be on target. Periods of structural change are filled with numerous threats as well as opportunities. The following are eight guidelines for developing health care strategy during the structural changes of the '90s.

  6. TURBULENCE-GENERATED PROTON-SCALE STRUCTURES IN THE TERRESTRIAL MAGNETOSHEATH

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vörös, Zoltán; Narita, Yasuhito; Yordanova, Emiliya

    2016-03-01

    Recent results of numerical magnetohydrodynamic simulations suggest that in collisionless space plasmas, turbulence can spontaneously generate thin current sheets. These coherent structures can partially explain the intermittency and the non-homogenous distribution of localized plasma heating in turbulence. In this Letter, Cluster multi-point observations are used to investigate the distribution of magnetic field discontinuities and the associated small-scale current sheets in the terrestrial magnetosheath downstream of a quasi-parallel bow shock. It is shown experimentally, for the first time, that the strongest turbulence-generated current sheets occupy the long tails of probability distribution functions associated with extremal values of magnetic field partial derivatives. During the analyzed one-hour time interval, about a hundred strong discontinuities, possibly proton-scale current sheets, were observed.

  7. Study of mesoscale phenomena, winter monsoon clouds and snow area based on LANDSAT data

    NASA Technical Reports Server (NTRS)

    Tsuchiya, K. (Principal Investigator)

    1976-01-01

    The author has identified the following significant results. Most longitudinal clouds which appear as continuous linear clouds are composed of small transversal clouds. There are mountain waves of different wavelengths in a comparatively narrow area, indicating complicated orographic effects on the wind and temperature distributions, or on both dynamic and static stability conditions. There is a particular shape of cirrus cloud suggestive of turbulence in the vicinity of CAT in the upper troposphere near the jet stream level and on its cold air side. Thin cirrus under overcast conditions can be distinguished by MSS; however, extremely thin cirrus under partly cloudy conditions cannot be detected even in LANDSAT data. This presents a serious problem in the interpretation of satellite thermal infrared radiation data, since such undetected clouds affect the measured values.

  8. The early-type strong emission-line supergiants of the Magellanic Clouds - A spectroscopic zoology

    NASA Technical Reports Server (NTRS)

    Shore, S. N.; Sanduleak, N.

    1984-01-01

    The results of a spectroscopic survey of 21 early-type extreme emission line supergiants of the Large and Small Magellanic Clouds using IUE and optical spectra are presented. The combined observations are discussed and the literature on each star in the sample is summarized. The classification procedures and the methods by which effective temperatures, bolometric magnitudes, and reddenings were assigned are discussed. The derived reddening values are given along with some results concerning anomalous reddening among the sample stars. The derived mass, luminosity, and radius for each star are presented, and the ultraviolet emission lines are described. Mass-loss rates are derived and discussed, and the implications of these observations for the evolution of the most massive stars in the Local Group are addressed.

  9. Accretion rates of protoplanets 2: Gaussian distribution of planetesimal velocities

    NASA Technical Reports Server (NTRS)

    Greenzweig, Yuval; Lissauer, Jack J.

    1991-01-01

    The growth rate of a protoplanet embedded in a uniform surface density disk of planetesimals having a triaxial Gaussian velocity distribution was calculated. The longitudes of the apses and nodes of the planetesimals are uniformly distributed, and the protoplanet is on a circular orbit. The accretion rate in the two body approximation is enhanced by a factor of approximately 3, compared to the case where all planetesimals have eccentricity and inclination equal to the root mean square (RMS) values of those variables in the Gaussian distribution disk. Numerical three body integrations show comparable enhancements, except when the RMS initial planetesimal eccentricities are extremely small. This enhancement in accretion rate should be incorporated by all models, analytical or numerical, which assume a single random velocity for all planetesimals, in lieu of a Gaussian distribution.

  10. Log D versus HPLC derived hydrophobicity: The development of predictive tools to aid in the rational design of bioactive peptoids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bolt, H. L.; Williams, C. E. J.; Brooks, R. V.

    Hydrophobicity has proven to be an extremely useful parameter in small molecule drug discovery programmes given that it can be used as a predictive tool to enable rational design. For larger molecules, including peptoids, where folding is possible, the situation is more complicated and the average hydrophobicity (as determined by RP-HPLC retention time) may not always provide an effective predictive tool for rational design. Herein, we report the first ever application of partitioning experiments to determine the log D values for a series of peptoids. By comparing log D and average hydrophobicities we highlight the potential advantage of employing the former as a predictive tool in the rational design of biologically active peptoids.

  11. Fabrication of sub-12 nm thick silicon nanowires by processing scanning probe lithography masks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kyoung Ryu, Yu; Garcia, Ricardo, E-mail: r.garcia@csic.es; Aitor Postigo, Pablo

    2014-06-02

    Silicon nanowires are key elements to fabricate very sensitive mechanical and electronic devices. We provide a method to fabricate silicon nanowires less than 12 nm thick by combining oxidation scanning probe lithography and anisotropic dry etching. Extremely thin oxide masks (0.3–1.1 nm) are transferred into nanowires of 2–12 nm in thickness. The width ratio between the mask and the silicon nanowire is close to one, which implies that the nanowire width is controlled by the feature size of the nanolithography. This method enables the fabrication of very small single silicon nanowires with cross-sections below 100 nm². Those values are the smallest obtained with a top-down lithography method.

  12. A New Approach to Extreme Value Estimation Applicable to a Wide Variety of Random Variables

    NASA Technical Reports Server (NTRS)

    Holland, Frederic A., Jr.

    1997-01-01

    Designing reliable structures requires an estimate of the maximum and minimum values (i.e., strength and load) that may be encountered in service. Yet designs based on very extreme values (to ensure safety) can result in extra material usage and hence, uneconomic systems. In aerospace applications, severe over-design cannot be tolerated, making it almost mandatory to design closer to the assumed limits of the design random variables. The issue then is predicting extreme values that are practical, i.e., neither too conservative nor too unconservative. Obtaining design values by employing safety factors is well known to often result in overly conservative designs. Safety factor values have historically been selected rather arbitrarily, often lacking a sound rational basis. The question of how safe a design needs to be has led design theorists to probabilistic and statistical methods. The so-called three-sigma approach is one such method and has been described as the first step in utilizing information about the data dispersion. However, this method is based on the assumption that the random variable is dispersed symmetrically about the mean and is essentially limited to normally distributed random variables. Use of this method can therefore result in unsafe or overly conservative design allowables if the common assumption of normality is incorrect.
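    The limitation of the three-sigma approach can be seen in a short simulation. The sketch below is our illustration, not from the report; all numbers are hypothetical. It compares the probability of falling below the mu - 3*sigma allowable under a normal distribution with that under a skewed lognormal matched in mean and standard deviation:

```python
import numpy as np

# Illustrative sketch: the three-sigma allowable assumes normality.
# With a skewed distribution of the same mean and standard deviation,
# the tail probability below mu - 3*sigma can differ markedly.
rng = np.random.default_rng(0)
n = 2_000_000

mu, sigma = 100.0, 10.0          # hypothetical strength data, arbitrary units
normal = rng.normal(mu, sigma, n)

# Lognormal with matched mean and standard deviation.
cv = sigma / mu
s2 = np.log(1 + cv**2)           # lognormal log-scale variance
m = np.log(mu) - s2 / 2          # lognormal log-scale mean
lognorm = rng.lognormal(m, np.sqrt(s2), n)

allowable = mu - 3 * sigma
p_norm = np.mean(normal < allowable)    # ~0.00135 by construction
p_logn = np.mean(lognorm < allowable)   # differs once normality fails

print(p_norm, p_logn)
```

    Here the skewed distribution has a much thinner lower tail than the normal assumption implies, so the three-sigma allowable is overly conservative; a distribution skewed the other way would make it unsafe.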

  13. Small-scale dynamo at low magnetic Prandtl numbers

    NASA Astrophysics Data System (ADS)

    Schober, Jennifer; Schleicher, Dominik; Bovino, Stefano; Klessen, Ralf S.

    2012-12-01

    The present-day Universe is highly magnetized, even though the first magnetic seed fields were most probably extremely weak. To explain the growth of the magnetic field strength over many orders of magnitude, fast amplification processes need to operate. The most efficient mechanism known today is the small-scale dynamo, which converts turbulent kinetic energy into magnetic energy leading to an exponential growth of the magnetic field. The efficiency of the dynamo depends on the type of turbulence indicated by the slope of the turbulence spectrum v(ℓ)∝ℓ^ϑ, where v(ℓ) is the eddy velocity at a scale ℓ. We explore turbulent spectra ranging from incompressible Kolmogorov turbulence with ϑ=1/3 to highly compressible Burgers turbulence with ϑ=1/2. In this work, we analyze the properties of the small-scale dynamo for low magnetic Prandtl numbers Pm, which denotes the ratio of the magnetic Reynolds number, Rm, to the hydrodynamical one, Re. We solve the Kazantsev equation, which describes the evolution of the small-scale magnetic field, using the WKB approximation. In the limit of low magnetic Prandtl numbers, the growth rate is proportional to Rm^((1-ϑ)/(1+ϑ)). We furthermore discuss the critical magnetic Reynolds number Rm_crit, which is required for small-scale dynamo action. The value of Rm_crit is roughly 100 for Kolmogorov turbulence and 2700 for Burgers. Furthermore, we discuss that Rm_crit provides a stronger constraint in the limit of low Pm than it does for large Pm. We conclude that the small-scale dynamo can operate in the regime of low magnetic Prandtl numbers if the magnetic Reynolds number is large enough. Thus, the magnetic field amplification on small scales can take place in a broad range of physical environments and amplify weak magnetic seed fields on short time scales.
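    The quoted low-Pm scaling can be checked directly. A minimal sketch (ours, not from the paper) evaluates the growth-rate exponent (1-ϑ)/(1+ϑ) for the two limiting turbulence spectra:

```python
# Growth-rate scaling of the small-scale dynamo in the low-Pm limit,
# gamma ∝ Rm^((1 - theta)/(1 + theta)), evaluated for the two limiting
# turbulence spectra quoted in the abstract.
from fractions import Fraction

def growth_exponent(theta: Fraction) -> Fraction:
    """Exponent of Rm in the low-Pm growth rate."""
    return (1 - theta) / (1 + theta)

kolmogorov = growth_exponent(Fraction(1, 3))  # incompressible, theta = 1/3
burgers = growth_exponent(Fraction(1, 2))     # highly compressible, theta = 1/2

print(kolmogorov, burgers)  # 1/2 and 1/3
```

    The steeper (Burgers) spectrum thus gives a weaker dependence of the growth rate on Rm, consistent with its larger critical magnetic Reynolds number.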

  14. Small-scale dynamo at low magnetic Prandtl numbers.

    PubMed

    Schober, Jennifer; Schleicher, Dominik; Bovino, Stefano; Klessen, Ralf S

    2012-12-01

    The present-day Universe is highly magnetized, even though the first magnetic seed fields were most probably extremely weak. To explain the growth of the magnetic field strength over many orders of magnitude, fast amplification processes need to operate. The most efficient mechanism known today is the small-scale dynamo, which converts turbulent kinetic energy into magnetic energy leading to an exponential growth of the magnetic field. The efficiency of the dynamo depends on the type of turbulence indicated by the slope of the turbulence spectrum v(ℓ)∝ℓ^{ϑ}, where v(ℓ) is the eddy velocity at a scale ℓ. We explore turbulent spectra ranging from incompressible Kolmogorov turbulence with ϑ=1/3 to highly compressible Burgers turbulence with ϑ=1/2. In this work, we analyze the properties of the small-scale dynamo for low magnetic Prandtl numbers Pm, which denotes the ratio of the magnetic Reynolds number, Rm, to the hydrodynamical one, Re. We solve the Kazantsev equation, which describes the evolution of the small-scale magnetic field, using the WKB approximation. In the limit of low magnetic Prandtl numbers, the growth rate is proportional to Rm^{(1-ϑ)/(1+ϑ)}. We furthermore discuss the critical magnetic Reynolds number Rm_{crit}, which is required for small-scale dynamo action. The value of Rm_{crit} is roughly 100 for Kolmogorov turbulence and 2700 for Burgers. Furthermore, we discuss that Rm_{crit} provides a stronger constraint in the limit of low Pm than it does for large Pm. We conclude that the small-scale dynamo can operate in the regime of low magnetic Prandtl numbers if the magnetic Reynolds number is large enough. Thus, the magnetic field amplification on small scales can take place in a broad range of physical environments and amplify weak magnetic seed fields on short time scales.

  15. Impacts of Model Bias on the Climate Change Signal and Effects of Weighted Ensembles of Regional Climate Model Simulations: A Case Study over Southern Québec, Canada

    DOE PAGES

    Eum, Hyung-Il; Gachon, Philippe; Laprise, René

    2016-01-01

    This study examined the impact of model biases on climate change signals for daily precipitation and for minimum and maximum temperatures. Through the use of multiple climate scenarios from 12 regional climate model simulations, the ensemble mean, and three synthetic simulations generated by a weighting procedure, we investigated intermodel seasonal climate change signals between current and future periods, for both median and extreme precipitation/temperature values. A significant dependence of seasonal climate change signals on the model biases over southern Québec in Canada was detected for temperatures, but not for precipitation. This suggests that the regional temperature change signal is affected by local processes. Seasonally, model bias affects future mean and extreme values in winter and summer. In addition, potentially large increases in future extremes of temperature and precipitation values were projected. For three synthetic scenarios, systematically less bias and a narrow range of mean change for all variables were projected compared to those of climate model simulations. In addition, synthetic scenarios were found to better capture the spatial variability of extreme cold temperatures than the ensemble mean scenario. Finally, these results indicate that the synthetic scenarios have greater potential to reduce the uncertainty of future climate projections and capture the spatial variability of extreme climate events.

  16. Impacts of Model Bias on the Climate Change Signal and Effects of Weighted Ensembles of Regional Climate Model Simulations: A Case Study over Southern Québec, Canada

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eum, Hyung-Il; Gachon, Philippe; Laprise, René

    This study examined the impact of model biases on climate change signals for daily precipitation and for minimum and maximum temperatures. Through the use of multiple climate scenarios from 12 regional climate model simulations, the ensemble mean, and three synthetic simulations generated by a weighting procedure, we investigated intermodel seasonal climate change signals between current and future periods, for both median and extreme precipitation/temperature values. A significant dependence of seasonal climate change signals on the model biases over southern Québec in Canada was detected for temperatures, but not for precipitation. This suggests that the regional temperature change signal is affected by local processes. Seasonally, model bias affects future mean and extreme values in winter and summer. In addition, potentially large increases in future extremes of temperature and precipitation values were projected. For three synthetic scenarios, systematically less bias and a narrow range of mean change for all variables were projected compared to those of climate model simulations. In addition, synthetic scenarios were found to better capture the spatial variability of extreme cold temperatures than the ensemble mean scenario. Finally, these results indicate that the synthetic scenarios have greater potential to reduce the uncertainty of future climate projections and capture the spatial variability of extreme climate events.

  17. From ozone mini-holes and mini-highs towards extreme value theory: New insights from extreme events and non-stationarity

    NASA Astrophysics Data System (ADS)

    Rieder, H. E.; Staehelin, J.; Maeder, J. A.; Ribatet, M.; Stübi, R.; Weihs, P.; Holawe, F.; Peter, T.; Davison, A. C.

    2009-04-01

    Over the last few decades negative trends in stratospheric ozone have been studied because of the direct link between decreasing stratospheric ozone and increasing surface UV-radiation. Recently a discussion on ozone recovery has begun. Long-term measurements of total ozone extending back earlier than 1958 are limited and only available from a few stations in the northern hemisphere. The world's longest total ozone record is available from Arosa, Switzerland (Staehelin et al., 1998a,b). At this site total ozone measurements have been made continuously since late 1926. Within this study (Rieder et al., 2009) new tools from extreme value theory (e.g. Coles, 2001; Ribatet, 2007) are applied to select mathematically well-defined thresholds for extreme low and extreme high total ozone. A heavy-tail focused approach is used by fitting the Generalized Pareto Distribution (GPD) to the Arosa time series. Asymptotic arguments (Pickands, 1975) justify the use of the GPD for modeling exceedances over a sufficiently high (or below a sufficiently low) threshold (Coles, 2001). More precisely, the GPD is the limiting distribution of normalized excesses over a threshold, as the threshold approaches the endpoint of the distribution. In practice, GPD parameters are fitted to exceedances by maximum likelihood or other methods, such as probability weighted moments. A preliminary step consists of defining an appropriate threshold for which the asymptotic GPD approximation holds. Suitable tools for threshold selection, such as the MRL-plot (mean residual life plot) and TC-plot (stability plot) from the POT-package (Ribatet, 2007), are presented. The frequency distribution of extremes in low (termed ELOs) and high (termed EHOs) total ozone and their influence on the long-term changes in total ozone are analyzed. Further it is shown that from the GPD-model the distribution of so-called ozone mini holes (e.g. 
Bojkov and Balis, 2001) can be precisely estimated and that the "extremes concept" provides new information on the data distribution and variability within the Arosa record as well as on the influence of ELOs and EHOs on the long-term trends of the ozone time series. References: Bojkov, R. D., and Balis, D.S.: Characteristics of episodes with extremely low ozone values in the northern middle latitudes 1975-2000, Ann. Geophys., 19, 797-807, 2001. Coles, S.: An Introduction to Statistical Modeling of Extreme Values, Springer Series in Statistics, ISBN:1852334592, Springer, Berlin, 2001. Pickands, J.: Statistical inference using extreme order statistics, Ann. Stat., 3, 1, 119-131, 1975. Ribatet, M.: POT: Modelling peaks over a threshold, R News, 7, 34-36, 2007. Rieder, H.E., Staehelin, J., Maeder, J.A., Stübi, R., Weihs, P., Holawe, F., and M. Ribatet: From ozone mini holes and mini highs towards extreme value theory: New insights from extreme events and non stationarity, submitted to J. Geophys. Res., 2009. Staehelin, J., Kegel, R., and Harris, N. R.: Trend analysis of the homogenized total ozone series of Arosa (Switzerland), 1929-1996, J. Geophys. Res., 103(D7), 8389-8400, doi:10.1029/97JD03650, 1998a. Staehelin, J., Renaud, A., Bader, J., McPeters, R., Viatte, P., Hoegger, B., Bugnion, V., Giroud, M., and Schill, H.: Total ozone series at Arosa (Switzerland): Homogenization and data comparison, J. Geophys. Res., 103(D5), 5827-5842, doi:10.1029/97JD02402, 1998b.
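    The peaks-over-threshold machinery described above can be sketched in a few lines. The following illustration is ours: it uses a simple method-of-moments GPD fit on synthetic data, not the POT R package or the Arosa record:

```python
import numpy as np

# Minimal peaks-over-threshold sketch: choose a high quantile as the
# threshold and fit the GPD to the excesses by the method of moments,
# using mean = s/(1-xi) and var = s^2/((1-xi)^2 (1-2*xi)).
def fit_gpd_mom(data, threshold):
    excess = data[data > threshold] - threshold
    m, v = excess.mean(), excess.var(ddof=1)
    xi = 0.5 * (1.0 - m * m / v)   # shape parameter
    sigma = m * (1.0 - xi)         # scale parameter
    return xi, sigma, len(excess)

rng = np.random.default_rng(42)
# Synthetic series with an exponential tail, which corresponds to a
# GPD with shape xi = 0 (hypothetical data, arbitrary units).
sample = rng.exponential(scale=25.0, size=50_000)
u = np.quantile(sample, 0.95)       # threshold at the 95th percentile
xi, sigma, n_exc = fit_gpd_mom(sample, u)
print(xi, sigma, n_exc)             # xi near 0 for exponential data
```

    In practice the threshold would be chosen from diagnostics such as the MRL-plot rather than a fixed quantile, and maximum likelihood would usually be preferred for the fit.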

  18. Food security politics and the Millennium Development Goals.

    PubMed

    McMichael, Philip; Schneider, Mindi

    2011-01-01

    This article reviews proposals regarding the recent food crisis in the context of a broader, threshold debate on the future of agriculture and food security. While the MDGs have focused on eradicating extreme poverty and hunger, the food crisis pushed the hungry over the one billion mark. There is thus a renewed focus on agricultural development, which pivots on the salience of industrial agriculture (as a supply source) in addressing food security. The World Bank's new 'agriculture for development' initiative seeks to improve small-farmer productivity with new inputs, and their incorporation into global markets via value-chains originating in industrial agriculture. An alternative claim, originating in 'food sovereignty' politics, demanding small-farmer rights to develop bio-regionally specific agro-ecological methods and provision for local, rather than global, markets, resonates in the IAASTD report, which implies agribusiness as usual 'is no longer an option'. The basic divide is over whether agriculture is a servant of economic growth, or should be developed as a foundational source of social and ecological sustainability. We review and compare these different paradigmatic approaches to food security, and their political and ecological implications.

  19. Generating Health Estimates by Zip Code: A Semiparametric Small Area Estimation Approach Using the California Health Interview Survey.

    PubMed

    Wang, Yueyan; Ponce, Ninez A; Wang, Pan; Opsomer, Jean D; Yu, Hongjian

    2015-12-01

    We propose a method to meet challenges in generating health estimates for granular geographic areas in which the survey sample size is extremely small. Our generalized linear mixed model predicts health outcomes using both individual-level and neighborhood-level predictors. The model's feature of nonparametric smoothing function on neighborhood-level variables better captures the association between neighborhood environment and the outcome. Using 2011 to 2012 data from the California Health Interview Survey, we demonstrate an empirical application of this method to estimate the fraction of residents without health insurance for Zip Code Tabulation Areas (ZCTAs). Our method generated stable estimates of uninsurance for 1519 of 1765 ZCTAs (86%) in California. For some areas with great socioeconomic diversity across adjacent neighborhoods, such as Los Angeles County, the modeled uninsured estimates revealed much heterogeneity among geographically adjacent ZCTAs. The proposed method can increase the value of health surveys by providing modeled estimates for health data at a granular geographic level. It can account for variations in health outcomes at the neighborhood level as a result of both socioeconomic characteristics and geographic locations.

  20. Impacts of Small Scale Flow Regulation on Sediment Dynamics in an Ecologically Important Upland River

    NASA Astrophysics Data System (ADS)

    Quinlan, E.; Gibbins, C. N.; Batalla, R. J.; Vericat, D.

    2015-03-01

    Flow regulation is widely recognized as affecting fluvial processes and river ecosystems. Most impact assessments have focused on large dams and major water transfer schemes, so relatively little is known about the impacts of smaller dams, weirs and water diversions. This paper assesses sediment dynamics in an upland river (the Ehen, NW England) whose flows are regulated by a small weir and tributary diversion. The river is important ecologically due to the presence of the endangered freshwater pearl mussel Margaritifera margaritifera, a species known to be sensitive to sedimentary conditions. Fine sediment yield for the 300-m long study reach was estimated to be 0.057 t km-2 year-1, a very low value relative to other upland UK rivers. Mean in-channel storage of fine sediment was also low, estimated at an average of around 40 g m-2. Although the study period was characterized by frequent high flow events, little movement of coarser bed material was observed. Data therefore indicate an extremely stable fluvial system within the study reach. The implication of this stability for pearl mussels is discussed.

  1. 400 Years of summer hydroclimate from stable isotopes in Iberian trees

    NASA Astrophysics Data System (ADS)

    Andreu-Hayles, Laia; Ummenhofer, Caroline C.; Barriendos, Mariano; Schleser, Gerhard H.; Helle, Gerhard; Leuenberger, Markus; Gutiérrez, Emilia; Cook, Edward R.

    2017-07-01

    Tree rings are natural archives that annually record distinct types of past climate variability depending on the parameters measured. Here, we use ring-width and stable isotopes in cellulose of trees from the northwestern Iberian Peninsula (IP) to understand regional summer hydroclimate over the last 400 years and the associated atmospheric patterns. Correlations between tree rings and climate data demonstrate that isotope signatures in the targeted Iberian pine forests are very sensitive to water availability during the summer period, and are mainly controlled by stomatal conductance. Non-linear methods based on extreme events analysis allow for capturing distinct seasonal climatic variability recorded by tree-ring parameters and asymmetric signals of the associated atmospheric features. Moreover, years with extreme high (low) values in the tree-ring records were characterised by coherent large-scale atmospheric circulation patterns with reduced (enhanced) moisture transport onto the northwestern IP. These analyses of extremes revealed that high/low proxy values do not necessarily correspond to mirror images in the atmospheric anomaly patterns, suggesting different drivers of these patterns and the corresponding signature recorded in the proxies. Regional hydroclimate features across the broader IP and western Europe during extreme wet/dry summers detected by the northwestern IP trees compare favourably to independent multicentury sea level pressure and drought reconstructions for Europe. Historical records also validate our findings that attribute non-linear moisture signals recorded by extreme tree-ring values to distinct large-scale atmospheric patterns and allow for 400-year reconstructions of the frequency of occurrence of extreme conditions in late spring and summer hydroclimate.

  2. 400 years of summer hydroclimate from stable isotopes in Iberian trees

    NASA Astrophysics Data System (ADS)

    Andreu-Hayles, Laia; Ummenhofer, Caroline C.; Barriendos, Mariano; Schleser, Gerhard H.; Helle, Gerhard; Leuenberger, Markus; Gutierrez, Emilia; Cook, Edward R.

    2017-04-01

    Tree rings are natural archives that annually record distinct types of past climate variability depending on the parameters measured. Here, we use ring-width and stable isotopes in cellulose of trees from the northwestern Iberian Peninsula (IP) to understand regional summer hydroclimate over the last 400 years and the associated atmospheric patterns. Correlations between tree rings and climate data demonstrate that isotope signatures in the targeted Iberian pine forests are very sensitive to water availability during the summer period, and are mainly controlled by stomatal conductance. Non-linear methods based on extreme events analysis allow for capturing distinct seasonal climatic variability recorded by tree-ring parameters and asymmetric signals of the associated atmospheric features. Moreover, years with extreme high (low) values in the tree-ring records were characterised by coherent large-scale atmospheric circulation patterns with reduced (enhanced) moisture transport onto the northwestern IP. These analyses of extremes revealed that high/low proxy values do not necessarily correspond to mirror images in the atmospheric anomaly patterns, suggesting different drivers of these patterns and the corresponding signature recorded in the proxies. Regional hydroclimate features across the broader IP and western Europe during extreme wet/dry summers detected by the northwestern IP trees compare favourably to an independent multicentury sea level pressure and drought reconstruction for Europe. Historical records also validate our findings that attribute non-linear moisture signals recorded by extreme tree-ring values to distinct large-scale atmospheric patterns and allow for 400-yr reconstructions of the frequency of occurrence of extreme conditions in summer hydroclimate. We will discuss how the results for Lillo compare with other records.

  3. Analysis and modeling of extreme temperatures in several cities in northwestern Mexico under climate change conditions

    NASA Astrophysics Data System (ADS)

    García-Cueto, O. Rafael; Cavazos, M. Tereza; de Grau, Pamela; Santillán-Soto, Néstor

    2014-04-01

    The generalized extreme value distribution is applied in this article to model the statistical behavior of the maximum and minimum temperature distribution tails in four cities of Baja California in northwestern Mexico, using data from 1950-2010. The approach used was the block maxima method with annual time blocks. Temporal trends were included as covariates in the location parameter (μ), which resulted in significant improvements to the proposed models, particularly for the extreme maximum temperature values in the cities of Mexicali, Tijuana, and Tecate, and the extreme minimum temperature values in Mexicali and Ensenada. These models were used to estimate future probabilities over the next 100 years (2015-2110) for different time periods, and they were compared with changes in the extreme (P90th and P10th) percentiles of maximum and minimum temperature scenarios for a set of six general circulation models under low (RCP4.5) and high (RCP8.5) radiative forcings. By the end of the twenty-first century, the scenarios of the changes in extreme maximum summer temperature are of the same order in both the statistical model and the high radiative scenario (increases of 4-5 °C). The low radiative scenario is more conservative (increases of 2-3 °C). The winter scenario shows that minimum temperatures could be less severe; the temperature increases suggested by the probabilistic model are greater than those projected for the end of the century by the set of global models under RCP4.5 and RCP8.5 scenarios. The likely impacts on the region are discussed.
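    The long-horizon probabilities behind such projections follow from the fitted GEV parameters through the return-level formula. A minimal sketch (ours, with hypothetical parameter values, not the paper's fitted models):

```python
import math

# Return level for a fitted GEV(mu, sigma, xi):
#   z_T = mu + (sigma/xi) * ((-log(1 - 1/T))**(-xi) - 1)   for xi != 0,
# with the Gumbel limit z_T = mu - sigma*log(-log(1 - 1/T)) as xi -> 0.
def gev_return_level(mu, sigma, xi, T):
    y = -math.log(1.0 - 1.0 / T)
    if abs(xi) < 1e-9:
        return mu - sigma * math.log(y)   # Gumbel limit
    return mu + (sigma / xi) * (y ** (-xi) - 1.0)

# Hypothetical annual-maximum temperature parameters, in deg C.
z50 = gev_return_level(mu=42.0, sigma=1.5, xi=-0.1, T=50)
z100 = gev_return_level(mu=42.0, sigma=1.5, xi=-0.1, T=100)
print(z50, z100)  # the 100-year level exceeds the 50-year level
```

    With a trend covariate in μ, as in the paper, the return level becomes time-dependent: the same formula is evaluated with μ(t) for the year of interest.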

  4. Extreme Mean and Its Applications

    NASA Technical Reports Server (NTRS)

    Swaroop, R.; Brownlow, J. D.

    1979-01-01

    Extreme value statistics obtained from normally distributed data are considered. An extreme mean is defined as the mean of a p-th probability truncated normal distribution. An unbiased estimate of this extreme mean and its large sample distribution are derived. The distribution of this estimate, even for very large samples, is found to be nonnormal. Further, as the sample size increases, the variance of the unbiased estimate converges to the Cramer-Rao lower bound. The computer program used to obtain the density and distribution functions of the standardized unbiased estimate, and the confidence intervals of the extreme mean for any data, is included for ready application. An example is included to demonstrate the usefulness of the extreme mean.
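    For a known normal distribution, the mean of its upper tail has a closed form via the inverse Mills ratio. The sketch below illustrates the concept only; the report's unbiased estimator from sample data is more involved:

```python
import math

# Mean of N(mu, sigma^2) truncated to its upper tail beyond the
# quantile mu + sigma*z:
#   E[X | X > mu + sigma*z] = mu + sigma * phi(z) / (1 - Phi(z)),
# the inverse Mills ratio (illustrative formula, standard library only).
def phi(z):
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def Phi(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def upper_tail_mean(mu, sigma, z):
    """Mean of the normal distribution conditioned on X > mu + sigma*z."""
    return mu + sigma * phi(z) / (1.0 - Phi(z))

# Standard normal truncated at its median: mean sqrt(2/pi) ≈ 0.798.
print(upper_tail_mean(0.0, 1.0, 0.0))
```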

  5. Introducing the refined gravity hypothesis of extreme sexual size dimorphism

    PubMed Central

    2010-01-01

    Background Explanations for the evolution of female-biased, extreme Sexual Size Dimorphism (SSD), which has puzzled researchers since Darwin, are still controversial. Here we propose an extension of the Gravity Hypothesis (i.e., the GH, which postulates a climbing advantage for small males) that in conjunction with the fecundity hypothesis appears to have the most general power to explain the evolution of SSD in spiders so far. In this "Bridging GH" we propose that bridging locomotion (i.e., walking upside-down under own-made silk bridges) may be behind the evolution of extreme SSD. A biomechanical model shows that there is a physical constraint for large spiders to bridge. This should lead to a trade-off between other traits and dispersal in which bridging would favor smaller sizes and other selective forces (e.g. fecundity selection in females) would favor larger sizes. If bridging allows faster dispersal, small males would have a selective advantage by enjoying more mating opportunities. We predicted that both large males and females would show a lower propensity to bridge, and that SSD would be negatively correlated with sexual dimorphism in bridging propensity. To test these hypotheses we experimentally induced bridging in males and females of 13 species of spiders belonging to the two clades in which bridging locomotion has evolved independently and in which most of the cases of extreme SSD in spiders are found. Results We found that 1) as the degree of SSD increased and females became larger, females tended to bridge less relative to males, and that 2) smaller males and females show a higher propensity to bridge. Conclusions Physical constraints make bridging inefficient for large spiders. Thus, in species where bridging is a very common mode of locomotion, small males, by being more efficient at bridging, will be competitively superior and enjoy more mating opportunities. 
This "Bridging GH" helps to solve the controversial question of what keeps males small and also contributes to explain the wide range of SSD in spiders, as those spider species in which extreme SSD has not evolved but still live in tall vegetation, do not use bridging locomotion to disperse. PMID:20682029

  6. Simulation-Based Extreme Value Marked Correlations in Fatigue of Advanced Engineering Alloys (PREPRINT)

    DTIC Science & Technology

    2010-04-01

    ... the response of damage dependent processes like fatigue crack formation, a framework is needed that accounts for the extreme value life ... many different damage processes (e.g. fatigue, creep, fracture). In this work, multiple material volumes for both IN100 and Ti-6Al-4V are simulated via ... polycrystalline P/M Ni-base superalloy IN100. Typically, fatigue damage formation in polycrystalline superalloys has been linked to the existence of ...

  7. Impact of possible climate changes on river runoff under different natural conditions

    NASA Astrophysics Data System (ADS)

    Gusev, Yeugeniy M.; Nasonova, Olga N.; Kovalev, Evgeny E.; Ayzel, Georgy V.

    2018-06-01

    The present study was carried out within the framework of the International Inter-Sectoral Impact Model Intercomparison Project (ISI-MIP) for 11 large river basins located in different continents of the globe under a wide variety of natural conditions. The aim of the study was to investigate possible changes in various characteristics of annual river runoff (mean values, standard deviations, frequency of extreme annual runoff) up to 2100 on the basis of application of the land surface model SWAP and meteorological projections simulated by five General Circulation Models (GCMs) according to four RCP scenarios. Analysis of the obtained results has shown that changes in climatic runoff are different (both in magnitude and sign) for the river basins located in different regions of the planet due to differences in natural (primarily climatic) conditions. The climatic elasticities of river runoff to changes in air temperature and precipitation were estimated, which makes it possible, as a first approximation, to project changes in climatic values of annual runoff using the projected changes in mean annual air temperature and annual precipitation for the river basins. It was found that for most rivers under study, the frequency of occurrence of extreme runoff values increases. This is true both for extremely high runoff (when the projected climatic runoff increases) and for extremely low values (when the projected climatic runoff decreases).
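    The elasticity-based first approximation described above can be written as dQ/Q ≈ eps_P·(dP/P) + eps_T·dT. A minimal sketch with hypothetical elasticity values (not the paper's estimates):

```python
# First-approximation projection of climatic runoff change from
# elasticities: dQ/Q ≈ eps_p * (dP/P) + eps_t * dT.
# All numbers below are hypothetical, for illustration only.
def runoff_change(eps_p, eps_t, dp_rel, dt_abs):
    """Relative change in climatic annual runoff."""
    return eps_p * dp_rel + eps_t * dt_abs

# Hypothetical basin: precipitation elasticity 2.0 (a 1% precipitation
# increase gives ~2% more runoff), temperature sensitivity -5% per K.
change = runoff_change(eps_p=2.0, eps_t=-0.05, dp_rel=0.10, dt_abs=2.0)
print(change)  # 0.10 -> +10% runoff under +10% precipitation and +2 K
```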

  8. Projected changes to short- and long-duration precipitation extremes over the Canadian Prairie Provinces

    NASA Astrophysics Data System (ADS)

    Masud, M. B.; Khaliq, M. N.; Wheater, H. S.

    2017-09-01

    The effects of climate change on April-October short- and long-duration precipitation extremes over the Canadian Prairie Provinces were evaluated using a multi-Regional Climate Model (RCM) ensemble available through the North American Regional Climate Change Assessment Program. The simulations considered include those performed with six RCMs driven by the National Centers for Environmental Prediction (NCEP) reanalysis II product for the 1981-2000 period and those driven by four Atmosphere-Ocean General Circulation Models (AOGCMs) for the current 1971-2000 and future 2041-2070 periods (a total of 11 current-to-future period simulation pairs). A regional frequency analysis approach was used to develop 2-, 5-, 10-, 25-, and 50-year return values of precipitation extremes from the NCEP- and AOGCM-driven simulations, which were used, respectively, to assess the performance of the RCMs and the projected changes in selected return values at regional, grid-cell and local scales. Performance errors due to the internal dynamics and physics of the RCMs, studied for the 1981-2000 period, reveal considerable variation in RCM performance; however, the errors were much smaller for RCM ensemble averages than for individual RCMs. Projected changes to selected regional return values were mostly larger for short-duration (e.g. 15- and 30-min) extremes and longer return periods (e.g. 50-year) than for longer-duration (e.g. 24- and 48-h) extremes and shorter return periods (e.g. 2-year). Overall, projected changes in precipitation extremes were largest for southeastern regions, followed by southern and northern regions, and smallest for southwestern and western regions of the study area. The changes to return values were also found to be statistically significant for the majority of the RCM-AOGCM simulation pairs.
These projections might be useful as a key input for the future planning of urban drainage infrastructure and development of strategic climate change adaptation measures.
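Return values like the 2- to 50-year levels above are quantiles of a fitted extreme value distribution: the T-year return level is the (1 − 1/T) quantile. A hedged sketch with assumed GEV parameters (not fitted to the study's data), using SciPy's sign convention for the shape parameter:

```python
# Sketch: reading T-year return values off a GEV distribution.
# The parameters below are invented for illustration; SciPy's shape
# c equals minus the climatological shape parameter xi.
from scipy.stats import genextreme

c, loc, scale = -0.1, 30.0, 8.0   # shape, location (mm), scale (mm) -- assumed

for T in (2, 5, 10, 25, 50):
    level = genextreme.ppf(1.0 - 1.0 / T, c, loc=loc, scale=scale)
    print(f"{T:>2}-year return value: {level:6.1f} mm")
```

Longer return periods map to higher quantiles, which is why the projected changes for 50-year values can differ markedly from those for 2-year values.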

  9. Analysis of extreme values of the economic efficiency indicators of transport infrastructure projects

    NASA Astrophysics Data System (ADS)

    Korytárová, J.; Vaňková, L.

    2017-10-01

    The paper builds on the authors' previous research into the evaluation of the economic efficiency of transport infrastructure projects, assessed through the economic efficiency indicators NPV, IRR and BCR. The values of these indicators and the subsequent outputs of the sensitivity analysis show extremely favourable values in some cases. The authors analysed these indicators down to the level of the input variables and examined which inputs have the larger share in these extreme values. The NCF used to calculate the above-mentioned indicators is made up of the benefits that arise as the difference between the zero and investment options of the project (savings in travel and operating costs, savings in travel time costs, reduction in accident costs and savings in exogenous costs) as well as total agency costs. Savings in travel time costs, which contribute more than 70% of the overall utility of the projects, appear to be the most important benefit over the long-term horizon; for this reason, this benefit is emphasized. The outcome of the article shows how the particular basic variables contribute to the overall robustness of the economic efficiency of these projects.

  10. The magnitude and colour of noise in genetic negative feedback systems

    PubMed Central

    Voliotis, Margaritis; Bowsher, Clive G.

    2012-01-01

    The comparative ability of transcriptional and small RNA-mediated negative feedback to control fluctuations or ‘noise’ in gene expression remains unexplored. Both autoregulatory mechanisms usually suppress the average (mean) of the protein level and its variability across cells. The variance of the number of proteins per molecule of mean expression is also typically reduced compared with the unregulated system, but is almost never below the value of one. This relative variance often substantially exceeds a recently obtained, theoretical lower limit for biochemical feedback systems. Adding the transcriptional or small RNA-mediated control has different effects. Transcriptional autorepression robustly reduces both the relative variance and persistence (lifetime) of fluctuations. Both benefits combine to reduce noise in downstream gene expression. Autorepression via small RNA can achieve more extreme noise reduction and typically has less effect on the mean expression level. However, it is often more costly to implement and is more sensitive to rate parameters. Theoretical lower limits on the relative variance are known to decrease slowly as a measure of the cost per molecule of mean expression increases. However, the proportional increase in cost to achieve substantial noise suppression can be different away from the optimal frontier—for transcriptional autorepression, it is frequently negligible. PMID:22581772
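The "variance per molecule of mean expression" discussed above is the Fano factor, which equals one for a simple (Poisson) birth-death process and exceeds one for bursty expression. An illustrative computation on synthetic copy numbers; the parameter values are invented, not taken from the paper:

```python
# Fano factor (variance / mean) of protein copy numbers, on synthetic data.
import numpy as np

rng = np.random.default_rng(1)

# Unregulated, non-bursty expression: Poisson copy numbers have Fano = 1.
unregulated = rng.poisson(lam=50, size=100_000)
fano_poisson = unregulated.var() / unregulated.mean()

# Bursty expression: negative-binomial copy numbers have Fano = 1/p > 1
# (here mean 50 and Fano 5, chosen arbitrarily for illustration).
bursty = rng.negative_binomial(12.5, 0.2, size=100_000)
fano_bursty = bursty.var() / bursty.mean()

print(f"Poisson Fano ~ {fano_poisson:.2f} (the unregulated floor)")
print(f"bursty Fano  ~ {fano_bursty:.2f} (well above one)")
```

Negative feedback pushes the Fano factor down toward (but, as the abstract notes, almost never below) one.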

  11. Metal–Organic Framework Thin Film Coated Optical Fiber Sensors: A Novel Waveguide-Based Chemical Sensing Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Ki-Joong; Lu, Ping; Culp, Jeffrey T.

    Integration of optical fiber with sensitive thin films offers great potential for the realization of novel chemical sensing platforms. In this study, we present a simple design strategy and the high performance of nanoporous metal–organic framework (MOF) based optical gas sensors, which enable detection of a wide range of concentrations of small molecules based upon extremely small differences in refractive index as a function of analyte adsorption within the MOF framework. Thin and compact MOF films can be uniformly formed and tightly bound on the surface of an etched optical fiber through a simple solution method, which is critical for the manufacturability of MOF-based sensor devices. The resulting sensors show high sensitivity/selectivity to CO2 gas relative to other small gases (H2, N2, O2, and CO) with rapid (< tens of seconds) response times and excellent reversibility, which can be well correlated to the physisorption of gases into a nanoporous MOF. We propose a refractive-index-based sensing mechanism for the MOF-integrated optical fiber platform, in which the inherent optical absorption of the MOF-based sensing layer is amplified with increasing values of the effective refractive index associated with gas adsorption.

  12. Metal–Organic Framework Thin Film Coated Optical Fiber Sensors: A Novel Waveguide-Based Chemical Sensing Platform

    DOE PAGES

    Kim, Ki-Joong; Lu, Ping; Culp, Jeffrey T.; ...

    2018-01-18

    Integration of optical fiber with sensitive thin films offers great potential for the realization of novel chemical sensing platforms. In this study, we present a simple design strategy and the high performance of nanoporous metal–organic framework (MOF) based optical gas sensors, which enable detection of a wide range of concentrations of small molecules based upon extremely small differences in refractive index as a function of analyte adsorption within the MOF framework. Thin and compact MOF films can be uniformly formed and tightly bound on the surface of an etched optical fiber through a simple solution method, which is critical for the manufacturability of MOF-based sensor devices. The resulting sensors show high sensitivity/selectivity to CO2 gas relative to other small gases (H2, N2, O2, and CO) with rapid (< tens of seconds) response times and excellent reversibility, which can be well correlated to the physisorption of gases into a nanoporous MOF. We propose a refractive-index-based sensing mechanism for the MOF-integrated optical fiber platform, in which the inherent optical absorption of the MOF-based sensing layer is amplified with increasing values of the effective refractive index associated with gas adsorption.

  13. Three-dimensional modeling of HCFC-123 in the atmosphere: assessing its potential environmental impacts and rationale for continued use.

    PubMed

    Wuebbles, Donald J; Patten, Kenneth O

    2009-05-01

    HCFC-123 (C2HCl2F3) is used in large refrigeration systems and as a component of a fire suppression agent blend. Like other hydrochlorofluorocarbons, production and consumption of HCFC-123 are limited under the Montreal Protocol on Substances that Deplete the Ozone Layer. The purpose of this study is to update the understanding of the current and projected impacts of HCFC-123 on stratospheric ozone and climate and to discuss the potential environmental effects of continued use of this chemical for specific applications. For the first time, the Ozone Depletion Potential (ODP) of an HCFC is determined using a three-dimensional model (MOZART-3) of atmospheric physics and chemistry; all previous studies have relied on results from two-dimensional models. The derived HCFC-123 ODP of 0.0098 is smaller than previous values. Analysis of the projected uses and emissions of HCFC-123, assuming reasonable levels of projected growth and use in centrifugal chiller and fire suppressant applications, suggests an extremely small impact on the environment due to its short atmospheric lifetime, low ODP, low Global Warming Potential (GWP), and the small production and emission of its limited applications. The current contribution of HCFC-123 to stratospheric reactive chlorine is too small to be measurable.

  14. Evaluation of NASA's MERRA Precipitation Product in Reproducing the Observed Trend and Distribution of Extreme Precipitation Events in the United States

    NASA Technical Reports Server (NTRS)

    Ashouri, Hamed; Sorooshian, Soroosh; Hsu, Kuo-Lin; Bosilovich, Michael G.; Lee, Jaechoul; Wehner, Michael F.; Collow, Allison

    2016-01-01

    This study evaluates the performance of NASA's Modern-Era Retrospective Analysis for Research and Applications (MERRA) precipitation product in reproducing the trend and distribution of extreme precipitation events. Utilizing extreme value theory, time-invariant and time-variant extreme value distributions are developed to model the trends and changes in the patterns of extreme precipitation events over the contiguous United States during 1979-2010. The Climate Prediction Center (CPC) U.S. Unified gridded observation data are used as the observational dataset. The CPC analysis shows that the eastern and western parts of the United States are experiencing positive and negative trends in annual maxima, respectively. The continental-scale patterns of change found in MERRA reasonably mirror the observed patterns of change found in CPC. This was not necessarily expected, given the difficulty of constraining precipitation in reanalysis products. MERRA tends to overestimate the frequency at which the 99th percentile of precipitation is exceeded because this threshold tends to be lower in MERRA, making it easier to exceed; this feature is dominant during the summer months. MERRA tends to reproduce the spatial patterns of the scale and location parameters of the generalized extreme value and generalized Pareto distributions. However, MERRA underestimates these parameters, particularly over the Gulf Coast states, leading to lower magnitudes of extreme precipitation events. Two issues in MERRA are identified: 1) MERRA shows a spurious negative trend in Nebraska and Kansas, most likely related to changes in the satellite observing system over time that have apparently affected the water cycle in the central United States, and 2) the patterns of positive trend over the Gulf Coast states and along the East Coast seem to be correlated with tropical cyclones in these regions. The analysis of trends in seasonal precipitation extremes indicates that the hurricane and winter seasons contribute the most to these trend patterns in the southeastern United States. In addition, the increasing annual trend simulated by MERRA in the Gulf Coast region is due to an incorrect trend in winter precipitation extremes.

  15. Evaluation of NASA’s MERRA Precipitation Product in Reproducing the Observed Trend and Distribution of Extreme Precipitation Events in the United States

    DOE PAGES

    Ashouri, Hamed; Sorooshian, Soroosh; Hsu, Kuo-Lin; ...

    2016-02-03

    This study evaluates the performance of NASA's Modern-Era Retrospective Analysis for Research and Applications (MERRA) precipitation product in reproducing the trend and distribution of extreme precipitation events. Utilizing extreme value theory, time-invariant and time-variant extreme value distributions are developed to model the trends and changes in the patterns of extreme precipitation events over the contiguous United States during 1979-2010. The Climate Prediction Center (CPC) U.S. Unified gridded observation data are used as the observational dataset. The CPC analysis shows that the eastern and western parts of the United States are experiencing positive and negative trends in annual maxima, respectively. The continental-scale patterns of change found in MERRA reasonably mirror the observed patterns of change found in CPC. This was not necessarily expected, given the difficulty of constraining precipitation in reanalysis products. MERRA tends to overestimate the frequency at which the 99th percentile of precipitation is exceeded because this threshold tends to be lower in MERRA, making it easier to exceed; this feature is dominant during the summer months. MERRA tends to reproduce the spatial patterns of the scale and location parameters of the generalized extreme value and generalized Pareto distributions. However, MERRA underestimates these parameters, particularly over the Gulf Coast states, leading to lower magnitudes of extreme precipitation events. Two issues in MERRA are identified: 1) MERRA shows a spurious negative trend in Nebraska and Kansas, most likely related to changes in the satellite observing system over time that have apparently affected the water cycle in the central United States, and 2) the patterns of positive trend over the Gulf Coast states and along the East Coast seem to be correlated with tropical cyclones in these regions. The analysis of trends in seasonal precipitation extremes indicates that the hurricane and winter seasons contribute the most to these trend patterns in the southeastern United States. The increasing annual trend simulated by MERRA in the Gulf Coast region is due to an incorrect trend in winter precipitation extremes.

  16. Trends in Gender Differences in Academic Achievement from 1960 to 1994: An Analysis of Differences in Mean, Variance, and Extreme Scores.

    ERIC Educational Resources Information Center

    Nowell, Amy; Hedges, Larry V.

    1998-01-01

    Uses evidence from seven surveys of the U.S. 12th-grade population and the National Assessment of Educational Progress to show that gender differences in mean and variance in academic achievement are small from 1960 to 1994 but that differences in extreme scores are often substantial. (SLD)

  17. [Multi-temporal scale analysis of impacts of extreme high temperature on net carbon uptake in subtropical coniferous plantation.

    PubMed

    Zhang, Mi; Wen, Xue Fa; Zhang, Lei Ming; Wang, Hui Min; Guo, Yi Wen; Yu, Gui Rui

    2018-02-01

    Extreme high temperature is one of the important extreme weather events that affect the forest ecosystem carbon cycle. In this study, applying CO2 flux and routine meteorological data measured during 2003-2012, we examined the impacts of extreme high temperature and extreme high temperature events on the net carbon uptake of a subtropical coniferous plantation in Qianyanzhou. Using wavelet analysis, we analyzed the environmental controls on net carbon uptake at different temporal scales when extreme high temperature and extreme high temperature events occurred. The results showed that mean daily cumulative NEE decreased by 51% on days with a daily maximum air temperature between 35 ℃ and 40 ℃, compared with days in the 30-34 ℃ range. The effects of extreme high temperature and extreme high temperature events on monthly and annual NEE were related to the strength and duration of the event. In 2003, when a strong extreme high temperature event occurred, the sum of monthly cumulative NEE in July and August was only -11.64 g C·m⁻²·(2 month)⁻¹, a decrease of 90% compared with the multi-year average; at the same time, the relative variation of annual NEE reached -6.7%. In July and August, when extreme high temperature and extreme high temperature events occurred, air temperature (Ta) and vapor pressure deficit (VPD) were the dominant controls on the daily variation of NEE; the wavelet coherency between NEE and Ta and between NEE and VPD was 0.97 and 0.95, respectively. At 8-, 16-, and 32-day periods, Ta, VPD, soil water content at 5 cm depth (SWC), and precipitation (P) controlled NEE, with the coherency between NEE and SWC and between NEE and P higher than 0.8 at the monthly scale. The results indicated that atmospheric water deficit affected NEE at short temporal scales when extreme high temperature and extreme high temperature events occurred, whereas both atmospheric water deficit and soil drought stress affected NEE at long temporal scales in this ecosystem.

  18. Fluoride pollution of atmospheric precipitation and its relationship with air circulation and weather patterns (Wielkopolski National Park, Poland).

    PubMed

    Walna, Barbara; Kurzyca, Iwona; Bednorz, Ewa; Kolendowicz, Leszek

    2013-07-01

    A 2-year study (2010-2011) of fluorides in atmospheric precipitation in the open area and in throughfall in Wielkopolski National Park (west-central Poland) showed high concentrations, reaching a maximum value of 2 mg/l under the tree crowns. These high values indicate substantial deposition of up to 52 mg/m²/year. In 2011, over 51% of open-area precipitation was characterized by a fluoride concentration higher than 0.10 mg/l, and in throughfall such concentrations were found in more than 86% of events. In 2010, a strong connection was evident between fluoride and acid-forming ions, and in 2011, a correlation with phosphate and nitrite ions was seen. Analysis of available data on F⁻ concentrations in the air did not show an unequivocal effect on F⁻ concentrations in precipitation. To find the reasons for and source areas of high fluoride pollution, the cases of extreme fluoride concentration in rainwater were related to atmospheric circulation and weather patterns. Weather conditions on days of extreme pollution were determined by the movement of weather fronts over western Poland, or by small cyclonic centers with meteorological fronts. Macroscale air advection over the sampling site originated in the western quadrant (NW, W, and SW), particularly in the middle layers of the troposphere (2,500-5,000 m a.s.l.). These directions indicate western Poland and Germany as possible sources of the pollution. At the same time, in the lower troposphere, air inflow was frequently from the north, showing short-distance transport from local emitters and from the agglomeration of Poznań.

  19. Life in extreme environments: Investigations on the ecophysiology of a desert bird, the Australian Diamond Dove (Geopelia cuneata Latham).

    PubMed

    Schleucher, Elke; Prinzinger, Roland; Withers, Philip C

    1991-09-01

    The Diamond Dove, Geopelia cuneata, is the world's second smallest (ca. 35 g) species of the columbid order. The Diamond Dove is endemic to the arid and semiarid Mulga and Spinifex regions of Central and Western Australia. It regularly encounters ambient temperatures (Ta) in its habitat above +40 °C, especially when foraging for seeds on bare ground, and may be found up to 40 km from water. This entails extreme thermal stress, with evaporative cooling constrained by a limited water supply. Energy metabolism (M), respiration, body temperature (Tb) and water budget were examined with regard to physiological adaptations to these extreme environmental conditions. The zone of thermal neutrality (TNZ) extended from +34 °C to at least +45 °C. Basal metabolic rate (BMR) was 34.10 ± 4.19 J·g⁻¹·h⁻¹, corresponding to the value predicted for a typical columbid bird. Thermal conductance (C) was higher than predicted. Geopelia cuneata showed the typical breathing pattern of doves: a combination of normal breathing at a stable frequency (ca. 60 min⁻¹) at low Ta, and panting followed by gular flutter (up to 960 min⁻¹) at high Ta. At Ta > +36 °C, Tb increased to considerably higher levels without an increase in metabolic rate, i.e. Q10 = 1. This enabled the doves not only to store heat but also to save the amount of water that would have been required for evaporative cooling if Tb had remained constant. The birds were able to dissipate more than 100% of their metabolic heat by evaporation at Ta ≥ +44 °C. This was achieved by gular flutter (an extremely effective mechanism for evaporation) and by a low metabolic rate due to the low Q10 value for metabolism at increased Tb. At lower Ta, Geopelia cuneata relied predominantly on non-evaporative mechanisms during heat stress, to save water. Total evaporative water loss over the whole Ta range was 19-33% lower than expected. In this respect, small body size proved to be an important advantage for survival in hot and arid environments.

  20. Evaluation of dynamically downscaled extreme temperature using a spatially-aggregated generalized extreme value (GEV) model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Jiali; Han, Yuefeng; Stein, Michael L.

    2016-02-10

    The Weather Research and Forecasting (WRF) model's downscaling skill for extreme maximum daily temperature is evaluated using the generalized extreme value (GEV) distribution. While the GEV distribution has been used extensively in climatology and meteorology for estimating probabilities of extreme events, accurately estimating GEV parameters from the data at a single pixel can be difficult, even with fairly long data records. This work proposes a simple method that assumes the shape parameter, the most difficult of the three parameters to estimate, does not vary over a relatively large region. This approach is applied to evaluate 31 years of WRF-downscaled extreme maximum temperature through comparison with North American Regional Reanalysis (NARR) data. Uncertainty in the GEV parameter estimates and the statistical significance of the differences in estimates between WRF and NARR are accounted for by bootstrap resampling. Despite certain biases over parts of the United States, overall, WRF shows good agreement with NARR in the spatial pattern and magnitudes of the GEV parameter estimates. Both WRF and NARR show a significant increase in extreme maximum temperature over the southern Great Plains and southeastern United States in January and over the western United States in July. The GEV model shows clear benefits from the regionally constant shape-parameter assumption, for example leading to estimates of the location and scale parameters that show coherent spatial patterns.
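The regionally constant shape-parameter idea can be sketched as a two-step fit: estimate the shape from pooled per-pixel fits, then refit only location and scale at each pixel with the shape held fixed. A sketch on synthetic annual maxima (not WRF/NARR data; pooling by the median is an illustrative assumption, not the paper's estimator):

```python
# Two-step GEV fit with a regionally constant shape parameter (synthetic data).
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(42)

# Three "pixels", each with 31 annual maxima drawn from a GEV
# (SciPy shape c = -xi); locations differ, shape is common by construction.
pixels = [genextreme.rvs(-0.1, loc=mu, scale=3.0, size=31, random_state=rng)
          for mu in (20.0, 25.0, 30.0)]

# Step 1: per-pixel shape estimates are noisy with only 31 maxima each;
# pool them into a single regional value (median, as a simple choice).
shapes = [genextreme.fit(x)[0] for x in pixels]
c_regional = float(np.median(shapes))

# Step 2: refit each pixel with the shape held fixed (f0 fixes SciPy's c),
# so only location and scale are estimated locally.
params = [genextreme.fit(x, f0=c_regional) for x in pixels]
for c, loc, scale in params:
    print(f"c={c:+.3f}  loc={loc:6.2f}  scale={scale:5.2f}")
```

Fixing the hardest-to-estimate parameter regionally is what yields the spatially coherent location and scale fields the abstract describes.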

  1. Extreme storm surge and wind wave climate scenario simulations at the Venetian littoral

    NASA Astrophysics Data System (ADS)

    Lionello, P.; Galati, M. B.; Elvini, E.

    Scenario climate projections for extreme marine storms producing storm surges and wind waves are very important for the flat northern coast of the Adriatic Sea, where the area at risk includes a unique cultural and environmental heritage as well as important economic activities. This study uses a shallow-water model and a spectral wave model to compute the storm surge and the wind wave field, respectively, from the sea level pressure and wind fields computed by the RegCM regional climate model. Simulations cover the period 1961-1990 for the present climate (control simulations) and the period 2071-2100 for the A2 and B2 scenarios. Generalized Extreme Value analysis is used to estimate values for the 10- and 100-year return periods. These modeling tools are shown to be adequate for a reliable estimation of the climate change signal without further downscaling. However, this study has mainly a methodological value, because issues such as interdecadal variability and intermodel variability cannot be addressed, since the analysis is based on single-model 30-year simulations. The control simulation looks reasonably accurate for extreme value analysis, though it overestimates the frequency of high surge and wind wave events and underestimates that of low events with respect to observations. Scenario simulations suggest a higher frequency of intense storms for the B2 scenario, but not for the A2; likely, these differences are not the effect of climate change but of multidecadal climate variability. Extreme storms are stronger in the future scenarios, but the differences are not statistically significant. Therefore this study does not provide convincing evidence for stormier conditions in future scenarios.

  2. [Quantitative Evaluation of Metal Artifacts on CT Images on the Basis of Statistics of Extremes].

    PubMed

    Kitaguchi, Shigetoshi; Imai, Kuniharu; Ueda, Suguru; Hashimoto, Naomi; Hattori, Shouta; Saika, Takahiro; Ono, Yoshifumi

    2016-05-01

    It is well known that metal artifacts degrade the image quality of computed tomography (CT) images, but their physical properties remain poorly characterized. In this study, we investigated the relationship between metal artifacts and tube current using statistics of extremes. A commercially available phantom for measuring the CT dose index, 160 mm in diameter, was prepared, and a brass rod 13 mm in diameter was placed at the centerline of the phantom. This phantom was used as a target object for evaluating metal artifacts and was scanned with an area-detector CT scanner at various tube currents under a constant tube voltage of 120 kV. Sixty parallel line segments, each 100 pixels long, were placed across the metal artifacts on the CT images, and the largest difference between two adjacent CT values in each of the 60 CT-value profiles was employed as a feature variable for measuring metal artifacts; these feature variables were analyzed on the basis of extreme value theory. The CT-value variation induced by metal artifacts was statistically characterized by the Gumbel distribution, one of the extreme value distributions; that is, metal artifacts have the same statistical characteristics as streak artifacts. The Gumbel evaluation method therefore makes it possible to analyze not only streak artifacts but also metal artifacts. Furthermore, the location parameter of the Gumbel distribution was shown to be inversely proportional to the square root of the tube current, suggesting that metal artifacts have the same dose dependence as image noise.
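The feature-variable construction described above (the largest adjacent-pixel CT-value difference per profile, followed by a Gumbel fit to the 60 maxima) can be sketched on simulated noise; the data here are synthetic stand-ins, not real CT profiles:

```python
# Gumbel analysis of per-profile maxima of adjacent CT-value differences
# (simulated Gaussian noise as a stand-in for CT-value profiles, in HU).
import numpy as np
from scipy.stats import gumbel_r

rng = np.random.default_rng(7)

maxima = []
for _ in range(60):                                # 60 line profiles
    profile = rng.normal(0.0, 20.0, size=100)      # 100-pixel profile (HU)
    # feature variable: largest difference between two adjacent CT values
    maxima.append(np.max(np.abs(np.diff(profile))))

# Maxima of many differences are well described by a Gumbel distribution.
loc, scale = gumbel_r.fit(maxima)
print(f"Gumbel location = {loc:.1f} HU, scale = {scale:.1f} HU")
```

On real data, repeating this fit at several tube currents would let one test the reported inverse-square-root dependence of the location parameter.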

  3. Preliminary evidence for associations between second-trimester human chorionic gonadotropin and unconjugated oestriol levels with pregnancy outcome in Down syndrome pregnancies.

    PubMed

    Benn, P A

    1998-04-01

    Fifty-six cases of Down syndrome were identified in a population of women who had undergone maternal serum triple marker screening [alpha-fetoprotein (AFP), human chorionic gonadotropin (hCG), and unconjugated oestriol (uE3) analyses]. These affected pregnancies represented all known cases present in the population of 34,368 women screened. Using a 1:270 mid-trimester Down syndrome risk to define the screen-positive group, 42 affected pregnancies were screen-positive (medians: AFP = 0.79 MOM, hCG = 2.13 MOM, uE3 = 0.62 MOM, age 34.6 years) and 14 pregnancies were screen-negative (medians: AFP = 0.82 MOM, hCG = 1.57 MOM, uE3 = 0.92 MOM, age 24.2 years). Four affected pregnancies were associated with in utero death and each of these cases was associated with relatively extreme values of AFP, hCG, and uE3, including the three highest levels of hCG in the entire series of Down syndrome pregnancies. Twenty-nine (15 screen-positive and 14 screen-negative) affected pregnancies resulted in liveborns. Down syndrome pregnancies had a significantly shorter gestational term than controls, and Down syndrome babies were also lighter than controls, even after adjustment for sex and gestational age. In affected pregnancies, a low uE3 level appeared to be associated with a greater chance of a small-for-gestational age baby. No correlations could be demonstrated between AFP or hCG levels and gestational age-adjusted term weight. Based on this small series, it would appear that uE3 may be particularly useful in detecting those Down syndrome cases associated with small-for-gestational age fetuses. A very high hCG value may indicate a higher probability of fetal death.

  4. Properties and Alignment of Interstellar Dust Grains toward Type Ia Supernovae with Anomalous Polarization Curves

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoang, Thiem, E-mail: thiemhoang@kasi.re.kr; Canadian Institute for Theoretical Astrophysics, University of Toronto, 60 St. George Street, Toronto, ON M5S 3H8; Institute of Theoretical Physics, Goethe Universität Frankfurt, D-60438 Frankfurt am Main

    Recent photometric and polarimetric observations of Type Ia supernovae (SNe Ia) show unusually low total-to-selective extinction ratios (R_V < 2) and wavelengths of maximum polarization (λ_max < 0.4 μm) for several SNe Ia, which indicates peculiar properties of interstellar (IS) dust in the SN host galaxies and/or the presence of circumstellar (CS) dust. In this paper, we use an inversion technique to infer the best-fit grain size distribution and the alignment function of interstellar grains along the lines of sight toward four SNe Ia with anomalous extinction and polarization data (SN 1986G, SN 2006X, SN 2008fp, and SN 2014J). We find that to reproduce low values of R_V, a significant enhancement in the mass of small grains of radius a < 0.1 μm is required. For SN 2014J, a simultaneous fit to its observed extinction and polarization is unsuccessful if all the data are attributed to IS dust (model 1), but a good fit is obtained when accounting for the contribution of CS dust (model 2). For SN 2008fp, our best-fit results for model 1 show that, in order to reproduce an extreme value of λ_max ∼ 0.15 μm, small silicate grains must be aligned as efficiently as big grains. For this case, we suggest that strong radiation from the SN can induce efficient alignment of small grains in a nearby intervening molecular cloud via the radiative torque (RAT) mechanism. The resulting time-dependent polarization from this RAT alignment model can be tested by observations at ultraviolet wavelengths.

  5. Comparison of Bootstrapping and Markov Chain Monte Carlo for Copula Analysis of Hydrological Droughts

    NASA Astrophysics Data System (ADS)

    Yang, P.; Ng, T. L.; Yang, W.

    2015-12-01

    Effective water resources management depends on the reliable estimation of the uncertainty of drought events. Confidence intervals (CIs) are commonly applied to quantify this uncertainty. A CI seeks to be at the minimal length necessary to cover the true value of the estimated variable with the desired probability. In drought analysis where two or more variables (e.g., duration and severity) are often used to describe a drought, copulas have been found suitable for representing the joint probability behavior of these variables. However, the comprehensive assessment of the parameter uncertainties of copulas of droughts has been largely ignored, and the few studies that have recognized this issue have not explicitly compared the various methods to produce the best CIs. Thus, the objective of this study to compare the CIs generated using two widely applied uncertainty estimation methods, bootstrapping and Markov Chain Monte Carlo (MCMC). To achieve this objective, (1) the marginal distributions lognormal, Gamma, and Generalized Extreme Value, and the copula functions Clayton, Frank, and Plackett are selected to construct joint probability functions of two drought related variables. (2) The resulting joint functions are then fitted to 200 sets of simulated realizations of drought events with known distribution and extreme parameters and (3) from there, using bootstrapping and MCMC, CIs of the parameters are generated and compared. The effect of an informative prior on the CIs generated by MCMC is also evaluated. CIs are produced for different sample sizes (50, 100, and 200) of the simulated drought events for fitting the joint probability functions. Preliminary results assuming lognormal marginal distributions and the Clayton copula function suggest that for cases with small or medium sample sizes (~50-100), MCMC to be superior method if an informative prior exists. 
Where an informative prior is unavailable, for small sample sizes (~50), both bootstrapping and MCMC yield the same level of performance, and for medium sample sizes (~100), bootstrapping is better. For cases with a large sample size (~200), there is little difference between the CIs generated using bootstrapping and MCMC regardless of whether or not an informative prior exists.
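The bootstrapping side of such a comparison can be sketched in a few lines. The snippet below is a minimal percentile-bootstrap illustration, not the study's code: the lognormal severities, parameter values, and seeds are all assumptions. It shows why the CI for a marginal parameter narrows as the number of simulated drought events grows from 50 to 200.

```python
import numpy as np

def bootstrap_ci(sample, stat, n_boot=2000, alpha=0.05, seed=1):
    """Percentile bootstrap confidence interval for a statistic of a sample."""
    rng = np.random.default_rng(seed)
    n = len(sample)
    reps = np.array([stat(rng.choice(sample, size=n, replace=True))
                     for _ in range(n_boot)])
    return np.quantile(reps, [alpha / 2, 1 - alpha / 2])

rng = np.random.default_rng(0)
widths = {}
for n in (50, 100, 200):                    # the sample sizes used in the study
    severity = rng.lognormal(mean=1.0, sigma=0.5, size=n)  # synthetic droughts
    lo, hi = bootstrap_ci(severity, lambda s: np.log(s).mean())
    widths[n] = hi - lo
    print(f"n={n:3d}: 95% CI for the lognormal mu parameter "
          f"= ({lo:.3f}, {hi:.3f})")
```

The CI width shrinks roughly like 1/sqrt(n), which is why the choice between bootstrapping and MCMC matters most at the small sample sizes.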

  6. Extreme value statistics for two-dimensional convective penetration in a pre-main sequence star

    NASA Astrophysics Data System (ADS)

    Pratt, J.; Baraffe, I.; Goffrey, T.; Constantino, T.; Viallet, M.; Popov, M. V.; Walder, R.; Folini, D.

    2017-08-01

    Context. In the interior of stars, a convectively unstable zone typically borders a zone that is stable to convection. Convective motions can penetrate the boundary between these zones, creating a layer characterized by intermittent convective mixing, and gradual erosion of the density and temperature stratification. Aims: We examine a penetration layer formed between a central radiative zone and a large convection zone in the deep interior of a young low-mass star. Using the Multidimensional Stellar Implicit Code (MUSIC) to simulate two-dimensional compressible stellar convection in a spherical geometry over long times, we produce statistics that characterize the extent and impact of convective penetration in this layer. Methods: We apply extreme value theory to the maximal extent of convective penetration at any time. We compare statistical results from simulations which treat non-local convection, throughout a large portion of the stellar radius, with simulations designed to treat local convection in a small region surrounding the penetration layer. For each of these situations, we compare simulations of different resolution, which have different velocity magnitudes. We also compare statistical results between simulations that radiate energy at a constant rate to those that allow energy to radiate from the stellar surface according to the local surface temperature. Results: Based on the frequency and depth of penetrating convective structures, we observe two distinct layers that form between the convection zone and the stable radiative zone. We show that the probability density function of the maximal depth of convective penetration at any time corresponds closely in space with the radial position where internal waves are excited. We find that the maximal penetration depth can be modeled by a Weibull distribution with a small shape parameter. 
Using these results, and building on established scalings for diffusion enhanced by large-scale convective motions, we propose a new form for the diffusion coefficient that may be used for one-dimensional stellar evolution calculations in the large Péclet number regime. These results should contribute to the 321D link.
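The Weibull model for the maximal penetration depth can be illustrated with a short fit using `scipy.stats.weibull_min`; the depth values and the shape parameter k = 0.8 below are synthetic stand-ins, not MUSIC output.

```python
import numpy as np
from scipy import stats

# Hypothetical maximal penetration depths (in units of pressure scale
# heights), drawn from a Weibull law with a small shape parameter k < 1
# to mimic the heavy-tailed behaviour reported for the simulations.
true_k, true_scale = 0.8, 0.05
depths = stats.weibull_min.rvs(true_k, scale=true_scale, size=500,
                               random_state=1)

# Fit a two-parameter Weibull (location fixed at zero: depths are >= 0)
k_hat, loc, scale_hat = stats.weibull_min.fit(depths, floc=0)
print(f"fitted shape k = {k_hat:.2f}, scale = {scale_hat:.3f}")
# k < 1 means a monotonically decreasing density: deep penetration
# events are rare but not exponentially suppressed.
```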

  7. Assessing Regional Scale Variability in Extreme Value Statistics Under Altered Climate Scenarios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brunsell, Nathaniel; Mechem, David; Ma, Chunsheng

Recent studies have suggested that low-frequency modes of climate variability can significantly influence regional climate. The climatology associated with extreme events has been shown to be particularly sensitive. This has profound implications for droughts, heat waves, and food production. We propose to examine regional climate simulations conducted over the continental United States by applying a recently developed technique which combines wavelet multi-resolution analysis with information theory metrics. This research is motivated by two fundamental questions concerning the spatial and temporal structure of extreme events. These questions are 1) what temporal scales of the extreme value distributions are most sensitive to alteration by low-frequency climate forcings and 2) what is the nature of the spatial structure of variation in these timescales? The primary objective is to assess to what extent information theory metrics can be useful in characterizing the nature of extreme weather phenomena. Specifically, we hypothesize that (1) changes in the nature of extreme events will impact the temporal probability density functions and that information theory metrics will be sensitive to these changes and (2) via a wavelet multi-resolution analysis, we will be able to characterize the relative contribution of different timescales on the stochastic nature of extreme events. In order to address these hypotheses, we propose a unique combination of an established regional climate modeling approach and advanced statistical techniques to assess the effects of low-frequency modes on climate extremes over North America. The behavior of climate extremes in RCM simulations for the 20th century will be compared with statistics calculated from the United States Historical Climatology Network (USHCN) and simulations from the North American Regional Climate Change Assessment Program (NARCCAP). 
This effort will serve to establish the baseline behavior of climate extremes, the validity of an innovative multi-resolution information theory approach, and the ability of the RCM modeling framework to represent the low-frequency modulation of extreme climate events. Once the skill of the modeling and analysis methodology has been established, we will apply the same approach for the AR5 (IPCC Fifth Assessment Report) climate change scenarios in order to assess how climate extremes and the influence of low-frequency variability on climate extremes might vary under a changing climate. The research specifically addresses the DOE focus area 2, simulation of climate extremes under a changing climate. Specific results will include (1) a better understanding of the spatial and temporal structure of extreme events, (2) a thorough quantification of how extreme values are impacted by low-frequency climate teleconnections, (3) increased knowledge of current regional climate models' ability to ascertain these influences, and (4) a detailed examination of how the distributions of extreme events are likely to change under different climate change scenarios. In addition, this research will assess the ability of the innovative wavelet information theory approach to characterize extreme events. Any and all of these results will greatly enhance society's ability to understand and mitigate the regional ramifications of future global climate change.

  8. Regioselective Synthesis of Cellulose Ester Homopolymers

    Treesearch

    Daiqiang Xu; Kristen Voiges; Thomas Elder; Petra Mischnick; Kevin J. Edgar

    2012-01-01

    Regioselective synthesis of cellulose esters is extremely difficult due to the small reactivity differences between cellulose hydroxyl groups, small differences in steric demand between acyl moieties of interest, and the difficulty of attaching and detaching many protecting groups in the presence of cellulose ester moieties without removing the ester groups. Yet the...

  9. Particles, particles everywhere: What is in the air we breathe?

    USDA-ARS?s Scientific Manuscript database

    Particulate matter (PM) air pollution consists of extremely small particles, some so small that they can directly enter the bloodstream through the lungs. PM is of prime concern from both health and environmental perspectives. Current research is focused on understanding how PM forms in the atmosphe...

  10. Compilation of 1985 annual reports of the Navy elf (extremely low frequency) communications system ecological monitoring program. Volume 2. Tabs D-G. Annual progress report, January-December 1985

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Band, R.N.; Snider, R.J.; Snider, R.M.

    1986-07-01

    This volume consists of the following reports: Soil Amoeba; Soil and Litter Arthropoda and Earthworm Studies; Biological Studies on Pollinating Insects: Megachilid Bees; Small Vertebrates: Small Mammals and Nesting Birds.

  11. Uncertainties of flood frequency estimation approaches based on continuous simulation using data resampling

    NASA Astrophysics Data System (ADS)

    Arnaud, Patrick; Cantet, Philippe; Odry, Jean

    2017-11-01

    Flood frequency analyses (FFAs) are needed for flood risk management. Many methods exist ranging from classical purely statistical approaches to more complex approaches based on process simulation. The results of these methods are associated with uncertainties that are sometimes difficult to estimate due to the complexity of the approaches or the number of parameters, especially for process simulation. This is the case of the simulation-based FFA approach called SHYREG presented in this paper, in which a rainfall generator is coupled with a simple rainfall-runoff model in an attempt to estimate the uncertainties due to the estimation of the seven parameters needed to estimate flood frequencies. The six parameters of the rainfall generator are mean values, so their theoretical distribution is known and can be used to estimate the generator uncertainties. In contrast, the theoretical distribution of the single hydrological model parameter is unknown; consequently, a bootstrap method is applied to estimate the calibration uncertainties. The propagation of uncertainty from the rainfall generator to the hydrological model is also taken into account. This method is applied to 1112 basins throughout France. Uncertainties coming from the SHYREG method and from purely statistical approaches are compared, and the results are discussed according to the length of the recorded observations, basin size and basin location. Uncertainties of the SHYREG method decrease as the basin size increases or as the length of the recorded flow increases. Moreover, the results show that the confidence intervals of the SHYREG method are relatively small despite the complexity of the method and the number of parameters (seven). This is due to the stability of the parameters and takes into account the dependence of uncertainties due to the rainfall model and the hydrological calibration. 
Indeed, the uncertainties on the flow quantiles are on the same order of magnitude as those associated with the use of a statistical law with two parameters (here generalised extreme value Type I distribution) and clearly lower than those associated with the use of a three-parameter law (here generalised extreme value Type II distribution). For extreme flood quantiles, the uncertainties are mostly due to the rainfall generator because of the progressive saturation of the hydrological model.

  12. Complexity-aware simple modeling.

    PubMed

    Gómez-Schiavon, Mariana; El-Samad, Hana

    2018-02-26

Mathematical models continue to be essential for deepening our understanding of biology. On one extreme, simple or small-scale models help delineate general biological principles. However, the parsimony of detail in these models as well as their assumption of modularity and insulation make them inaccurate for describing quantitative features. On the other extreme, large-scale and detailed models can quantitatively recapitulate a phenotype of interest, but have to rely on many unknown parameters, making them often difficult to parse mechanistically and to use for extracting general principles. We discuss some examples of a new approach, complexity-aware simple modeling, that can bridge the gap between the small-scale and large-scale approaches. Copyright © 2018 Elsevier Ltd. All rights reserved.

  13. Instability of Poiseuille flow at extreme Mach numbers: linear analysis and simulations.

    PubMed

    Xie, Zhimin; Girimaji, Sharath S

    2014-04-01

    We develop the perturbation equations to describe instability evolution in Poiseuille flow at the limit of very high Mach numbers. At this limit the equation governing the flow is the pressure-released Navier-Stokes equation. The ensuing semianalytical solution is compared against simulations performed using the gas-kinetic method (GKM), resulting in excellent agreement. A similar comparison between analytical and computational results of small perturbation growth is performed at the incompressible (zero Mach number) limit, again leading to excellent agreement. The study accomplishes two important goals: it (i) contrasts the small perturbation evolution in Poiseuille flows at extreme Mach numbers and (ii) provides important verification of the GKM simulation scheme.

  14. Spatial distribution of precipitation extremes in Norway

    NASA Astrophysics Data System (ADS)

    Verpe Dyrrdal, Anita; Skaugen, Thomas; Lenkoski, Alex; Thorarinsdottir, Thordis; Stordal, Frode; Førland, Eirik J.

    2015-04-01

Estimates of extreme precipitation, in terms of return levels, are crucial in planning and design of important infrastructure. Through two separate studies, we have examined the levels and spatial distribution of daily extreme precipitation over catchments in Norway, and hourly extreme precipitation at a point. The analyses were carried out through the development of two new methods for estimating extreme precipitation in Norway. For daily precipitation we fit the Generalized Extreme Value (GEV) distribution to areal time series from a gridded dataset, consisting of daily precipitation during the period 1957-today with a resolution of 1x1 km². This grid-based method is more objective and less manual and time-consuming compared to the existing method at MET Norway. In addition, estimates in ungauged catchments are easier to obtain, and the GEV approach includes a measure of uncertainty, which is a requirement in climate studies today. Further, we go into depth on the debated GEV shape parameter, which plays an important role for longer return periods. We show that it varies according to dominating precipitation types, having positive values in the southeast and negative values in the southwest. We also find indications that the degree of orographic enhancement might affect the shape parameter. For hourly precipitation, we estimate return levels on a 1x1 km² grid, by linking GEV distributions with latent Gaussian fields in a Bayesian hierarchical model (BHM). Generalized linear models on the GEV parameters, estimated from observations, are able to incorporate location-specific geographic and meteorological information and thereby accommodate these effects on extreme precipitation. Gaussian fields capture additional unexplained spatial heterogeneity and overcome the sparse grid on which observations are collected, while a Bayesian model averaging component directly assesses model uncertainty. 
We find that mean summer precipitation, mean summer temperature, latitude, longitude, mean annual precipitation and elevation are good covariate candidates for hourly precipitation in our model. Summer indices succeed because hourly precipitation extremes often occur during the convective season. The spatial distribution of hourly and daily precipitation differs in Norway. Daily precipitation extremes are larger along the southwestern coast, where large-scale frontal systems dominate during fall season and the mountain ridge generates strong orographic enhancement. The largest hourly precipitation extremes are mostly produced by intense convective showers during summer, and are thus found along the entire southern coast, including the Oslo-region.
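The GEV fitting and return-level step at the heart of such studies can be sketched with `scipy.stats.genextreme`. The synthetic 60-year maxima series and all parameter values below are assumptions, not the Norwegian gridded data; note that scipy's shape `c` equals minus the hydrological shape parameter xi discussed above.

```python
import numpy as np
from scipy import stats

# Synthetic 60-year series of annual maxima of daily precipitation (mm),
# drawn from a GEV with a positive (heavy-tailed) shape parameter xi = 0.15.
xi_true = 0.15
annual_max = stats.genextreme.rvs(c=-xi_true, loc=40.0, scale=10.0,
                                  size=60, random_state=3)

# scipy parameterizes the GEV with c = -xi relative to the hydrological
# convention, so a heavy upper tail (xi > 0) corresponds to c < 0.
c_hat, loc_hat, scale_hat = stats.genextreme.fit(annual_max)
xi_hat = -c_hat

# T-year return level = the (1 - 1/T) quantile of the fitted GEV
T = 100
rl_100 = stats.genextreme.ppf(1 - 1 / T, c_hat, loc=loc_hat, scale=scale_hat)
print(f"xi_hat = {xi_hat:.2f}, {T}-year return level = {rl_100:.1f} mm")
```

The sign of the fitted shape parameter is exactly what the abstract reports varying between southeast (positive) and southwest (negative) Norway.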

  15. Combination of radar and daily precipitation data to estimate meaningful sub-daily point precipitation extremes

    NASA Astrophysics Data System (ADS)

    Pegram, Geoff; Bardossy, Andras; Sinclair, Scott

    2017-04-01

    The use of radar measurements for the space time estimation of precipitation has for many decades been a central topic in hydro-meteorology. In this presentation we are interested specifically in daily and sub-daily extreme values of precipitation at gauged or ungauged locations which are important for design. The purpose of the presentation is to develop a methodology to combine daily precipitation observations and radar measurements to estimate sub-daily extremes at point locations. Radar data corrected using precipitation-reflectivity relationships lead to biased estimations of extremes. Different possibilities of correcting systematic errors using the daily observations are investigated. Observed gauged daily amounts are interpolated to un-sampled points and subsequently disaggregated using the sub-daily values obtained by the radar. Different corrections based on the spatial variability and the sub-daily entropy of scaled rainfall distributions are used to provide unbiased corrections of short duration extremes. In addition, a statistical procedure not based on a matching day by day correction is tested. In this last procedure, as we are only interested in rare extremes, low to medium values of rainfall depth were neglected leaving 12 days of ranked daily maxima in each set per year, whose sum typically comprises about 50% of each annual rainfall total. The sum of these 12 day maxima is first interpolated using a Kriging procedure. Subsequently this sum is disaggregated to daily values using a nearest neighbour procedure. The daily sums are then disaggregated by using the relative values of the biggest 12 radar based days in each year. Of course, the timings of radar and gauge maxima can be different, so the new method presented here uses radar for disaggregating daily gauge totals down to 15 min intervals in order to extract the maxima of sub-hourly through to daily rainfall. 
The methodologies were tested in South Africa, where an S-band radar operated relatively continuously at Bethlehem from 1998 to 2003, whose scan at 1.5 km above ground [CAPPI] overlapped a dense [10 km spacing] set of 45 pluviometers recording in the same 6-year period. This valuable set of data was obtained from each of 37 selected radar pixels [1 km square in plan] which contained a pluviometer, not masked out by the radar foot-print. The pluviometer data were also aggregated to daily totals, for the same purpose. The extremes obtained using disaggregation methods were compared to the observed extremes in a cross validation procedure. The unusual and novel goal was not to obtain the reproduction of the precipitation matching in space and time, but to obtain frequency distributions of the point extremes, which we found to be stable. Published as: Bárdossy, A., and G. G. S. Pegram (2017) Journal of Hydrology, Volume 544, pp 397-406
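The core disaggregation idea, splitting an interpolated daily gauge total across sub-daily intervals in proportion to the radar pattern, can be sketched as follows. The helper name, the gamma-distributed synthetic radar depths, and the 15-min resolution choice are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def disaggregate_daily(gauge_total, radar_15min):
    """Split an interpolated daily gauge total over 96 15-min slots in
    proportion to the radar-observed sub-daily pattern (hypothetical
    helper, not the authors' code)."""
    radar_15min = np.asarray(radar_15min, dtype=float)
    s = radar_15min.sum()
    if s == 0.0:                     # dry radar day: spread uniformly
        return np.full_like(radar_15min, gauge_total / radar_15min.size)
    return gauge_total * radar_15min / s

rng = np.random.default_rng(0)
radar = rng.gamma(0.2, 1.0, size=96)       # synthetic 15-min radar depths
sub_daily = disaggregate_daily(24.0, radar)
print(f"daily sum preserved: {sub_daily.sum():.1f} mm, "
      f"largest 15-min burst: {sub_daily.max():.2f} mm")
```

By construction the gauge total is preserved exactly, while the radar supplies the timing and intensity of the sub-daily bursts from which the sub-hourly maxima are extracted.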

  16. Design of a New Ultracompact Resonant Plasmonic Multi-Analyte Label-Free Biosensing Platform

    PubMed Central

    De Palo, Maripina; Ciminelli, Caterina

    2017-01-01

In this paper, we report on the design of a bio-multisensing platform for the selective label-free detection of protein biomarkers, carried out through a 3D numerical algorithm. The platform includes a number of biosensors, each of which is based on a plasmonic nanocavity consisting of a periodic metal structure to be deposited on a silicon oxide substrate. Light is strongly confined in a region of extremely small size (≈1.57 μm²), to enhance the light-matter interaction. A surface sensitivity Ss = 1.8 nm/nm has been calculated, together with a detection limit of 128 pg/mm². Such performance, together with the extremely small footprint, allows the integration of several devices on a single chip to realize extremely compact lab-on-chip microsystems. In addition, each sensing element of the platform has good chemical stability, which is guaranteed by the selection of gold for its fabrication. PMID:28783075

  17. Ethical research as the target of animal extremism: an international problem.

    PubMed

    Conn, P Michael; Rantin, F T

    2010-02-01

    Animal extremism has been increasing worldwide; frequently researchers are the targets of actions by groups with extreme animal rights agendas. Sometimes this targeting is violent and may involve assaults on family members or destruction of property. In this article, we summarize recent events and suggest steps that researchers can take to educate the public on the value of animal research both for people and animals.

  18. Estimating the extreme low-temperature event using nonparametric methods

    NASA Astrophysics Data System (ADS)

    D'Silva, Anisha

This thesis presents a new method of estimating the one-in-N low temperature threshold using a non-parametric statistical method called kernel density estimation, applied to daily average wind-adjusted temperatures. We apply our One-in-N Algorithm to local gas distribution companies (LDCs), as they have to forecast the daily natural gas needs of their consumers. In winter, demand for natural gas is high. Extreme low temperature events are not directly related to an LDC's gas demand forecasting, but knowledge of extreme low temperatures is important to ensure that an LDC has enough capacity to meet customer demands when extreme low temperatures are experienced. We present a detailed explanation of our One-in-N Algorithm and compare it to the methods using the generalized extreme value distribution, the normal distribution, and the variance-weighted composite distribution. We show that our One-in-N Algorithm estimates the one-in-N low temperature threshold more accurately than these methods according to the root mean square error (RMSE) measure at a 5% level of significance. The One-in-N Algorithm is tested by counting the number of times the daily average wind-adjusted temperature is less than or equal to the one-in-N low temperature threshold.
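A kernel-density estimate of a one-in-N low temperature threshold can be sketched with `scipy.stats.gaussian_kde`. The temperature series, the 90-day winter length, and the grid inversion below are illustrative assumptions, not the thesis's algorithm.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
# Hypothetical 30 winters (90 days each) of daily average wind-adjusted
# temperatures in deg C; the normal generator is a stand-in for real data.
temps = rng.normal(loc=-5.0, scale=8.0, size=30 * 90)

kde = gaussian_kde(temps)

# One-in-N event: a temperature so low it is reached on average once in
# N winters, i.e. with per-day probability 1 / (N * days_per_winter).
N, days_per_winter = 20, 90
p = 1.0 / (N * days_per_winter)

# Invert the KDE's CDF numerically on a grid extending below the sample minimum
grid = np.linspace(temps.min() - 20.0, temps.max(), 2000)
cdf = np.array([kde.integrate_box_1d(-np.inf, x) for x in grid])
t20 = grid[np.searchsorted(cdf, p)]
print(f"one-in-{N} low temperature threshold ~ {t20:.1f} deg C")
```

Unlike a fitted GEV or normal distribution, the KDE imposes no parametric tail shape, which is the motivation for the non-parametric approach.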

  19. The Logic of Values Clarification

    ERIC Educational Resources Information Center

    Kazepides, A. C.

    1977-01-01

Traces the origin of the Values Clarification movement in education to Carl Rogers's client-centered therapy and exposes its unwarranted extreme ethical stance. Examines a model episode of values clarification and shows how the theoretical confusions of the Values Clarification proponents are reflected in their actual teaching strategies. (Editor/RK)

  20. Repeated Small Bowel Obstruction Caused by Chestnut Ingestion without the Formation of Phytobezoars.

    PubMed

    Satake, Ryu; Chinda, Daisuke; Shimoyama, Tadashi; Satake, Miwa; Oota, Rie; Sato, Satoshi; Yamai, Kiyonori; Hachimori, Hisashi; Okamoto, Yutaka; Yamada, Kyogo; Matsuura, Osamu; Hashizume, Tadashi; Soma, Yasushi; Fukuda, Shinsaku

    2016-01-01

A small number of cases of small bowel obstruction caused by foods without the formation of phytobezoars have been reported. Repeated small bowel obstruction due to the ingestion of the same food is extremely rare. We present the case of a 63-year-old woman who developed small bowel obstruction twice due to the ingestion of chestnuts without the formation of phytobezoars. This is the first reported case of repeated small bowel obstruction caused by chestnut ingestion. Careful interviews are necessary to determine the meal history of elderly patients and psychiatric patients.

  1. A Metastatistical Approach to Satellite Estimates of Extreme Rainfall Events

    NASA Astrophysics Data System (ADS)

    Zorzetto, E.; Marani, M.

    2017-12-01

The estimation of the average recurrence interval of intense rainfall events is a central issue for both hydrologic modeling and engineering design. These estimates require the inference of the properties of the right tail of the statistical distribution of precipitation, a task often performed using the Generalized Extreme Value (GEV) distribution, estimated either from a sample of annual maxima (AM) or with a peaks-over-threshold (POT) approach. However, these approaches require long and homogeneous rainfall records, which often are not available, especially in the case of remote-sensed rainfall datasets. Here we use an alternative approach, tailored to remotely sensed rainfall estimates, based on the metastatistical extreme value distribution (MEVD), which produces estimates of rainfall extreme values based on the probability distribution function (pdf) of all measured `ordinary' rainfall events. This methodology also accounts for the interannual variations observed in the pdf of daily rainfall by integrating over the sample space of its random parameters. We illustrate the application of this framework to the TRMM Multi-satellite Precipitation Analysis rainfall dataset, where the MEVD optimally exploits the relatively short datasets of satellite-sensed rainfall, while taking full advantage of its high spatial resolution and quasi-global coverage. Accuracy of TRMM precipitation estimates and scale issues are here investigated for a case study located in the Little Washita watershed, Oklahoma, using a dense network of rain gauges for independent ground validation. The methodology contributes to our understanding of the risk of extreme rainfall events, as it allows i) an optimal use of the TRMM datasets in estimating the tail of the probability distribution of daily rainfall, and ii) a global mapping of daily rainfall extremes and distributional tail properties, bridging the existing gaps in rain gauge networks.
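The MEVD idea, averaging the per-year CDF of ordinary events raised to the yearly event count, can be sketched in a few lines. The Weibull form of the ordinary-event distribution follows common MEVD practice, but every parameter value below is an illustrative assumption, not a TRMM-derived estimate.

```python
import numpy as np

def mev_cdf(x, shapes, scales, n_events):
    """Metastatistical extreme value CDF of the yearly maximum: the average
    over years j of F_j(x)**n_j, with Weibull-distributed 'ordinary' events.
    (A sketch of the MEVD idea; all parameter values here are illustrative.)"""
    x = np.atleast_1d(x)[:, None]
    F = 1.0 - np.exp(-(x / scales) ** shapes)   # per-year Weibull CDF
    return np.mean(F ** n_events, axis=1)

rng = np.random.default_rng(1)
years = 15                                  # a short satellite-era record
shapes = rng.uniform(0.7, 0.9, years)       # interannual variability of the pdf
scales = rng.uniform(8.0, 14.0, years)      # mm
n_events = rng.integers(60, 120, years)     # wet days per year

# 100-year daily rainfall: solve mev_cdf(x) = 1 - 1/100 on a grid
grid = np.linspace(1.0, 1000.0, 5000)
cdf = mev_cdf(grid, shapes, scales, n_events)
x100 = grid[np.searchsorted(cdf, 1 - 1 / 100)]
print(f"MEV estimate of the 100-year daily rainfall ~ {x100:.0f} mm")
```

Because every ordinary event informs the fit, the estimator remains usable on records far shorter than the AM or POT approaches require.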

  2. Assessing the Implications of Changing Extreme Value Distributions of Weather on Carbon and Water Cycling in Grasslands

    NASA Astrophysics Data System (ADS)

    Brunsell, N. A.; Nippert, J. B.

    2011-12-01

As the climate warms, it is generally acknowledged that the number and magnitude of extreme weather events will increase. We examined an ecophysiological model's responses to precipitation and temperature anomalies in relation to the mean and variance of annual precipitation along a pronounced precipitation gradient from eastern to western Kansas. This natural gradient creates a template of potential responses for both the mean and variance of annual precipitation to compare the timescales of carbon and water fluxes. Using data from several Ameriflux sites (KZU and KFS) and a third eddy covariance tower (K4B) along the gradient, BIOME-BGC was used to characterize water and carbon cycle responses to extreme weather events. Changes in the extreme value distributions were based on SRES A1B and A2 scenarios using an ensemble mean of 21 GCMs for the region, downscaled using a stochastic weather generator. We focused on changing the timing and magnitude of precipitation and altering the diurnal and seasonal temperature ranges. Biome-BGC was then forced with daily output from the stochastic weather generator, and we examined how potential changes in these extreme value distributions impact carbon and water cycling at the sites across the Kansas precipitation gradient at time scales ranging from daily to interannual. To decompose the time scales of response, we applied a wavelet based information theory analysis approach. Results indicate impacts on soil moisture memory and carbon allocation processes, which vary in response to both the mean and variance of precipitation along the precipitation gradient. These results suggest focusing on ecosystem responses to extreme events across a range of temporal scales in order to fully characterize the water and carbon cycle responses to global climate change.

  3. The Demand for Disaster Microinsurance for Small Businesses in Urban Slums: The Results of Surveys in Three Indian Cities.

    PubMed

    Patel, Ronak; Walker, Garrett; Bhatt, Mihir; Pathak, Vishal

    2017-03-01

    Small informal businesses make up the core markets for many poor urban communities, providing essential goods, services, and livelihoods. Many of these communities and businesses exist in hazardous locations. In most cases, these business owners do not have access to proper coping mechanisms including risk transfer and lack resilience to shocks. Access to risk-transfer in the form of insurance for these small businesses is extremely limited. This demand survey is the first phase of an intervention to test disaster microinsurance for these businesses. Previous research has examined the demand for and value of microinsurance to protect poor households but not micro- and medium-sized informal urban businesses. This study investigates knowledge about and demand for microinsurance among small informal business owners in three different cities of India. Survey of all informal business owners (n=4919) identified through purposive sampling of the most vulnerable in three proposed study sites: Guwahati in Assam (n=1622), Puri in Odisha (n=1551) and Cuddalore in Tamil Nadu (n=1746). Our findings reflect that while small business owners largely did not know about disaster microinsurance, after describing it, a vast majority wanted to subscribe to such a program. Without it, they often rely on personal savings, forgo basic necessities, or take out costly loans that trap them in debt to cope with disasters. This research supports the need for more experiments on actual adoption patterns, feasibility studies, and innovative trial programs by governments, non-governmental organizations, and insurance providers.

  4. Disappearing inflaton potential via heavy field dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kitajima, Naoya; Takahashi, Fuminobu, E-mail: kitajima@tuhep.phys.tohoku.ac.jp, E-mail: fumi@tuhep.phys.tohoku.ac.jp

    2016-02-01

We propose a possibility that the inflaton potential is significantly modified after inflation due to heavy field dynamics. During inflation such a heavy scalar field may be stabilized at a value deviated from the low-energy minimum. In extreme cases, the inflaton potential vanishes and the inflaton becomes almost massless at some time after inflation. Such a transition of the inflaton potential has interesting implications for primordial density perturbations, reheating, creation of unwanted relics, dark radiation, and experimental searches for light degrees of freedom. To be concrete, we consider a chaotic inflation in supergravity where the inflaton mass parameter is promoted to a modulus field, finding that the inflaton becomes stable after the transition and contributes to dark matter. Another example is a hilltop inflation (also called new inflation) by the MSSM Higgs field, which acquires a large expectation value just after inflation but returns to the origin after the transition and finally rolls down to the electroweak vacuum. Interestingly, the smallness of the electroweak scale compared to the Planck scale is directly related to the flatness of the inflaton potential.

  5. Peculiar phase diagram with isolated superconducting regions in ThFeAsN1−xOx

    NASA Astrophysics Data System (ADS)

    Li, Bai-Zhuo; Wang, Zhi-Cheng; Wang, Jia-Lu; Zhang, Fu-Xiang; Wang, Dong-Ze; Zhang, Feng-Yuan; Sun, Yu-Ping; Jing, Qiang; Zhang, Hua-Fu; Tan, Shu-Gang; Li, Yu-Ke; Feng, Chun-Mu; Mei, Yu-Xue; Wang, Cao; Cao, Guang-Han

    2018-06-01

The ThFeAsN1−xOx system with heavy electron doping has been studied by measurements of x-ray diffraction, electrical resistivity, magnetic susceptibility and specific heat. The non-doped compound exhibits superconductivity, which is possibly due to an internal uniaxial chemical pressure that is manifested by the extremely small value of the As height with respect to the Fe plane. With the oxygen substitution, the T_c value decreases rapidly to below 2 K and, surprisingly, superconductivity re-appears at higher doping levels with a maximum T_c of 17.5 K at x = 0.3. For the normal-state resistivity, while the samples in the intermediate non-superconducting interval exhibit Fermi-liquid behavior, those in the other regions show non-Fermi-liquid behavior. The specific-heat jump for the superconducting sample of x = 0.4 is discussed in terms of an anisotropic superconducting gap. The peculiar phase diagram of ThFeAsN1−xOx presents additional ingredients for understanding the superconducting mechanism in iron-based superconductors.

  6. Regime switching model for financial data: Empirical risk analysis

    NASA Astrophysics Data System (ADS)

    Salhi, Khaled; Deaconu, Madalina; Lejay, Antoine; Champagnat, Nicolas; Navet, Nicolas

    2016-11-01

    This paper constructs a regime switching model for univariate Value-at-Risk estimation. Extreme value theory (EVT) and hidden Markov models (HMM) are combined to estimate a hybrid model that takes volatility clustering into account. In the first stage, an HMM is used to classify data into crisis and steady periods, while in the second stage, EVT is applied to the previously classified data to eliminate the delay between regime switches and their detection. This new model is applied to prices of numerous stocks traded on NYSE Euronext Paris over the period 2001-2011. We focus on daily returns, for which calibration has to be done on a small dataset. The relative performance of the regime switching model is benchmarked against other well-known modeling techniques, such as stable-distribution, power-law, and GARCH models. The empirical results show that the regime switching model increases the predictive performance of financial forecasting according to the number of violations and tail-loss tests. This suggests that the regime switching model is a robust forecasting variant of the power-law model while remaining practical to implement for VaR measurement.
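
    The two-stage procedure can be illustrated with a minimal sketch of the EVT stage: fitting a generalized Pareto distribution (GPD) to threshold exceedances and reading off a Value-at-Risk estimate. The HMM classification stage is omitted, and the data and threshold choice here are synthetic assumptions, not the paper's calibration.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)
# Synthetic daily losses (heavy-tailed), standing in for negative stock returns
losses = rng.standard_t(df=3, size=2500) * 0.01

# Peaks-over-threshold: fit a GPD to exceedances above the 95% quantile
u = np.quantile(losses, 0.95)
exc = losses[losses > u] - u
xi, _, beta = genpareto.fit(exc, floc=0.0)

# Closed-form EVT estimate of VaR at level p from the fitted GPD tail
p = 0.99
n, n_u = len(losses), len(exc)
var_p = u + (beta / xi) * ((n / n_u * (1 - p)) ** (-xi) - 1.0)
print(f"99% VaR estimate: {var_p:.4f}")
```

    In the paper's hybrid setup, a fit of this kind would be performed separately on the observations assigned to each hidden regime.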

  7. Exact extreme-value statistics at mixed-order transitions.

    PubMed

    Bar, Amir; Majumdar, Satya N; Schehr, Grégory; Mukamel, David

    2016-05-01

    We study extreme-value statistics for spatially extended models exhibiting mixed-order phase transitions (MOT). These are phase transitions that exhibit features common to both first-order (discontinuity of the order parameter) and second-order (diverging correlation length) transitions. We consider here the truncated inverse-distance-squared Ising model, a prototypical model exhibiting MOT, and study analytically the extreme-value statistics of the domain lengths. The lengths of the domains are identically distributed random variables, except for the global constraint that their sum equals the total system size L. In addition, the number of such domains is also a fluctuating variable, not a fixed one. In the paramagnetic phase, we show that the distribution of the largest domain length l_{max} converges, in the large-L limit, to a Gumbel distribution. However, at the critical point (for a certain range of parameters) and in the ferromagnetic phase, we show that the fluctuations of l_{max} are governed by novel distributions, which we compute exactly. Our main analytical results are verified by numerical simulations.
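
    The paramagnetic-phase result can be illustrated with a hedged numerical sketch: the maximum of many exponentially distributed domain lengths, suitably shifted, approaches a Gumbel law. This treats the lengths as i.i.d. exponentials, ignoring the global sum constraint and the fluctuating domain number of the actual model.

```python
import numpy as np

rng = np.random.default_rng(1)
n, trials = 500, 4000
# In each trial, draw n i.i.d. exponential(1) lengths and keep the largest
m = rng.exponential(1.0, size=(trials, n)).max(axis=1)
# Classical EVT normalization for exponential tails: M_n - log(n) -> Gumbel
z = m - np.log(n)
# A standard Gumbel has mean gamma_E ~ 0.5772 and median -log(log 2) ~ 0.3665
print("sample mean:  ", z.mean())
print("sample median:", np.median(z))
```

    The sample mean and median land close to the standard Gumbel values, which is the convergence the abstract states for l_{max} in the large-L limit.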

  8. Estimating maximum instantaneous distortion from inlet total pressure rms and PSD measurements. [Root Mean Square and Power Spectral Density methods

    NASA Technical Reports Server (NTRS)

    Melick, H. C., Jr.; Ybarra, A. H.; Bencze, D. P.

    1975-01-01

    An inexpensive method is developed to determine the extreme values of instantaneous inlet distortion. This method also provides insight into the basic mechanics of unsteady inlet flow and the associated engine reaction. The analysis is based on fundamental fluid dynamics and statistical methods to provide an understanding of the turbulent inlet flow and quantitatively relate the rms level and power spectral density (PSD) function of the measured time variant total pressure fluctuations to the strength and size of the low pressure regions. The most probable extreme value of the instantaneous distortion is then synthesized from this information in conjunction with the steady state distortion. Results of the analysis show the extreme values to be dependent upon the steady state distortion, the measured turbulence rms level and PSD function, the time on point, and the engine response characteristics. Analytical projections of instantaneous distortion are presented and compared with data obtained by a conventional, highly time correlated, 40 probe instantaneous pressure measurement system.

  9. Small-scale studies of roasted ore waste reveal extreme ranges of stable mercury isotope signatures

    NASA Astrophysics Data System (ADS)

    Smith, Robin S.; Wiederhold, Jan G.; Jew, Adam D.; Brown, Gordon E.; Bourdon, Bernard; Kretzschmar, Ruben

    2014-07-01

    Active and closed Hg mines are significant sources of Hg contamination to the environment, mainly due to large volumes of mine waste material disposed of on-site. The application of Hg isotopes as source tracer from such contaminated sites requires knowledge of the Hg isotope signatures of different materials potentially released to the environment. Previous work has shown that calcine, the waste residue of the on-site ore roasting process, can exhibit distinct Hg isotope signatures compared with the primary ore. Here, we report results from a detailed small-scale study of Hg isotope variations in calcine collected from the closed New Idria Hg mine, San Benito County, CA, USA. The calcine samples exhibited different internal layering features which were investigated using optical microscopy, micro X-ray fluorescence, micro X-ray absorption spectroscopy (μ-XAS), and stable Hg isotope analysis. Significant Fe, S, and Hg concentration gradients were found across the different internal layers. Isotopic analyses revealed an extreme variation with pronounced isotopic gradients across the internal layered features. Overall, δ202Hg (±0.10‰, 2 SD) describing mass-dependent fractionation (MDF) ranged from -5.96 to 14.49‰, which is by far the largest range of δ202Hg values reported for any environmental sample. In addition, Δ199Hg (±0.06‰, 2 SD) describing mass-independent fractionation (MIF) ranged from -0.17 to 0.21‰. The μ-XAS analyses suggested that cinnabar and metacinnabar are the dominant Hg-bearing phases in the calcine. Our results demonstrate that the incomplete roasting of HgS ores in Hg mines can cause extreme mass-dependent Hg isotope fractionations at the scale of individual calcine pieces with enrichments in both light and heavy Hg isotopes relative to the primary ore signatures. 
This finding has important implications for the application of Hg isotopes as potential source tracers for Hg released to the environment from closed Hg mines and highlights the need for detailed source signature identification.

  10. A Non-Stationary Approach for Estimating Future Hydroclimatic Extremes Using Monte-Carlo Simulation

    NASA Astrophysics Data System (ADS)

    Byun, K.; Hamlet, A. F.

    2017-12-01

    There is substantial evidence that observed hydrologic extremes (e.g. floods, extreme stormwater events, and low flows) are changing and that climate change will continue to alter the probability distributions of hydrologic extremes over time. These non-stationary risks imply that conventional approaches for designing hydrologic infrastructure (or making other climate-sensitive decisions) based on retrospective analysis and stationary statistics will become increasingly problematic through time. To develop a framework for assessing risks in a non-stationary environment, our study develops a new approach using a super ensemble of simulated hydrologic extremes based on Monte Carlo (MC) methods. Specifically, using statistically downscaled future GCM projections from the CMIP5 archive (based on the Hybrid Delta (HD) method), we extract daily precipitation (P) and temperature (T) at 1/16 degree resolution based on a group of moving 30-yr windows within a given design lifespan (e.g. 10, 25, 50-yr). Using these T and P scenarios we simulate daily streamflow using the Variable Infiltration Capacity (VIC) model for each year of the design lifespan and fit a Generalized Extreme Value (GEV) probability distribution to the simulated annual extremes. MC experiments are then used to construct a random series of 10,000 realizations of the design lifespan, estimating annual extremes using the unique GEV parameters estimated for each individual year of the design lifespan. Our preliminary results for two watersheds in the Midwest show that there are considerable differences in the extreme values for a given percentile between the conventional MC and non-stationary MC approaches. Design standards based on our non-stationary approach are also directly dependent on the design lifespan of infrastructure, a sensitivity which is notably absent from conventional approaches based on retrospective analysis. 
The experimental approach can be applied to a wide range of hydroclimatic variables of interest.
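
    A minimal sketch of the Monte Carlo step, assuming hypothetical GEV parameters with a linear trend in the location parameter; the downscaling and VIC hydrologic modeling steps are omitted, and none of these numbers come from the study.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(2)
lifespan = 50                       # design lifespan in years
# Hypothetical GEV location drifting upward over the lifespan (units arbitrary)
locs = 100.0 + 0.5 * np.arange(lifespan)
scale, c = 15.0, -0.1               # scipy's c is the negative of the EVT shape xi

# 10,000 Monte Carlo realizations of the lifespan: one annual maximum per year,
# each year drawn from that year's own GEV parameters
sims = genextreme.rvs(c, loc=locs, scale=scale,
                      size=(10_000, lifespan), random_state=rng)
lifetime_max = sims.max(axis=1)

# Stationary comparison: parameters frozen at the first year's values
stat = genextreme.rvs(c, loc=locs[0], scale=scale,
                      size=(10_000, lifespan), random_state=rng)
print("non-stationary 99th pct:", round(np.percentile(lifetime_max, 99), 1))
print("stationary 99th pct:    ", round(np.percentile(stat.max(axis=1), 99), 1))
```

    With an upward drift in the location parameter, the non-stationary lifetime maxima sit above the stationary ones at any given percentile, which is the kind of difference the preliminary results describe.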

  11. Spatial variability of extreme rainfall at radar subpixel scale

    NASA Astrophysics Data System (ADS)

    Peleg, Nadav; Marra, Francesco; Fatichi, Simone; Paschalis, Athanasios; Molnar, Peter; Burlando, Paolo

    2018-01-01

    Extreme rainfall is quantified in engineering practice using Intensity-Duration-Frequency (IDF) curves that are traditionally derived from rain-gauges and, more recently, also from remote sensing instruments such as weather radars. These instruments measure rainfall at different spatial scales: a rain-gauge samples rainfall at the point scale, while weather radar averages precipitation over a relatively large area, generally around 1 km2. As such, a radar-derived IDF curve is representative of the mean areal rainfall over a given radar pixel and neglects the within-pixel rainfall variability. In this study, we quantify the subpixel variability of extreme rainfall by using a novel space-time rainfall generator (the STREAP model) that downscales in space the rainfall within a given radar pixel. The study was conducted using a unique radar data record (23 years) and a very dense rain-gauge network in the Eastern Mediterranean area (northern Israel). Radar-IDF curves, together with an ensemble of point-based IDF curves representing the radar subpixel extreme rainfall variability, were developed by fitting Generalized Extreme Value (GEV) distributions to annual rainfall maxima. It was found that the mean areal extreme rainfall derived from the radar underestimates most of the extreme values computed for point locations within the radar pixel (on average, ∼70%). The subpixel variability of rainfall extremes was found to increase with longer return periods and shorter durations (e.g. from a maximum variability of 10% for a return period of 2 years and a duration of 4 h to 30% for a 50-year return period and 20 min duration). For the longer return periods, a considerable enhancement of extreme rainfall variability was found when stochastic (natural) climate variability was taken into account. Bounding the range of the subpixel extreme rainfall derived from radar-IDF curves can be of major importance for different applications that require very local estimates of rainfall extremes.

  12. Capillary pressure-saturation relationships for porous granular materials: Pore morphology method vs. pore unit assembly method

    NASA Astrophysics Data System (ADS)

    Sweijen, Thomas; Aslannejad, Hamed; Hassanizadeh, S. Majid

    2017-09-01

    In studies of two-phase flow in complex porous media it is often desirable to have an estimation of the capillary pressure-saturation curve prior to measurements. Therefore, we compare in this research the capability of three pore-scale approaches in reproducing experimentally measured capillary pressure-saturation curves. To do so, we have generated 12 packings of spheres that are representative of four different glass-bead packings and eight different sand packings, for which we have found experimental data on the capillary pressure-saturation curve in the literature. In generating the packings, we matched the particle size distributions and porosity values of the granular materials. We have used three different pore-scale approaches for generating the capillary pressure-saturation curves of each packing: i) the Pore Unit Assembly (PUA) method in combination with the Mayer and Stowe-Princen (MS-P) approximation for estimating the entry pressures of pore throats, ii) the PUA method in combination with the hemisphere approximation, and iii) the Pore Morphology Method (PMM) in combination with the hemisphere approximation. The three approaches were also used to produce capillary pressure-saturation curves for the coating layer of paper, used in inkjet printing. Curves for such layers are extremely difficult to determine experimentally, due to their very small thickness and the presence of extremely small pores (less than one micrometer in size). Results indicate that the PMM and PUA-hemisphere method give similar capillary pressure-saturation curves, because both methods rely on a hemisphere to represent the air-water interface. The ability of the hemisphere approximation and the MS-P approximation to reproduce correct capillary pressure seems to depend on the type of particle size distribution, with the hemisphere approximation working well for narrowly distributed granular materials.
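
    The hemisphere approximation used in two of the three approaches amounts to the Young-Laplace entry pressure for a spherical interface in a cylindrical throat, P_entry = 2 sigma cos(theta) / r. A small sketch, assuming water at room temperature (sigma ~ 0.072 N/m) and perfect wetting, shows why sub-micrometer coating-layer pores make such curves hard to measure experimentally:

```python
import math

def hemisphere_entry_pressure(r_m, sigma=0.072, theta_deg=0.0):
    """Young-Laplace entry pressure (Pa) for a hemispherical air-water
    interface in a cylindrical throat of radius r_m (meters)."""
    return 2.0 * sigma * math.cos(math.radians(theta_deg)) / r_m

# A sub-micrometer coating-layer pore vs. a typical sand pore throat
for r in (0.5e-6, 50e-6):
    print(f"r = {r * 1e6:5.1f} um -> {hemisphere_entry_pressure(r) / 1000:.1f} kPa")
```

    A 0.5 micrometer pore requires two orders of magnitude more capillary pressure than a 50 micrometer sand throat, consistent with the difficulty of measuring coating-layer curves directly. The MS-P approximation replaces the hemisphere with a geometry-dependent entry criterion and is not sketched here.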

  13. Prediction of pH-Dependent Hydrophobic Profiles of Small Molecules from Miertus-Scrocco-Tomasi Continuum Solvation Calculations.

    PubMed

    Zamora, William J; Curutchet, Carles; Campanera, Josep M; Luque, F Javier

    2017-10-26

    Hydrophobicity is a key physicochemical descriptor used to understand the biological profile of (bio)organic compounds as well as a broad variety of biochemical, pharmacological, and toxicological processes. This property is estimated from the partition coefficient between aqueous and nonaqueous environments for neutral compounds (PN) and corrected for the pH-dependence of ionizable compounds as the distribution coefficient (D). Here, we have extended the parametrization of the Miertus-Scrocco-Tomasi continuum solvation model in n-octanol to nitrogen-containing heterocyclic compounds, as they are present in many biologically relevant molecules (e.g., purine and pyrimidine bases, amino acids, and drugs), to obtain accurate log PN values for these molecules. This refinement also includes solvation calculations for ionic species in n-octanol, with the aim of reproducing the experimental partition of ionic compounds (PI). Finally, the suitability of different formalisms to estimate the distribution coefficient over a wide range of pH values has been examined for a set of small acidic and basic compounds. The results indicate that, in general, the simple pH-dependence model of the ionizable compound in water suffices to predict the partitioning at or around physiological pH. However, at extreme pH values, where ionic species are predominant, more elaborate models provide a better prediction of the n-octanol/water distribution coefficient, especially for amino acid analogues. The results also show that these formalisms are better suited to reproduce the experimental pH-dependent distribution curves of log D for both acidic and basic compounds as well as for amino acid analogues.
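
    For a monoprotic acid, the simple pH-dependence model and its extension with ionic partitioning can be written down directly. The sketch below uses illustrative values (log PN = 2.0, pKa = 4.5, log PI = -1.0), not the paper's parametrization, to show how the two formalisms diverge at extreme pH.

```python
import math

def log_d_acid(log_p_n, pka, ph, log_p_i=None):
    """pH-dependent distribution coefficient for a monoprotic acid.

    With log_p_i=None, only the neutral species partitions (the simple
    model); otherwise the anion also partitions with coefficient PI.
    """
    frac = 10 ** (ph - pka)            # ionized/neutral ratio in water
    if log_p_i is None:
        return log_p_n - math.log10(1 + frac)
    d = (10 ** log_p_n + 10 ** log_p_i * frac) / (1 + frac)
    return math.log10(d)

for ph in (2.0, 7.4, 12.0):
    simple = log_d_acid(2.0, 4.5, ph)
    full = log_d_acid(2.0, 4.5, ph, log_p_i=-1.0)
    print(f"pH {ph}: simple log D = {simple:.2f}, with ionic term = {full:.2f}")
```

    Near physiological pH the two models are close, while at pH 12 the simple model keeps falling with pH but the model with an ionic term plateaus near log PI, matching the abstract's observation that more elaborate models matter at extreme pH.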

  14. Nucleation and dynamic rupture on weakly stressed faults sustained by thermal pressurization

    NASA Astrophysics Data System (ADS)

    Schmitt, Stuart V.; Segall, Paul; Dunham, Eric M.

    2015-11-01

    Earthquake nucleation requires that the shear stress τ locally reaches a fault's static strength, fσeff, the product of the friction coefficient and effective normal stress. Once rupture initiates, shear heating-induced thermal pressurization can sustain rupture at much lower τ/σeff ratios, a stress condition believed to be the case during most earthquakes. This requires that earthquakes nucleate at heterogeneities. We model nucleation and dynamic rupture on faults in a 2-D elastic medium with rate/state friction and thermal pressurization, subjected to globally low τ but with local stress heterogeneities that permit nucleation. We examine end-member cases of either high-τ or low-σeff heterogeneities. We find that thermal pressurization can sustain slip at τ/σeff values as low as 0.13, compared to static friction of ˜0.7. Background τ (and, to lesser extent, heterogeneity width) controls whether ruptures arrest or are sustained, with extremely low values resulting in arrest. For a small range of background τ, sustained slip is pulse-like. Cessation of slip in a pulse tail can result from either diffusive restrengthening of σeff or a wave-mediated stopping phase that follows the rupture tip. Slightly larger background τ leads to sustained crack-like rupture. Thermal pressurization is stronger at high-τ heterogeneities, resulting in a lower background τ threshold for sustained rupture and potentially larger arresting ruptures. High-stress events also initiate with higher moment rate, although this may be difficult to observe in nature. For arresting ruptures, stress drops and the dependence of fracture energy on mean slip are both consistent with values inferred for small earthquakes.

  15. Light-Dependent Sulfide Oxidation in the Anoxic Zone of the Chesapeake Bay Can Be Explained by Small Populations of Phototrophic Bacteria

    PubMed Central

    Bennett, Alexa J.; Hanson, Thomas E.; Luther, George W.

    2015-01-01

    Microbial sulfide oxidation in aquatic environments is an important ecosystem process, as sulfide is potently toxic to aerobic organisms. Sulfide oxidation in anoxic waters can prevent the efflux of sulfide to aerobic water masses, thus mitigating toxicity. The contribution of phototrophic sulfide-oxidizing bacteria to anaerobic sulfide oxidation in the Chesapeake Bay and the redox chemistry of the stratified water column were investigated in the summers of 2011 to 2014. In 2011 and 2013, phototrophic sulfide-oxidizing bacteria closely related to Prosthecochloris species of the phylum Chlorobi were cultivated from waters sampled at and below the oxic-anoxic interface, where measured light penetration was sufficient to support populations of low-light-adapted photosynthetic bacteria. In 2012, 2013, and 2014, light-dependent sulfide loss was observed in freshly collected water column samples. In these samples, extremely low light levels caused 2- to 10-fold increases in the sulfide uptake rate over the sulfide uptake rate under dark conditions. An enrichment, CB11, dominated by Prosthecochloris species, oxidized sulfide with a Ks value of 11 μM and a Vmax value of 51 μM min−1 (mg protein−1). Using these kinetic values with in situ sulfide concentrations and light fluxes, we calculated that a small population of Chlorobi similar to those in enrichment CB11 can account for the observed anaerobic light-dependent sulfide consumption activity in natural water samples. We conclude that Chlorobi play a far larger role in the Chesapeake Bay than currently appreciated. This result has potential implications for coastal anoxic waters and expanding oxygen-minimum zones as they begin to impinge on the photic zone. PMID:26296727
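
    The reported kinetic constants for enrichment CB11 plug into a standard Michaelis-Menten rate law. A small sketch of the uptake rate per mg protein follows; biomass scaling, light dependence, and in-situ protein concentrations are omitted.

```python
def sulfide_uptake_rate(s_um, vmax=51.0, ks=11.0):
    """Michaelis-Menten sulfide uptake rate in uM min^-1 (mg protein)^-1,
    using the Ks (uM) and Vmax reported for enrichment CB11."""
    return vmax * s_um / (ks + s_um)

# Rate at a few example sulfide concentrations (uM); at S = Ks the rate
# is exactly half of Vmax
for s in (1.0, 11.0, 100.0):
    print(f"S = {s:5.1f} uM -> {sulfide_uptake_rate(s):.1f} uM/min/mg protein")
```

    Scaling such rates by an estimated cell protein concentration is how the authors could compare a small Chlorobi population against the sulfide consumption observed in natural water samples.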

  16. Capturing spatial and temporal patterns of widespread, extreme flooding across Europe

    NASA Astrophysics Data System (ADS)

    Busby, Kathryn; Raven, Emma; Liu, Ye

    2013-04-01

    Statistical characterisation of physical hazards is an integral part of probabilistic catastrophe models used by the reinsurance industry to estimate losses from large-scale events. Extreme flood events are not restricted by country boundaries, which poses an issue for reinsurance companies as their exposures often extend beyond them. We discuss challenges and solutions that allow us to appropriately capture the spatial and temporal dependence of extreme hydrological events on a continental scale, which in turn enables us to generate an industry-standard stochastic event set for estimating financial losses for widespread flooding. In presenting our event set methodology, we focus on explaining how extreme value theory (EVT) and dependence modelling are used to account for short, inconsistent hydrological data from different countries, and how to make appropriate statistical decisions that best characterise the nature of flooding across Europe. The consistency of input data is of vital importance when identifying historical flood patterns. Collating data from numerous sources inherently causes inconsistencies, and we demonstrate our robust approach to assessing the data and refining it to compile a single consistent dataset. This dataset is then extrapolated using a parameterised EVT distribution to estimate extremes. Our method then captures the dependence of flood events across countries using an advanced multivariate extreme value model. Throughout, important statistical decisions are explored, including: (1) distribution choice; (2) the threshold to apply for extracting extreme data points; (3) a regional analysis; (4) the definition of a flood event, which is often linked with the reinsurance industry's hours clause; and (5) handling of missing values. Finally, having modelled the historical patterns of flooding across Europe, we sample from this model to generate our stochastic event set comprising thousands of events over thousands of years. 
    We then briefly illustrate how this is applied within a probabilistic model to estimate catastrophic loss curves used by the reinsurance industry.

  17. Impact of Persistent Degassing of Kilauea Volcano on Domestic Water Supplies

    NASA Astrophysics Data System (ADS)

    Thomas, D. M.; Macomber, T.

    2010-12-01

    In March 2008, a small explosive eruption in the summit crater of Kilauea Volcano marked the initiation of a new, persistently degassing vent at Kilauea. Emission rates of sulfur dioxide initially exceeded 1000 tons per day but declined to a longer-term rate of ~800 tons per day. Because of its location farther inland, the plume from this vent has generated more severe and more frequent adverse air-quality impacts on the surrounding and downwind communities than have the longer-lived degassing vents at Pu'u O'o. Because many residents of Hawaii Island derive their domestic water supply from roof catchment systems, deposition of aerosols produced in the volcanic plume could pose a significant health threat to the community. In order to quantify that risk, a program of screening of water catchment systems was undertaken in three communities: Lower Puna, upwind of the vent; Volcano Village, immediately adjacent to the Kilauea summit; and Hawaiian Ocean View Estates, located ~65 km downwind from the vent. An aggregate of 439 samples were collected and analyzed for pH and for fluoride, chloride, and sulfate ion concentrations; the median values and extrema are shown in Table I below. The pH values for the catchments proved not to be a good indicator of plume influence: the Volcano and Ocean View communities showed a bimodal distribution of values reflecting residents who managed their water systems (median pH = 6.2 and 7.2, respectively) and those who did not (median pH = 4.5 and 4.3, respectively); however, the lower extremes for pH gave values of 2.9 and 3.3, respectively. Chloride values were also variable due to the use of sodium hypochlorite to treat for biological contaminants. The median values for fluoride and sulfate show a progressive increase from the Puna catchments to Volcano and Ocean View.
    We believe that these values are consistent with the relative exposure of the communities to the volcanic plume: although the Volcano community is closer to the source, wind conditions conducive to exposure are infrequent, whereas the more distant Ocean View community is exposed to a more dilute plume but at a much higher frequency. Even though the median values are within accepted limits for drinking water, the extreme values observed are cause for concern: the pH values are well below those recommended for drinking water and the fluoride values are approaching WHO recommended drinking-water levels. With even modest increases in plume output or exposure times, some of the community catchment systems could accumulate sufficient acid or fluoride ion concentrations to pose a significant health threat if drinking water is drawn from those catchments. Continued monitoring of catchment water quality is recommended.
    Table I. Catchment Water Supply Analytical Results (concentrations in parts per million)

  18. Determining octanol-water partition coefficients for extremely hydrophobic chemicals by combining "slow stirring" and solid-phase microextraction.

    PubMed

    Jonker, Michiel T O

    2016-06-01

    Octanol-water partition coefficients (KOW) are widely used in fate and effects modeling of chemicals. Still, high-quality experimental KOW data are scarce, in particular for very hydrophobic chemicals. This hampers reliable assessments of several fate and effect parameters and the development and validation of new models. One reason for the limited availability of experimental values may relate to the challenging nature of KOW measurements. In the present study, KOW values for 13 polycyclic aromatic hydrocarbons were determined with the gold-standard "slow-stirring" method (log KOW 4.6-7.2). These values were then used as reference data for the development of an alternative method for measuring KOW. This approach combined slow stirring with equilibrium sampling of the extremely low aqueous concentrations by polydimethylsiloxane-coated solid-phase microextraction fibers, applying experimentally determined fiber-water partition coefficients. It resulted in KOW values matching the slow-stirring data very well. The method was therefore applied to a series of 17 moderately to extremely hydrophobic petrochemical compounds. The obtained KOW values spanned almost 6 orders of magnitude, with the highest value measuring 10^10.6. The present study demonstrates that the hydrophobicity domain within which experimental KOW measurements are possible can be extended with the help of solid-phase microextraction and that experimentally determined KOW values can exceed the proposed upper limit of 10^9. Environ Toxicol Chem 2016;35:1371-1377. © 2015 SETAC.
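
    The back-calculation underlying the SPME step can be sketched directly: the unmeasurably low aqueous concentration is recovered from the fiber concentration via the fiber-water partition coefficient, and KOW then follows from the octanol-phase concentration. All numbers below are illustrative assumptions, not data from the study.

```python
import math

def log_kow_from_spme(c_octanol, c_fiber, log_k_fw):
    """Estimate log KOW when the aqueous concentration is too low to measure:
    back-calculate it from the PDMS fiber concentration using an experimentally
    determined fiber-water partition coefficient Kfw = Cfiber / Cwater."""
    c_water = c_fiber / (10 ** log_k_fw)
    return math.log10(c_octanol / c_water)

# Hypothetical equilibrium concentrations (same units in each phase)
print("log KOW estimate:", round(log_kow_from_spme(500.0, 0.2, 6.8), 2))
```

    The key design point is that only the fiber and octanol phases need to be quantified analytically; the aqueous phase, which for a log KOW above 9 would be far below detection limits, enters only through the calibrated Kfw.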

  19. The evolution of extreme precipitations in high resolution scenarios over France

    NASA Astrophysics Data System (ADS)

    Colin, J.; Déqué, M.; Somot, S.

    2009-09-01

    Over the past years, improving the modelling of extreme events and their variability at climatic time scales has become one of the challenging issues raised in the regional climate research field. This study presents the results of a high-resolution (12 km) scenario run over France with the limited-area model (LAM) ALADIN-Climat, regarding the representation of extreme precipitation. The runs were conducted in the framework of the ANR-SCAMPEI national project on high-resolution scenarios over French mountains. As a first step, we attempt to quantify one of the uncertainties implied by the use of a LAM: the size of the area on which the model is run. In particular, we address the issue of whether a relatively small domain allows the model to create its own small-scale processes. Indeed, high-resolution scenarios cannot be run on large domains because of the computation time, so this preliminary question must be answered before producing and analyzing such scenarios. To do so, we worked in the framework of a "big brother" experiment. We performed a 23-year global simulation in present-day climate (1979-2001) with the ARPEGE-Climat GCM, at a resolution of approximately 50 km over Europe (stretched grid). This first simulation, named ARP50, constitutes the "big brother" reference of our experiment. It has been validated against the CRU climatology. We then filtered the short waves (up to 200 km) from ARP50 in order to obtain the equivalent of coarse-resolution lateral boundary conditions (LBC). We carried out three ALADIN-Climat simulations at 50 km resolution with these LBC, using different configurations of the model: * FRA50, run over a small domain (2000 x 2000 km, centered over France), * EUR50, run over a larger domain (5000 x 5000 km, also centered over France), * EUR50-SN, run over the large domain using spectral nudging.
    Considering that the ARPEGE-Climat and ALADIN-Climat models share the same physics and dynamics and that both the regional and global simulations were run at the same resolution, ARP50 can be regarded as a reference with which FRA50, EUR50 and EUR50-SN should each be compared. After an analysis of the differences between the regional simulations and ARP50 in annual and seasonal means, we focus on the representation of rainfall extremes, comparing two-dimensional fields of various indices inspired by STARDEX and quantile-quantile plots. The results show good agreement with the ARP50 reference for all three regional simulations, and little difference is found between them. This indicates that the use of small domains is not significantly detrimental to the modelling of extreme precipitation events. It also shows that the spectral nudging technique has no detrimental effect on extreme precipitation. Therefore, high-resolution scenarios performed on a relatively small domain, such as the ones run for SCAMPEI, can be regarded as good tools to explore the possible evolution of precipitation extremes in the future climate. Preliminary results on the response of precipitation extremes over South-East France are given.

  20. Work-related burns.

    PubMed

    Pruitt, Valerie M

    2006-01-01

    Work-related upper extremity burns often occur. The cause directs the course of action. Thermal burns should be assessed for system alterations, and depth of burn should be determined. Deep partial-thickness burns and more severe burns require a specialist evaluation. Chemical burns must be irrigated and the agent identified. Some chemical burns, such as those that involve phenols and metal fragments, require specific topical applications before water lavage. Hydrofluoric acid burns can cause life-threatening electrolyte abnormalities with a small, highly concentrated acid burn. The goal with any extremity burn is to provide the patient with a multidisciplinary team approach to achieve a functional, usable extremity.

  1. Controllable gaussian-qubit interface for extremal quantum state engineering.

    PubMed

    Adesso, Gerardo; Campbell, Steve; Illuminati, Fabrizio; Paternostro, Mauro

    2010-06-18

    We study state engineering through bilinear interactions between two remote qubits and two-mode gaussian light fields. The attainable two-qubit states span the entire physically allowed region in the entanglement-versus-global-purity plane. Two-mode gaussian states with maximal entanglement at fixed global and marginal entropies produce maximally entangled two-qubit states in the corresponding entropic diagram. We show that a small set of parameters characterizing extremally entangled two-mode gaussian states is sufficient to control the engineering of extremally entangled two-qubit states, which can be realized in realistic matter-light scenarios.

  2. Changes in seasonal streamflow extremes experienced in rivers of Northwestern South America (Colombia)

    NASA Astrophysics Data System (ADS)

    Pierini, J. O.; Restrepo, J. C.; Aguirre, J.; Bustamante, A. M.; Velásquez, G. J.

    2017-04-01

    A measure of the variability in seasonal extreme streamflow was estimated for the Colombian Caribbean coast, using monthly time series of freshwater discharge from ten watersheds. The aim was to detect modifications in the monthly streamflow distribution, seasonal trends, variance, and extreme monthly values. A 20-year moving window, shifted in successive 1-year steps, was applied to the monthly series to analyze the seasonal variability of streamflow. The seasonal windowed data were statistically fitted with the Gamma distribution function. Scale and shape parameters were computed using Maximum Likelihood Estimation (MLE) and the bootstrap method with 1000 resamples. A trend analysis was performed for each windowed series, allowing detection of the window with the maximum absolute trend values. Significant temporal shifts in the seasonal streamflow distribution and quantiles (QT) were obtained for different frequencies. Wet and dry extreme periods increased significantly in the last decades. This increase did not occur simultaneously throughout the region. Some locations exhibited continuous increases only at minimum QT.
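
    The fitting step can be sketched as a Gamma MLE with a percentile bootstrap over 1000 resamples, as described in the abstract. The streamflow series below is synthetic, standing in for one 20-year window of monthly discharge; it is not the Colombian gauge data.

```python
import numpy as np
from scipy.stats import gamma

rng = np.random.default_rng(3)
# Synthetic monthly streamflow for one 20-year window (240 values);
# true shape 2.0, true scale 50.0
flows = rng.gamma(shape=2.0, scale=50.0, size=240)

# Maximum-likelihood fit of the two-parameter Gamma (location fixed at zero)
a_hat, _, scale_hat = gamma.fit(flows, floc=0.0)

# Percentile bootstrap of the shape parameter over 1000 resamples
boot_shapes = np.empty(1000)
for i in range(1000):
    resample = rng.choice(flows, size=flows.size, replace=True)
    boot_shapes[i], _, _ = gamma.fit(resample, floc=0.0)
lo, hi = np.percentile(boot_shapes, [2.5, 97.5])
print(f"shape MLE = {a_hat:.2f}, 95% bootstrap CI = [{lo:.2f}, {hi:.2f}]")
```

    Repeating this fit over each shifted 20-year window, and tracking how the fitted parameters and derived quantiles move, is what yields the trends and quantile shifts the study reports.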

  3. Precipitation extremes and their relation to climatic indices in the Pacific Northwest USA

    NASA Astrophysics Data System (ADS)

    Zarekarizi, Mahkameh; Rana, Arun; Moradkhani, Hamid

    2018-06-01

    The influence of climate indices on precipitation extremes has received considerable attention in the literature. The current study evaluates precipitation-based extremes in the Columbia River Basin (CRB) in the Pacific Northwest USA. We first analyzed the precipitation-based extremes using statistically (ten GCMs) and dynamically downscaled (three GCMs) past and future climate projections. Seven precipitation-based indices that help inform about flood duration/intensity are used. These indices help in attaining first-hand information on spatial and temporal scales for different service sectors, including energy, agriculture and forestry. Evaluation of these indices is first performed for the historical period (1971-2000), followed by analysis of their relation to large-scale teleconnections. Further, we mapped these indices over the area to evaluate the spatial variation of past and future extremes in downscaled and observational data. The analysis shows that high values of extreme indices are clustered in either the western or northern parts of the basin for the historical period, whereas the northern part experiences a higher degree of change in the indices for the future scenario. The focus is also on evaluating the relation of these extreme indices to climate teleconnections in the historical period to understand their relationship with extremes over the CRB. Various climate indices are evaluated for their relationship using Principal Component Analysis (PCA) and Singular Value Decomposition (SVD). Results indicated that, out of the 13 climate teleconnections used in the study, the CRB is most strongly (and inversely) affected by the East Pacific (EP), Western Pacific (WP), East Atlantic (EA) and North Atlantic Oscillation (NAO) patterns.
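    The SVD step linking an extremes field to a set of climate indices (often called maximum covariance analysis) can be sketched as follows; the fields here are random placeholders, not the CRB indices or the 13 teleconnections of the study:

```python
import numpy as np

rng = np.random.default_rng(0)
nt, nx, ny = 40, 12, 5   # years, grid cells of an extreme index, climate indices

# Hypothetical standardized anomalies of a precipitation-extreme index (X)
# and of several teleconnection indices (Y); both are column-centered
X = rng.standard_normal((nt, nx))
Y = rng.standard_normal((nt, ny))
X -= X.mean(axis=0)
Y -= Y.mean(axis=0)

# Cross-covariance matrix between the two fields
C = X.T @ Y / (nt - 1)

# SVD: left/right singular vectors are the coupled spatial patterns
U, s, Vt = np.linalg.svd(C, full_matrices=False)

# Squared covariance fraction explained by the leading coupled mode
scf = s[0] ** 2 / np.sum(s ** 2)

# Expansion coefficients (time series) of the leading mode
a = X @ U[:, 0]
b = Y @ Vt[0]
corr = np.corrcoef(a, b)[0, 1]
print(f"leading mode SCF = {scf:.2f}, expansion-coefficient corr = {corr:.2f}")
```

    The sign of the correlation between the expansion coefficients is what distinguishes a direct from an inverse relationship of the kind reported for the EP, WP, EA and NAO patterns.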

  4. Combining wood anatomy and stable isotope variations in a 600-year multi-parameter climate reconstruction from Corsican black pine

    NASA Astrophysics Data System (ADS)

    Szymczak, Sonja; Hetzer, Timo; Bräuning, Achim; Joachimski, Michael M.; Leuschner, Hanns-Hubert; Kuhlemann, Joachim

    2014-10-01

    We present a new multi-parameter dataset from Corsican black pine growing on the island of Corsica in the Western Mediterranean basin covering the period AD 1410-2008. Wood parameters measured include tree-ring width, latewood width, earlywood width, cell lumen area, cell width, cell wall thickness, modelled wood density, as well as stable carbon and oxygen isotopes. We evaluated the relationships between different parameters and determined the value of the dataset for climate reconstructions. Correlation analyses revealed that carbon isotope ratios are influenced by cell parameters determining cell size, whereas oxygen isotope ratios are influenced by cell parameters determining the amount of transportable water in the xylem. A summer (June to August) precipitation reconstruction dating back to AD 1185 was established based on tree-ring width. No long-term trends or pronounced periods with extreme high/low precipitation are recorded in our reconstruction, indicating relatively stable moisture conditions over the entire time period. By comparing the precipitation reconstruction with a summer temperature reconstruction derived from the carbon isotope chronologies, we identified summers with extreme climate conditions, i.e. warm-dry, warm-wet, cold-dry and cold-wet. Extreme climate conditions during summer months were found to influence cell parameter characteristics. Cold-wet summers promote the production of broad latewood composed of wide and thin-walled tracheids, while warm-wet summers promote the production of latewood with small thick-walled cells. The presented dataset emphasizes the potential of multi-parameter wood analysis from one tree species over long time scales.

  5. Autoassociative memory retrieval and spontaneous activity bumps in small-world networks of integrate-and-fire neurons.

    PubMed

    Anishchenko, Anastasia; Treves, Alessandro

    2006-10-01

    The metric structure of synaptic connections is obviously an important factor in shaping the properties of neural networks, in particular the capacity to retrieve memories with which autoassociative nets operating via attractor dynamics are endowed. Qualitatively, some real networks in the brain could be characterized as 'small worlds', in the sense that the structure of their connections is intermediate between the extremes of an orderly geometric arrangement and of a geometry-independent random mesh. Small worlds can be defined more precisely in terms of their mean path length and clustering coefficient; but is such a precise description useful for a better understanding of how the type of connectivity affects memory retrieval? We have simulated an autoassociative memory network of integrate-and-fire units, positioned on a ring, with the network connectivity varied parametrically between ordered and random. We find that the network retrieves previously stored memory patterns when the connectivity is close to random, and displays the characteristic behavior of ordered nets (localized 'bumps' of activity) when the connectivity is close to ordered. Recent analytical work shows that these two behaviors can coexist in a network of simple threshold-linear units, leading to localized retrieval states. With our integrate-and-fire units, however, we find that they tend to be mutually exclusive behaviors. Moreover, the transition between the two occurs for values of the connectivity parameter which are not simply related to the notion of small worlds.
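    The two measures mentioned above (mean path length and clustering coefficient, with connectivity varied from ordered to random) can be illustrated with a ring-lattice rewiring sketch in the spirit of Watts-Strogatz; this is a bare graph model, not the integrate-and-fire network of the study:

```python
import random
from collections import deque

def watts_strogatz(n, k, p, seed=0):
    """Ring of n nodes, each linked to its k nearest neighbours on each side;
    every 'forward' edge is rewired to a random node with probability p."""
    rng = random.Random(seed)
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for j in range(1, k + 1):
            adj[i].add((i + j) % n)
            adj[(i + j) % n].add(i)
    for i in range(n):
        for j in range(1, k + 1):
            if rng.random() < p:
                old, new = (i + j) % n, rng.randrange(n)
                if old in adj[i] and new != i and new not in adj[i]:
                    adj[i].discard(old); adj[old].discard(i)
                    adj[i].add(new); adj[new].add(i)
    return adj

def mean_path_length(adj):
    """Average shortest-path length over reachable pairs (BFS from each node)."""
    total = count = 0
    for s in adj:
        dist, queue = {s: 0}, deque([s])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        total += sum(dist.values())
        count += len(dist) - 1
    return total / count

def clustering(adj):
    """Mean clustering coefficient: linked fraction of each node's neighbour pairs."""
    coeffs = []
    for u, nb in adj.items():
        nb = sorted(nb)
        if len(nb) < 2:
            coeffs.append(0.0)
            continue
        links = sum(1 for i, a in enumerate(nb) for b in nb[i + 1:] if b in adj[a])
        coeffs.append(2.0 * links / (len(nb) * (len(nb) - 1)))
    return sum(coeffs) / len(coeffs)

ordered = watts_strogatz(200, 4, 0.0)      # orderly geometric arrangement
small_world = watts_strogatz(200, 4, 0.1)  # intermediate regime
rand_net = watts_strogatz(200, 4, 1.0)     # geometry-independent random mesh
for name, g in [("ordered", ordered), ("small-world", small_world), ("random", rand_net)]:
    print(f"{name:12s} L = {mean_path_length(g):5.2f}  C = {clustering(g):.3f}")
```

    The small-world regime keeps the high clustering of the ordered ring while the mean path length collapses toward the random-mesh value.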

  6. Using damage data to estimate the risk from summer convective precipitation extremes

    NASA Astrophysics Data System (ADS)

    Schroeer, Katharina; Tye, Mari

    2017-04-01

    This study explores the potential added value of including loss and damage data to understand the risks from high-intensity, short-duration convective precipitation events. Projected increases in these events are expected even in regions that are likely to become more arid. Such high-intensity precipitation events can trigger hazardous flash floods, debris flows, and landslides that put people and local assets at risk. However, the assessment of local-scale precipitation extremes is hampered by its high spatial and temporal variability. Moreover, not only are extreme events rare, but such small-scale events are likely to be underreported where they do not coincide with the observation network. Reports of private loss and damage on a local administrative unit scale (LAU 2 level) are used to explore the relationship between observed rainfall events and damages reportedly related to hydro-meteorological processes. With 480 Austrian municipalities located within our south-eastern Alpine study region, the damage data are available on a much smaller scale than the available rainfall data. Precipitation is recorded daily at 185 gauges, and 52% of these stations additionally deliver sub-hourly rainfall information. To obtain physically plausible information, damage and rainfall data are grouped and analyzed on a catchment scale. The data indicate that rainfall intensities are higher on days that coincide with a damage claim than on days for which no damage was reported. However, approximately one third of the damages related to hydro-meteorological hazards were claimed on days for which no rainfall was recorded at any gauge in the respective catchment. Our goal is to assess whether these events indicate potential extreme events missing in the observations. Damage is always a consequence of an asset being exposed and susceptible to a hazardous process, and naturally, many factors influence whether an extreme rainfall event causes damage. 
We set up a statistical model to test whether the relationship between extreme rainfall events and damages is robust enough to estimate a potential underrepresentation of high intensity rainfall events in ungauged areas. Risk-relevant factors of socio-economic vulnerability, land cover, streamflow data, and weather type information are included to improve and sharpen the analysis. Within this study, we first aim to identify which rainfall events are most damaging and which factors affect the damages - seen as a proxy for the vulnerability - related to summer convective rainfall extremes in different catchment types. Secondly, we aim to detect potentially unreported damaging rainfall events and estimate the likelihood of such cases. We anticipate this damage perspective on summertime extreme convective precipitation to be beneficial for risk assessment, uncertainty management, and decision making with respect to weather and climate extremes on the regional-to-local level.

  7. The Electrocardiogram and Ischemic Heart Disease in Aircraft Pilots

    PubMed Central

    Manning, G. W.

    1965-01-01

    A review of the Royal Canadian Air Force electrocardiographic (ECG) program for selection of aircrew and detection of coronary disease in trained aircrew is presented. Twenty reported cases of death due to coronary disease in pilots while at the controls of an aircraft are reviewed. The use of routine electrocardiography in the selection of aircrew has proved to be of considerable value, particularly in view of the high cost of training. The ECG continues to be our most sensitive means of detecting asymptomatic coronary disease in aircrew personnel. It is apparent that from both the military and commercial standpoint the incidence of aircraft accidents due to coronary disease is extremely small. This is due in large part to the careful medical supervision of flying personnel including the routine use of electrocardiography in the assessment of flying fitness of aircrew. PMID:14323657

  8. Comparison of cavitation bubbles evolution in viscous media

    NASA Astrophysics Data System (ADS)

    Jasikova, Darina; Schovanec, Petr; Kotek, Michal; Kopecky, Vaclav

    2018-06-01

    Many types of liquids spanning a wide range of viscosity values have been tested to form a single cavitation bubble. The purpose of these experiments was to observe the behaviour of cavitation bubbles in media with different ranges of absorbance. Most methods are based on a spark to induce the superheat limit of the liquid; here we used the laser-induced breakdown (LIB) method. We describe the cavitation settings that affect the bubble size in media with different absorbance. The cavitation bubble was visualized with a 60 kHz high-speed camera using a shadowgraphy setup. Time development and bubble extinction were observed in various media; the bubble in silicone oil was extremely small, owing to the absorbance of the silicone oil.

  9. Modelling maximum river flow by using Bayesian Markov Chain Monte Carlo

    NASA Astrophysics Data System (ADS)

    Cheong, R. Y.; Gabda, D.

    2017-09-01

    Analysis of flood trends is vital since flooding threatens human life and livelihoods in financial, environmental and security terms. Annual maximum river flows in Sabah were fitted to the generalized extreme value (GEV) distribution. The maximum likelihood estimator (MLE) arises naturally when working with the GEV distribution. However, previous research has shown that the MLE provides unstable results, especially for small sample sizes. In this study, we used Bayesian Markov chain Monte Carlo (MCMC) based on the Metropolis-Hastings algorithm to estimate the GEV parameters. Bayesian MCMC is a statistical inference method that estimates parameters from the posterior distribution obtained via Bayes’ theorem. The Metropolis-Hastings algorithm is used to overcome the high-dimensional state space faced by plain Monte Carlo methods. This approach also accounts for more of the uncertainty in parameter estimation, yielding better predictions of maximum river flow in Sabah.
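    A minimal sketch of GEV parameter estimation with random-walk Metropolis-Hastings, on synthetic annual maxima rather than the Sabah records; the prior, step sizes, and chain length are illustrative assumptions:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Hypothetical annual maximum flows (m^3/s); note scipy's shape parameter c
# corresponds to -xi in the usual GEV convention
data = stats.genextreme.rvs(c=-0.1, loc=300, scale=60, size=40, random_state=rng)

def log_post(theta):
    """Log-posterior: GEV log-likelihood plus a weak prior on the shape."""
    mu, log_sigma, c = theta
    lp = stats.genextreme.logpdf(data, c=c, loc=mu, scale=np.exp(log_sigma)).sum()
    lp += stats.norm.logpdf(c, 0.0, 0.5)   # flat priors assumed on mu, log_sigma
    return lp

# Random-walk Metropolis-Hastings
theta = np.array([data.mean(), np.log(data.std()), 0.0])
step = np.array([5.0, 0.05, 0.05])
lp = log_post(theta)
samples = []
for _ in range(5000):
    prop = theta + step * rng.standard_normal(3)
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:   # accept/reject
        theta, lp = prop, lp_prop
    samples.append(theta.copy())

post = np.array(samples[1000:])               # discard burn-in
mu_hat, sigma_hat = post[:, 0].mean(), np.exp(post[:, 1]).mean()
print(f"posterior means: loc = {mu_hat:.0f}, scale = {sigma_hat:.0f}")
```

    With only 40 "years" of maxima, the posterior spread (not shown) directly expresses the parameter uncertainty that a point MLE would hide.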

  10. Specific net present value: an improved method for assessing modularisation costs in water services with growing demand.

    PubMed

    Maurer, M

    2009-05-01

    A specific net present value (SNPV) approach is introduced as a criterion in economic engineering decisions. The SNPV expresses average costs, including the growth rate and plant utilisation over the planning horizon, factors that are excluded from a standard net present value approach. The use of SNPV favours alternatives that are cheaper per service unit and are therefore closer to the costs that a user has to cover. It also shows that demand growth has a similar influence on average costs as an economy of scale. In a high growth scenario, solutions providing less idle capacity can have higher present value costs and still be economically favourable. The SNPV approach is applied in two examples to calculate acceptable additional costs for modularisation and comparable costs for on-site treatment (OST) as an extreme form of modularisation. The calculations show that: (i) the SNPV approach is suitable for quantifying the comparable costs of an OST system in a different scenario; (ii) small systems with projected high demand growth rates and high real interest rates are the most probable entry market for OST water treatment systems; (iii) operating expenses are currently the main economic weakness of membrane-based wastewater OST systems; and (iv) when high growth in demand is expected, up to 100% can be additionally invested in modularisation and staging the expansion of a treatment plant.
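    One illustrative reading of the SNPV idea (discounted costs per discounted service unit, so that demand growth and utilisation enter the average cost) can be sketched as follows; the capex figures, growth rate, and cost model are invented for illustration and the paper's exact formulation may differ:

```python
import numpy as np

def snpv(costs, service_units, rate):
    """Discounted costs divided by discounted service units
    (an illustrative reading of a specific net present value)."""
    t = np.arange(len(costs))
    disc = (1.0 + rate) ** -t
    return np.sum(costs * disc) / np.sum(service_units * disc)

years = np.arange(30)
demand = 1000.0 * 1.04 ** years          # assumed 4% annual demand growth

# Single large plant built up front vs. two staged modules (assumed capex)
capex_big = np.zeros(30); capex_big[0] = 120_000.0
capex_mod = np.zeros(30); capex_mod[0] = 70_000.0; capex_mod[15] = 70_000.0

opex = 0.02 * demand                     # variable operating cost per unit served
snpv_big = snpv(capex_big + opex, demand, 0.05)
snpv_mod = snpv(capex_mod + opex, demand, 0.05)
print(f"single plant: {snpv_big:.2f}/unit; staged modules: {snpv_mod:.2f}/unit")
```

    Under these assumed numbers the staged option carries less idle capacity early on, so its cost per service unit is lower even though its total undiscounted capex is higher, which is the qualitative point the abstract makes about modularisation under growing demand.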

  11. Response Identification in the Extremely Low Frequency Region of an Electret Condenser Microphone

    PubMed Central

    Jeng, Yih-Nen; Yang, Tzung-Ming; Lee, Shang-Yin

    2011-01-01

    This study shows that a small electret condenser microphone connected to a notebook or a personal computer (PC) has a prominent response in the extremely low frequency region in a specific environment. It confines most acoustic waves within a tiny air cell as follows. The air cell is constructed by drilling a small hole in a digital versatile disk (DVD) plate. A small speaker and an electret condenser microphone are attached to the two sides of the hole. Thus, the acoustic energy emitted by the speaker and reaching the microphone is strong enough to actuate the diaphragm of the latter. The experiments showed that, once small air leakages are allowed on the margin of the speaker, the microphone captured the signal in the range of 0.5 to 20 Hz. Moreover, by removing the plastic cover of the microphone and attaching the microphone head to the vibration surface, the low frequency signal can be effectively captured too. Two examples are included to show the convenience of applying the microphone to pick up the low frequency vibration information of practical systems. PMID:22346594

  13. Development of a miniature Stirling cryocooler for LWIR small satellite applications

    NASA Astrophysics Data System (ADS)

    Kirkconnell, C. S.; Hon, R. C.; Perella, M. D.; Crittenden, T. M.; Ghiaasiaan, S. M.

    2017-05-01

    The optimum small satellite (SmallSat) cryocooler system must be extremely compact and lightweight, achieved in this paper by operating a linear cryocooler at a frequency of approximately 300 Hz. Operation at this frequency, which is well in excess of the 100-150 Hz reported in recent papers on related efforts, requires an evolution beyond the traditional Oxford-class, flexure-based methods of setting the mechanical resonance. A novel approach that optimizes the electromagnetic design and the mechanical design together, to simultaneously achieve the required dynamic and thermodynamic performance, is described. Highly miniaturized pulse tube coolers are fundamentally ill-suited for the sub-80 K temperature range of interest, because the boundary layer losses inside the pulse tube become dominant at the associated very small pulse tube size; a moving-displacer Stirling cryocooler architecture is therefore used. Compact compressor mechanisms developed in a previous program are reused for this design and have been adapted to yield an extremely compact Stirling warm-end motor mechanism. Supporting thermodynamic and electromagnetic analysis results are reported.

  14. Invited Article: Visualisation of extreme value events in optical communications

    NASA Astrophysics Data System (ADS)

    Derevyanko, Stanislav; Redyuk, Alexey; Vergeles, Sergey; Turitsyn, Sergei

    2018-06-01

    Fluctuations of a temporal signal propagating along long-haul transoceanic-scale fiber links can be visualised in the spatio-temporal domain, drawing a visual analogy with ocean waves. Substantial overlapping of information symbols or the use of multi-frequency signals leads to strong statistical deviations of local peak power from the average signal power level. We consider long-haul optical communication systems from this unusual angle, treating them as physical systems with a huge number of random statistical events, including extreme value fluctuations that potentially might affect the quality of data transmission. We apply the well-established concepts of adaptive wavefront shaping used in imaging through turbid media to detect the detrimental phase-modulated sequences in optical communications that can cause extreme power outages (rare optical waves of ultra-high amplitude) during propagation down the ultra-long fiber line. We illustrate the concept by a theoretical analysis of rare events of high-intensity fluctuations (optical freak waves), taking as an example an increasingly popular optical frequency division multiplexing data format in which the problem of high peak-to-average power ratio is most acute. We also show how such short-lived extreme value spikes in the optical data streams are affected by nonlinearity and demonstrate the negative impact of such events on system performance.

  15. Hazard assessment for small torrent catchments - lessons learned

    NASA Astrophysics Data System (ADS)

    Eisl, Julia; Huebl, Johannes

    2013-04-01

    The documentation of extreme events, as part of the integral risk management cycle, is an important basis for the analysis and assessment of natural hazards. In July 2011 a flood event occurred in the Wölzer valley in the province of Styria, Austria. For this event at the "Wölzerbach" a detailed event documentation was carried out, gathering data on rainfall, runoff and sediment transport as well as information on damaged objects, infrastructure and crops from various sources. The flood was triggered by heavy rainfall in two tributaries of the Wölzer river. Although rain and discharge gauging stations exist for the Wölzer river, the torrents affected by the high-intensity rainfall are ungauged. For these ungauged torrent catchments the common methods for hazard assessment were evaluated. The back-calculation of the rainfall event was done using a new approach for precipitation analysis. In torrent catchments, small-scale, high-intensity rainfall events are mainly responsible for extreme events. Austria's weather surveillance radar is operated by the air traffic service "AustroControl". The routinely available dataset is pre-interpreted and shows divergences, especially for high-intensity rainfall. For this study the raw radar data were requested and analysed. The event was then back-calculated with different rainfall-runoff models, hydraulic models and sediment transport models to obtain calibration parameters for future use in hazard assessment for this region. Since woody debris often causes problems, different scenarios were simulated. The calibrated and plausible results from the runoff models were used for comparison with empirical approaches used in practice. 
For the planning of mitigation measures for the Schöttl torrent, one of the affected tributaries of the Wölzer river, a physical scale model was used in addition to the insights of the event analysis to design a check dam for sediment retention. Since the transport capacity of the lower reaches is limited, a balance had to be found between protection on the one hand and sediment connectivity to the Wölzer river on the other. The lessons learned kicked off discussions on future hazard assessment, especially concerning the use of rainfall data and design precipitation values for small torrent catchments. The comparison with empirical values also showed the need for differentiated concepts for hazard analysis. Therefore, recommendations for the use of spatial rainfall reduction factors as well as for the demarcation of hazard maps using different event scenarios are proposed.

  16. Characteristics and present trends of wave extremes in the Mediterranean Sea

    NASA Astrophysics Data System (ADS)

    Pino, Cosimo; Lionello, Piero; Galati, Maria Barbara

    2010-05-01

    Wind-generated surface waves are an important factor characterizing marine storminess and the marine environment. This contribution considers characteristics and trends of SWH (Significant Wave Height) extremes; both high and low extremes are analyzed, including dead-calm duration. The data analysis is based on a 44-year (1958-2001) simulation of the wave field in the Mediterranean Sea. The quality of the model simulation is controlled using satellite data. The results show the different characteristics of the different parts of the basin, with the variability being higher in the western areas (where the highest values are produced) than in the eastern areas of the basin (where absence of waves is a rare condition). In fact, the duration of both storms and dead-calm episodes is longer in the east than in the west of the Mediterranean. The African coast and the southern Ionian Sea are the areas where exceptional values of SWH are expected to occur in correspondence with exceptional meteorological events. Significant trends of storm characteristics are present only in sparse areas and suggest a decrease in both storm intensity and duration (a marginal increase of storm intensity is present in the center of the Mediterranean). The statistics of extremes and high SWH values are substantially steady during the second half of the 20th century. The influence of the large-scale teleconnection patterns (TlcP) known to be relevant for the Mediterranean climate on the intensity and spatial distribution of extreme SWH has been investigated. The analysis was focused on the monthly scale, analysing the variability of these links along the annual cycle. The TlcPs considered are the North Atlantic Oscillation, the East Atlantic/West Russian pattern and the Scandinavian pattern; their effect on the intensity and frequency of high/low SWH conditions is assessed. 
The results show that it is difficult to establish a dominant TlcP for SWH extremes, because each of the patterns considered is important for at least a few months of the year and none is important for the whole year. High extremes in winter and fall are more influenced by the TlcPs than extremes in other seasons, and more than low extremes.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ashouri, Hamed; Sorooshian, Soroosh; Hsu, Kuo-Lin

    This study evaluates the performance of NASA's Modern-Era Retrospective Analysis for Research and Applications (MERRA) precipitation product in reproducing the trend and distribution of extreme precipitation events. Utilizing extreme value theory, time-invariant and time-variant extreme value distributions are developed to model the trends and changes in the patterns of extreme precipitation events over the contiguous United States during 1979-2010. The Climate Prediction Center (CPC) U.S. Unified gridded observation data are used as the observational dataset. The CPC analysis shows that the eastern and western parts of the United States are experiencing positive and negative trends in annual maxima, respectively. The continental-scale patterns of change found in MERRA seem to reasonably mirror the observed patterns of change found in CPC. This was not necessarily expected, given the difficulty in constraining precipitation in reanalysis products. MERRA tends to overestimate the frequency at which the 99th percentile of precipitation is exceeded, because this threshold tends to be lower in MERRA, making it easier to exceed. This feature is dominant during the summer months. MERRA tends to reproduce the spatial patterns of the scale and location parameters of the generalized extreme value and generalized Pareto distributions. However, MERRA underestimates these parameters, particularly over the Gulf Coast states, leading to lower magnitudes in extreme precipitation events. Two issues in MERRA are identified: 1) MERRA shows a spurious negative trend in Nebraska and Kansas, which is most likely related to changes in the satellite observing system over time that have apparently affected the water cycle in the central United States, and 2) the patterns of positive trend over the Gulf Coast states and along the East Coast seem to be correlated with the tropical cyclones in these regions. 
The analysis of the trends in the seasonal precipitation extremes indicates that the hurricane and winter seasons contribute the most to these trend patterns in the southeastern United States. The increasing annual trend simulated by MERRA in the Gulf Coast region is due to an incorrect trend in winter precipitation extremes.
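    The peaks-over-threshold side of such an analysis (a generalized Pareto fit to exceedances above the 99th percentile, plus a return level) can be sketched as follows on synthetic daily precipitation; this is a generic illustration, not the MERRA/CPC pipeline itself:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Hypothetical 30 years of daily precipitation at one grid cell (mm),
# with many dry days mixed in
wet = rng.gamma(shape=0.6, scale=8.0, size=30 * 365)
precip = np.where(rng.random(wet.size) < 0.4, wet, 0.0)

# Peaks-over-threshold: exceedances above the 99th percentile
u = np.percentile(precip, 99)
exc = precip[precip > u] - u
rate = exc.size / precip.size            # exceedance rate per day

# Fit a generalized Pareto distribution to the exceedances (location fixed at 0)
xi, loc, sigma = stats.genpareto.fit(exc, floc=0)

# 50-year return level from the GPD tail model (Coles-style formula)
m = 50 * 365
rl = u + (sigma / xi) * ((m * rate) ** xi - 1)
print(f"u = {u:.1f} mm, xi = {xi:.2f}, 50-yr return level = {rl:.1f} mm")
```

    A reanalysis with a too-low 99th-percentile threshold, as the abstract describes for MERRA, would show an inflated exceedance rate in exactly this kind of calculation.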

  18. Entropy, extremality, euclidean variations, and the equations of motion

    NASA Astrophysics Data System (ADS)

    Dong, Xi; Lewkowycz, Aitor

    2018-01-01

    We study the Euclidean gravitational path integral computing the Rényi entropy and analyze its behavior under small variations. We argue that, in Einstein gravity, the extremality condition can be understood from the variational principle at the level of the action, without having to solve the equations of motion explicitly. This set-up is then generalized to arbitrary theories of gravity, where we show that the respective entanglement entropy functional needs to be extremized. We also extend this result to all orders in Newton's constant G_N, providing a derivation of quantum extremality. Understanding quantum extremality for mixtures of states provides a generalization of the dual of the boundary modular Hamiltonian, which is given by the bulk modular Hamiltonian plus the area operator, evaluated on the so-called modular extremal surface. This gives a bulk prescription for computing relative entropies to all orders in G_N. We also comment on how these ideas can be used to derive an integrated version of the equations of motion, linearized around arbitrary states.

  19. Cumulative hazard: The case of nuisance flooding

    NASA Astrophysics Data System (ADS)

    Moftakhari, Hamed R.; AghaKouchak, Amir; Sanders, Brett F.; Matthew, Richard A.

    2017-02-01

    The cumulative cost of frequent events (e.g., nuisance floods) over time may exceed the costs of the extreme but infrequent events for which societies typically prepare. Here we analyze the likelihood of exceedances above mean higher high water and the corresponding property value exposure for minor, major, and extreme coastal floods. Our results suggest that, in response to sea level rise, nuisance flooding (NF) could generate property value exposure comparable to, or larger than, extreme events. Determining whether (and when) low cost, nuisance incidents aggregate into high cost impacts and deciding when to invest in preventive measures are among the most difficult decisions for policymakers. It would be unfortunate if efforts to protect societies from extreme events (e.g., 0.01 annual probability) left them exposed to a cumulative hazard with enormous costs. We propose a Cumulative Hazard Index (CHI) as a tool for framing the future cumulative impact of low cost incidents relative to infrequent extreme events. CHI suggests that in New York, NY, Washington, DC, Miami, FL, San Francisco, CA, and Seattle, WA, a careful consideration of socioeconomic impacts of NF for prioritization is crucial for sustainable coastal flood risk management.

  20. Epidemiologic Evaluation of Measurement Data in the Presence of Detection Limits

    PubMed Central

    Lubin, Jay H.; Colt, Joanne S.; Camann, David; Davis, Scott; Cerhan, James R.; Severson, Richard K.; Bernstein, Leslie; Hartge, Patricia

    2004-01-01

    Quantitative measurements of environmental factors greatly improve the quality of epidemiologic studies but can pose challenges because of the presence of upper or lower detection limits or interfering compounds, which do not allow for precise measured values. We consider the regression of an environmental measurement (dependent variable) on several covariates (independent variables). Various strategies are commonly employed to impute values for interval-measured data, including assignment of one-half the detection limit to nondetected values or of “fill-in” values randomly selected from an appropriate distribution. On the basis of a limited simulation study, we found that the former approach can be biased unless the percentage of measurements below detection limits is small (5–10%). The fill-in approach generally produces unbiased parameter estimates but may produce biased variance estimates and thereby distort inference when 30% or more of the data are below detection limits. Truncated data methods (e.g., Tobit regression) and multiple imputation offer two unbiased approaches for analyzing measurement data with detection limits. If interest resides solely on regression parameters, then Tobit regression can be used. If individualized values for measurements below detection limits are needed for additional analysis, such as relative risk regression or graphical display, then multiple imputation produces unbiased estimates and nominal confidence intervals unless the proportion of missing data is extreme. We illustrate various approaches using measurements of pesticide residues in carpet dust in control subjects from a case–control study of non-Hodgkin lymphoma. PMID:15579415
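    The Tobit idea above (a censored-normal likelihood for measurements below a detection limit) can be contrasted with one-half-DL substitution in a small simulation; the regression model, detection limit, and error sizes are assumptions, not the carpet-dust data:

```python
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(1)
n = 2000

# Hypothetical regression of a measured level y on a covariate x, with a
# lower detection limit: true model y = 1.0 + 0.8 x + N(0, 1)
x = rng.standard_normal(n)
y = 1.0 + 0.8 * x + rng.normal(0.0, 1.0, n)
dl = 0.5
cens = y < dl          # nondetects: the estimator only knows "below DL"

def negloglik(theta):
    """Censored-normal (Tobit) negative log-likelihood."""
    b0, b1, log_s = theta
    s = np.exp(log_s)
    mu = b0 + b1 * x
    ll = np.where(cens,
                  stats.norm.logcdf((dl - mu) / s),   # P(y < DL) for nondetects
                  stats.norm.logpdf(y, mu, s))        # density for detects
    return -ll.sum()

res = optimize.minimize(negloglik, x0=[0.5, 0.5, 0.0], method="Nelder-Mead",
                        options={"maxiter": 5000, "fatol": 1e-8, "xatol": 1e-8})
b1_tobit = res.x[1]

# Naive alternative: substitute one-half the detection limit, then plain OLS
y_sub = np.where(cens, dl / 2.0, y)
b1_naive = np.polyfit(x, y_sub, 1)[0]
print(f"{cens.mean():.0%} nondetects; Tobit slope {b1_tobit:.2f}, "
      f"half-DL slope {b1_naive:.2f} (true 0.80)")
```

    With roughly a third of the observations censored, the half-DL substitution attenuates the slope while the censored likelihood recovers it, which is the abstract's point about substitution being safe only when the nondetect fraction is small.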

  1. On the precision of experimentally determined protein folding rates and φ-values

    PubMed Central

    De Los Rios, Miguel A.; Muralidhara, B.K.; Wildes, David; Sosnick, Tobin R.; Marqusee, Susan; Wittung-Stafshede, Pernilla; Plaxco, Kevin W.; Ruczinski, Ingo

    2006-01-01

    φ-Values, a relatively direct probe of transition-state structure, are an important benchmark in both experimental and theoretical studies of protein folding. Recently, however, significant controversy has emerged regarding the reliability with which φ-values can be determined experimentally: Because φ is a ratio of differences between experimental observables it is extremely sensitive to errors in those observations when the differences are small. Here we address this issue directly by performing blind, replicate measurements in three laboratories. By monitoring within- and between-laboratory variability, we have determined the precision with which folding rates and φ-values are measured using generally accepted laboratory practices and under conditions typical of our laboratories. We find that, unless the change in free energy associated with the probing mutation is quite large, the precision of φ-values is relatively poor when determined using rates extrapolated to the absence of denaturant. In contrast, when we employ rates estimated at nonzero denaturant concentrations or assume that the slopes of the chevron arms (mf and mu) are invariant upon mutation, the precision of our estimates of φ is significantly improved. Nevertheless, the reproducibility we thus obtain still compares poorly with the confidence intervals typically reported in the literature. This discrepancy appears to arise due to differences in how precision is calculated, the dependence of precision on the number of data points employed in defining a chevron, and interlaboratory sources of variability that may have been largely ignored in the prior literature. PMID:16501226
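    The sensitivity of φ to small free-energy differences can be illustrated with a Monte Carlo error propagation; the rates, stabilities, and error magnitudes below are hypothetical, not the study's measurements:

```python
import numpy as np

rng = np.random.default_rng(2)
RT = 0.593  # kcal/mol at 25 degrees C

# Hypothetical wild-type/mutant folding rates (s^-1) and stabilities (kcal/mol)
kf_wt, kf_mut = 100.0, 40.0
dG_wt, dG_mut = 5.0, 4.4     # ddG_eq = 0.6 kcal/mol: a small perturbation

def phi(kf_wt, kf_mut, dG_wt, dG_mut):
    ddG_ts = RT * np.log(kf_wt / kf_mut)   # change in the folding barrier
    return ddG_ts / (dG_wt - dG_mut)       # ratio of differences

phi0 = phi(kf_wt, kf_mut, dG_wt, dG_mut)

# Monte Carlo propagation of assumed measurement errors:
# ~10% on each rate, 0.2 kcal/mol on each stability
n = 100_000
phis = phi(kf_wt * rng.lognormal(0.0, 0.10, n),
           kf_mut * rng.lognormal(0.0, 0.10, n),
           dG_wt + rng.normal(0.0, 0.2, n),
           dG_mut + rng.normal(0.0, 0.2, n))
lo, hi = np.percentile(phis, [16, 84])
print(f"phi = {phi0:.2f}, 68% interval [{lo:.2f}, {hi:.2f}]")
```

    Because the denominator ΔΔG is small relative to its error, the interval on φ is wide, illustrating why φ-values from weakly destabilizing mutations are so imprecise.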

  2. Testing anthropic reasoning for the cosmological constant with a realistic galaxy formation model

    NASA Astrophysics Data System (ADS)

    Sudoh, Takahiro; Totani, Tomonori; Makiya, Ryu; Nagashima, Masahiro

    2017-01-01

    The anthropic principle is one of the possible explanations for the cosmological constant (Λ) problem. In previous studies, a dark halo mass threshold comparable to that of our Galaxy had to be assumed in galaxy formation to obtain a reasonably large probability of finding the observed small value, P(<Λobs), though stars are found in much smaller galaxies as well. Here we examine the anthropic argument by using a semi-analytic model of cosmological galaxy formation, which can reproduce many observations such as galaxy luminosity functions. We calculate the probability distribution of Λ by running the model code for a wide range of Λ, while other cosmological parameters and model parameters for baryonic processes of galaxy formation are kept constant. Assuming that the prior probability distribution is flat per unit Λ, and that the number of observers is proportional to stellar mass, we find P(<Λobs) = 6.7 per cent without introducing any galaxy mass threshold. We also investigate the effect of metallicity; we find P(<Λobs) = 9.0 per cent if observers exist only in galaxies whose metallicity is higher than the solar abundance. If the number of observers is proportional to metallicity, we find P(<Λobs) = 9.7 per cent. Since these probabilities are not extremely small, we conclude that the anthropic argument is a viable explanation, if the value of Λ observed in our Universe is determined by a probability distribution.
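    The probability quoted above has the general form of a flat prior in Λ weighted by the number of observers. The toy calculation below shows only the arithmetic of P(<Λobs); the exponential observer-weight function is entirely made up, standing in for the paper's semi-analytic galaxy formation model:

```python
import math

def observer_weight(lam):
    """Purely illustrative stand-in for the model's observer density
    (e.g. stellar mass), falling off as larger Lambda suppresses structure."""
    return math.exp(-lam / 10.0)

def p_below(lam_obs, lam_max=1000.0, n=100000):
    """P(<Lambda_obs) under a flat prior per unit Lambda, weighted by observers."""
    d = lam_max / n
    weights = [observer_weight((i + 0.5) * d) for i in range(n)]
    k = round(lam_obs / d)  # grid cells below Lambda_obs
    return sum(weights[:k]) / sum(weights)

p = p_below(1.0)
print(f"P(<Lambda_obs) = {p:.3f}")
```

    With this toy weight the integral comes out near 0.095; the paper's 6.7–9.7 per cent values come from the actual model, not from this stand-in.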

  3. Matter-wave solitons in nonlinear optical lattices

    NASA Astrophysics Data System (ADS)

    Sakaguchi, Hidetsugu; Malomed, Boris A.

    2005-10-01

    We introduce a dynamical model of a Bose-Einstein condensate based on the one-dimensional (1D) Gross-Pitaevskii equation (GPE) with a nonlinear optical lattice (NOL), which is represented by the cubic term whose coefficient is periodically modulated in the coordinate. The model describes a situation when the atomic scattering length is spatially modulated, via the optically controlled Feshbach resonance, in an optical lattice created by interference of two laser beams. Relatively narrow solitons supported by the NOL are predicted by means of the variational approximation (VA), and an averaging method is applied to broad solitons. A different feature is a minimum norm (number of atoms), N = Nmin, necessary for the existence of solitons. The VA predicts Nmin very accurately. Numerical results are chiefly presented for the NOL with the zero spatial average value of the nonlinearity coefficient. Solitons with values of the amplitude A larger than at N = Nmin are stable. Unstable solitons with smaller, but not too small, A rearrange themselves into persistent breathers. For still smaller A, the soliton slowly decays into radiation without forming a breather. Broad solitons with very small A are practically stable, as their decay is extremely slow. These broad solitons may freely move across the lattice, featuring quasielastic collisions. Narrow solitons, which are strongly pinned to the NOL, can easily form stable complexes. Finally, the weakly unstable low-amplitude solitons are stabilized if a cubic term with a constant coefficient, corresponding to weak attraction, is included in the GPE.

  4. An efficient spectral method for the simulation of dynamos in Cartesian geometry and its implementation on massively parallel computers

    NASA Astrophysics Data System (ADS)

    Stellmach, Stephan; Hansen, Ulrich

    2008-05-01

    Numerical simulations of the process of convection and magnetic field generation in planetary cores still fail to reach geophysically realistic control parameter values. Future progress in this field depends crucially on efficient numerical algorithms which are able to take advantage of the newest generation of parallel computers. Desirable features of simulation algorithms include (1) spectral accuracy, (2) an operation count per time step that is small and roughly proportional to the number of grid points, (3) memory requirements that scale linearly with resolution, (4) an implicit treatment of all linear terms including the Coriolis force, (5) the ability to treat all kinds of common boundary conditions, and (6) reasonable efficiency on massively parallel machines with tens of thousands of processors. So far, algorithms for fully self-consistent dynamo simulations in spherical shells do not achieve all these criteria simultaneously, resulting in strong restrictions on the possible resolutions. In this paper, we demonstrate that local dynamo models in which the process of convection and magnetic field generation is only simulated for a small part of a planetary core in Cartesian geometry can achieve the above goal. We propose an algorithm that fulfills the first five of the above criteria and demonstrate that a model implementation of our method on an IBM Blue Gene/L system scales impressively well for up to O(10^4) processors. This allows for numerical simulations at rather extreme parameter values.

  5. When personality and culture clash: the psychological distress of allocentrics in an individualist culture and idiocentrics in a collectivist culture.

    PubMed

    Caldwell-Harris, Catherine L; Ayçiçegi, Ayse

    2006-09-01

    Because humans need both autonomy and interdependence, persons with either an extreme collectivist orientation (allocentrics) or extreme individualist values (idiocentrics) may be at risk of exhibiting some features of psychopathology. Is an extreme personality style a risk factor primarily when it conflicts with the values of the surrounding society? Individualism-collectivism scenarios and a battery of clinical and personality scales were administered to nonclinical samples of college students in Boston and Istanbul. For students residing in a highly individualistic society (Boston), collectivism scores were positively correlated with depression, social anxiety, obsessive-compulsive disorder and dependent personality. Individualism scores, particularly horizontal individualism, were negatively correlated with these same scales. A different pattern was obtained for students residing in a collectivist culture, Istanbul. Here individualism (and especially horizontal individualism) was positively correlated with scales for paranoid, schizoid, narcissistic, borderline and antisocial personality disorder. Collectivism (particularly vertical collectivism) was associated with low reporting of symptoms on these scales. These results indicate that having a personality style which conflicts with the values of society is associated with psychiatric symptoms. Having an orientation inconsistent with societal values may thus be a risk factor for poor mental health.

  6. Projected Changes in Temperature and Precipitation Extremes over China as Measured by 50-yr Return Values and Periods Based on a CMIP5 Ensemble

    NASA Astrophysics Data System (ADS)

    Xu, Ying; Gao, Xuejie; Giorgi, Filippo; Zhou, Botao; Shi, Ying; Wu, Jie; Zhang, Yongxiang

    2018-04-01

    Future changes in the 50-yr return level for temperature and precipitation extremes over mainland China are investigated based on a CMIP5 multi-model ensemble for the RCP2.6, RCP4.5 and RCP8.5 scenarios. The following indices are analyzed: TXx and TNn (the annual maximum and minimum of daily maximum and minimum surface temperature), RX5day (the annual maximum consecutive 5-day precipitation) and CDD (maximum annual number of consecutive dry days). After first validating the model performance, future changes in the 50-yr return values and return periods for these indices are investigated along with the inter-model spread. Multi-model median changes show an increase in the 50-yr return values of TXx and a decrease for TNn. More specifically, by the end of the 21st century under RCP8.5, the present-day 50-yr return period of warm events is reduced to 1.2 yr, while extreme cold events over the country are projected to essentially disappear. A general increase in RX5day 50-yr return values is found in the future. By the end of the 21st century under RCP8.5, events of the present RX5day 50-yr return period are projected to reduce to < 10 yr over most of China. Changes in CDD-50 show a dipole pattern over China, with a decrease in the values and longer return periods in the north, and vice versa in the south. Our study also highlights the need for further improvements in the representation of extreme events in climate models to assess the future risks and engineering design related to large-scale infrastructure in China.
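    As a concrete illustration of a 50-yr return value, the sketch below fits a Gumbel distribution to a synthetic annual-maximum series by the method of moments. The location and scale parameters (30 and 5) and the fitting method are assumptions for illustration only; the CMIP5 study's own fitting procedure is not specified here:

```python
import math
import random
import statistics

random.seed(1)
EULER_GAMMA = 0.5772156649015329

# Hypothetical annual-maximum series drawn from a Gumbel distribution
# (a toy stand-in for e.g. an annual RX5day series).
annual_max = [30.0 - 5.0 * math.log(-math.log(random.random()))
              for _ in range(200)]

# Method-of-moments Gumbel fit: beta = s*sqrt(6)/pi, mu = mean - gamma*beta.
s = statistics.stdev(annual_max)
beta = s * math.sqrt(6.0) / math.pi
mu = statistics.mean(annual_max) - EULER_GAMMA * beta

def return_level(T):
    """Value exceeded on average once every T years:
    z_T = mu - beta * ln(-ln(1 - 1/T))."""
    return mu - beta * math.log(-math.log(1.0 - 1.0 / T))

print(f"50-yr return level: {return_level(50):.1f}")
```

    A shortening return period under warming means today's 50-yr return level is exceeded far more often in the projected climate, which is how the abstract's "reduced to 1.2 yr" statement should be read.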

  7. Extreme air-sea surface turbulent fluxes in mid latitudes - estimation, origins and mechanisms

    NASA Astrophysics Data System (ADS)

    Gulev, Sergey; Natalia, Tilinina

    2014-05-01

    Extreme turbulent heat fluxes in the North Atlantic and North Pacific mid latitudes were estimated from the modern era and first generation reanalyses (NCEP-DOE, ERA-Interim, MERRA, NCEP-CFSR, JRA-25) for the period from 1979 onwards. We used direct surface turbulent flux output as well as reanalysis state variables from which fluxes have been computed using the COARE-3 bulk algorithm. For estimation of extreme flux values we analyzed the surface flux probability density distribution, which was approximated by the Modified Fisher-Tippett (MFT) distribution. In all reanalyses extreme turbulent heat fluxes amount to 1500-2000 W/m2 (for the 99th percentile) and can exceed 2000 W/m2 for higher percentiles in the western boundary current extension (WBCE) regions. Different reanalyses show significantly different shapes of the MFT distribution, implying considerable differences in the estimates of extreme fluxes. The highest extreme turbulent latent heat fluxes are diagnosed in the NCEP-DOE, ERA-Interim and NCEP-CFSR reanalyses, with the smallest being in MERRA. These differences may not necessarily reflect the differences in mean values. Analysis shows that differences in the statistical properties of the state variables are the major source of differences in the shape of the PDF of fluxes and in the estimates of extreme fluxes, while the contribution of the computational schemes used in different reanalyses is minor. The strongest differences in the characteristics of probability distributions of surface fluxes and extreme surface flux values between different reanalyses are found in the WBCE regions and high latitudes. Next, we analyzed the mechanisms responsible for forming surface turbulent fluxes and their potential role in changes of the midlatitudinal heat balance.
Midlatitudinal cyclones were considered the major mechanism responsible for extreme turbulent fluxes, which typically occur during cold air outbreaks in the rear parts of cyclones when atmospheric conditions provide locally high winds and air-sea temperature gradients. For this purpose we linked characteristics of cyclone activity over the midlatitudinal oceans with the extreme surface turbulent heat fluxes. Cyclone tracks and parameters of the cyclone life cycle (deepening rates, propagation velocities, lifetime and clustering) were derived from the same reanalyses using a state of the art numerical tracking algorithm. The main questions addressed in this study are (i) through which mechanisms are extreme surface fluxes associated with cyclone activity? and (ii) which types of cyclones are responsible for forming extreme turbulent fluxes? Our analysis shows that extreme surface fluxes are typically associated not with cyclones themselves but rather with cyclone-anticyclone interaction zones. This implies that North Atlantic and North Pacific series of intense cyclones do not result in anomalous surface fluxes. Instead, extreme fluxes are most frequently associated with blocking situations, particularly with the intensification of the Siberian and North American Anticyclones providing cold-air outbreaks over WBC regions.

  8. A Framework to Understand Extreme Space Weather Event Probability.

    PubMed

    Jonas, Seth; Fronczyk, Kassandra; Pratt, Lucas M

    2018-03-12

    An extreme space weather event has the potential to disrupt or damage infrastructure systems and technologies that many societies rely on for economic and social well-being. Space weather events occur regularly, but extreme events are less frequent, with a small number of historical examples over the last 160 years. During the past decade, published works have (1) examined the physical characteristics of the extreme historical events and (2) discussed the probability or return rate of select extreme geomagnetic disturbances, including the 1859 Carrington event. Here we present initial findings on a unified framework approach to visualize space weather event probability, using a Bayesian model average, in the context of historical extreme events. We present disturbance storm time (Dst) probability (a proxy for geomagnetic disturbance intensity) across multiple return periods and discuss parameters of interest to policymakers and planners in the context of past extreme space weather events. We discuss the current state of these analyses, their utility to policymakers and planners, the current limitations when compared to other hazards, and several gaps that need to be filled to enhance space weather risk assessments. © 2018 Society for Risk Analysis.

  9. Analysis of the giant genomes of Fritillaria (Liliaceae) indicates that a lack of DNA removal characterizes extreme expansions in genome size.

    PubMed

    Kelly, Laura J; Renny-Byfield, Simon; Pellicer, Jaume; Macas, Jiří; Novák, Petr; Neumann, Pavel; Lysak, Martin A; Day, Peter D; Berger, Madeleine; Fay, Michael F; Nichols, Richard A; Leitch, Andrew R; Leitch, Ilia J

    2015-10-01

    Plants exhibit an extraordinary range of genome sizes, varying by > 2000-fold between the smallest and largest recorded values. In the absence of polyploidy, changes in the amount of repetitive DNA (transposable elements and tandem repeats) are primarily responsible for genome size differences between species. However, there is ongoing debate regarding the relative importance of amplification of repetitive DNA versus its deletion in governing genome size. Using data from 454 sequencing, we analysed the most repetitive fraction of some of the largest known genomes for diploid plant species, from members of Fritillaria. We revealed that genomic expansion has not resulted from the recent massive amplification of just a handful of repeat families, as shown in species with smaller genomes. Instead, the bulk of these immense genomes is composed of highly heterogeneous, relatively low-abundance repeat-derived DNA, supporting a scenario where amplified repeats continually accumulate due to infrequent DNA removal. Our results indicate that a lack of deletion and low turnover of repetitive DNA are major contributors to the evolution of extremely large genomes and show that their size cannot simply be accounted for by the activity of a small number of high-abundance repeat families. © 2015 The Authors. New Phytologist © 2015 New Phytologist Trust.

  10. Synthesis and properties of electrically conductive, ductile, extremely long (~50 μm) nanosheets of K(x)CoO2·yH2O.

    PubMed

    Aksit, Mahmut; Hoselton, Benjamin C; Kim, Ha Jun; Ha, Don-Hyung; Robinson, Richard D

    2013-09-25

    Extremely long, electrically conductive, ductile, free-standing nanosheets of water-stabilized KxCoO2·yH2O are synthesized using the sol-gel and electric-field induced kinetic-demixing (SGKD) process. The room temperature in-plane resistivity of the KxCoO2·yH2O nanosheets is less than ~4.7 mΩ·cm, which corresponds to one of the lowest resistivity values reported for metal oxide nanosheets. The synthesis produces tens of thousands of very high aspect ratio (50,000:50,000:1 = length/width/thickness), millimeter-length nanosheets stacked into a macro-scale pellet. Free-standing nanosheets up to ~50 μm long are readily delaminated from the stacked nanosheets. High-resolution transmission electron microscopy (HR-TEM) studies of the free-standing nanosheets indicate that the delaminated pieces consist of individual nanosheet crystals that are turbostratically stacked. X-ray diffraction (XRD) studies confirm that the nanosheets are stacked in perfect registry along their c-axis. Scanning electron microscopy (SEM) based statistical analysis shows that the average thickness of the nanosheets is ~13 nm. The nanosheets show ductility with a bending radius as small as ~5 nm.

  11. Selective DNA Pooling for Determination of Linkage between a Molecular Marker and a Quantitative Trait Locus

    PubMed Central

    Darvasi, A.; Soller, M.

    1994-01-01

    Selective genotyping is a method to reduce costs in marker-quantitative trait locus (QTL) linkage determination by genotyping only those individuals with extreme, and hence most informative, quantitative trait values. The DNA pooling strategy (termed "selective DNA pooling") takes this one step further by pooling DNA from the selected individuals at each of the two phenotypic extremes, and basing the test for linkage on marker allele frequencies as estimated from the pooled samples only. This can reduce genotyping costs of marker-QTL linkage determination by up to two orders of magnitude. Theoretical analysis of selective DNA pooling shows that for experiments involving backcross, F2 and half-sib designs, the power of selective DNA pooling for detecting genes with large effect can be the same as that obtained by individual selective genotyping. Power for detecting genes with small effect, however, was found to decrease strongly with increasing technical error in estimating allele frequencies in the pooled samples. The effect of technical error, however, can be markedly reduced by replication of technical procedures. It is also shown that a proportion selected of 0.1 at each tail will be appropriate for a wide range of experimental conditions. PMID:7896115
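    The test for linkage from pooled allele frequencies can be sketched as a two-proportion z-test between the two extreme pools. The pool sizes and frequencies below are hypothetical, and this sketch deliberately ignores the technical error of pooled frequency estimation that the abstract shows is critical for small-effect genes:

```python
import math

def pool_z(freq_high, freq_low, n_high, n_low):
    """Two-proportion z-statistic for the marker allele-frequency difference
    between the high and low phenotypic pools (n = alleles sampled per pool)."""
    p = (freq_high * n_high + freq_low * n_low) / (n_high + n_low)
    se = math.sqrt(p * (1.0 - p) * (1.0 / n_high + 1.0 / n_low))
    return (freq_high - freq_low) / se

# Hypothetical pools of 100 alleles each, estimated frequencies 0.6 vs 0.4:
z = pool_z(0.6, 0.4, 100, 100)
print(f"z = {z:.2f}")
```

    Here z is about 2.83, beyond the usual 1.96 threshold, so such a frequency difference between the extreme pools would suggest linkage.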

  12. Efficacy of electromyographic biofeedback compared with conventional physical therapy for upper-extremity function in patients following stroke: a research overview and meta-analysis.

    PubMed

    Moreland, J; Thomson, M A

    1994-06-01

    The purpose of this study was to examine the efficacy of electromyographic biofeedback compared with conventional physical therapy for improving upper-extremity function in patients following a stroke. A literature search was done for the years 1976 to 1992. The selection criteria included single-blinded randomized controlled trials. Study quality was assessed against nine criteria. For functional (disability index or stage of recovery) and impairment outcomes, meta-analyses were performed on odds ratios for improvement versus no improvement. Mann-Whitney U-test probability values were combined across studies. Six studies were selected, and outcome data were obtained for five studies. The common odds ratio was 2.2 for function and 1.1 for impairments in favor of biofeedback. The estimate of the number needed to treat to prevent a nonresponder was 11 for function and 22 for impairments. None of the meta-analyses were statistically significant. The results do not conclusively indicate superiority of either form of therapy. Although there is a chance of Type II error, the estimated size of the effect is small. Given this estimate of little or no difference, therapists need to consider cost, ease of application, and patient preference when selecting these therapies.
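    The link between an odds ratio and a number needed to treat can be made explicit. The 10% control-group improvement rate below is a hypothetical baseline, not a figure from the review:

```python
def risk_from_or(p_control, odds_ratio):
    """Convert a control-group risk and an odds ratio into a treated-group risk."""
    odds = odds_ratio * p_control / (1.0 - p_control)
    return odds / (1.0 + odds)

def nnt(p_control, odds_ratio):
    """Number needed to treat = 1 / absolute risk difference."""
    return 1.0 / (risk_from_or(p_control, odds_ratio) - p_control)

# With a hypothetical 10% improvement rate under conventional therapy,
# an odds ratio of 2.2 for function implies:
print(f"NNT = {nnt(0.10, 2.2):.1f}")
```

    With this assumed baseline, an odds ratio of 2.2 corresponds to an NNT of roughly 10, in the neighbourhood of the review's reported estimate of 11 for function; the exact NNT always depends on the baseline risk chosen.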

  13. Extremely hard amorphous-crystalline hybrid steel surface produced by deformation induced cementite amorphization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guo, Wei; Meng, Yifei; Zhang, Xie

    Amorphous and nanograined (NG) steels are two categories of strong steels. However, over the past decade, their application has been hindered by their limited plasticity, the addition of expensive alloying elements, and processing challenges associated with producing bulk materials. Here we report that the surface of a carburized Fe-Mn-Si martensitic steel with extremely low elemental alloying additions can be economically fabricated into an amorphous-nanocrystalline hybrid structure. Atom probe tomography and nanobeam diffraction of a hard-turned steel surface together with molecular dynamics (MD) simulations reveal that the original cementite surface structure experiences a size-dependent amorphization and phase transformation during heavy plastic deformation. MD simulations further show that the martensite-cementite interface serves as a nucleation site for cementite amorphization, and that cementite can become disordered if further strained when the cementite particles are relatively small. These graded structures exhibit a surface hardness of ~16.2 GPa, which exceeds the value of ~8.8 GPa for the original nanocrystalline martensitic steel and most nanocrystalline steels reported before. Finally, this practical and cost-efficient approach for producing a hard surface with retained bulk ductility and toughness can provide expanded opportunities for producing an amorphous-crystalline hybrid structure in steels and other alloy systems.

  14. Analysis of real-time vibration data

    USGS Publications Warehouse

    Safak, E.

    2005-01-01

    In recent years, a few structures have been instrumented to provide continuous vibration data in real time, recording not only large-amplitude motions generated by extreme loads, but also small-amplitude motions generated by ambient loads. The main objective in continuous recording is to track any changes in structural characteristics, and to detect damage after an extreme event, such as an earthquake or explosion. The Fourier-based spectral analysis methods have been the primary tool to analyze vibration data from structures. In general, such methods do not work well for real-time data, because real-time data are mainly composed of ambient vibrations with very low amplitudes and signal-to-noise ratios. The long duration, linearity, and the stationarity of ambient data, however, allow us to utilize statistical signal processing tools, which can compensate for the adverse effects of low amplitudes and high noise. The analysis of real-time data requires tools and techniques that can be applied in real-time; i.e., data are processed and analyzed while being acquired. This paper presents some of the basic tools and techniques for processing and analyzing real-time vibration data. The topics discussed include utilization of running time windows, tracking mean and mean-square values, filtering, system identification, and damage detection.
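    Tracking mean and mean-square values in real time, as mentioned above, can be done with exponentially weighted running estimates that are updated as each sample arrives. The smoothing factor and the synthetic signal below are arbitrary choices for illustration, not values from the paper:

```python
import math

class RunningStats:
    """Exponentially weighted running mean and mean-square tracker, one simple
    way to track these statistics in real time; alpha is an assumed
    smoothing factor (time constant of roughly 1/alpha samples)."""
    def __init__(self, alpha=0.01):
        self.alpha = alpha
        self.mean = 0.0
        self.mean_sq = 0.0

    def update(self, x):
        a = self.alpha
        self.mean += a * (x - self.mean)          # running mean
        self.mean_sq += a * (x * x - self.mean_sq)  # running mean square

# Feed a synthetic "ambient vibration": a 2.0 offset plus a unit-amplitude sine.
rs = RunningStats(alpha=0.01)
for i in range(20000):
    rs.update(2.0 + math.sin(0.1 * i))
print(f"mean ~ {rs.mean:.2f}, mean-square ~ {rs.mean_sq:.2f}")
```

    Each update is O(1), so the tracker keeps pace with data as they are acquired; a sustained shift in such running statistics between ambient records could serve as one crude indicator of a change in structural response.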

  15. Extremely hard amorphous-crystalline hybrid steel surface produced by deformation induced cementite amorphization

    DOE PAGES

    Guo, Wei; Meng, Yifei; Zhang, Xie; ...

    2018-04-11

    Amorphous and nanograined (NG) steels are two categories of strong steels. However, over the past decade, their application has been hindered by their limited plasticity, the addition of expensive alloying elements, and processing challenges associated with producing bulk materials. Here we report that the surface of a carburized Fe-Mn-Si martensitic steel with extremely low elemental alloying additions can be economically fabricated into an amorphous-nanocrystalline hybrid structure. Atom probe tomography and nanobeam diffraction of a hard-turned steel surface together with molecular dynamics (MD) simulations reveal that the original cementite surface structure experiences a size-dependent amorphization and phase transformation during heavy plastic deformation. MD simulations further show that the martensite-cementite interface serves as a nucleation site for cementite amorphization, and that cementite can become disordered if further strained when the cementite particles are relatively small. These graded structures exhibit a surface hardness of ~16.2 GPa, which exceeds the value of ~8.8 GPa for the original nanocrystalline martensitic steel and most nanocrystalline steels reported before. Finally, this practical and cost-efficient approach for producing a hard surface with retained bulk ductility and toughness can provide expanded opportunities for producing an amorphous-crystalline hybrid structure in steels and other alloy systems.

  16. Linearized stability of extreme black holes

    NASA Astrophysics Data System (ADS)

    Burko, Lior M.; Khanna, Gaurav

    2018-03-01

    Extreme black holes have been argued to be unstable, in the sense that under linearized gravitational perturbations of the extreme Kerr spacetime the Weyl scalar ψ4 blows up along their event horizons at very late advanced times. We show numerically, by solving the Teukolsky equation in 2+1D, that all algebraically independent curvature scalar polynomials approach limits that exist when advanced time along the event horizon approaches infinity. Therefore, the horizons of extreme black holes are stable against linearized gravitational perturbations. We argue that the divergence of ψ4 is a consequence of the choice of a fixed tetrad, and that in a suitable dynamical tetrad all Weyl scalars, including ψ4, approach their background extreme Kerr values. We make similar conclusions also for the case of scalar field perturbations of extreme Kerr.

  17. Detection of Extremes with AIRS and CrIS

    NASA Technical Reports Server (NTRS)

    Aumann, Hartmut H.; Manning, Evan M.; Behrangi, Ali

    2013-01-01

    Climate change is expected to be detected first as changes in extreme values rather than in mean values. The availability of data from two instruments in the same orbit, AIRS data for the past eleven years and AIRS and CrIS data from the past year, provides an opportunity to evaluate this using examples of climate relevance: desertification, seen as changes in hot extremes; severe storms, seen as changes in extremely cold clouds; and the warming of the polar zone. We use AIRS to establish trends for the 1%tile, the mean and the 99%tile brightness temperatures measured with the 900 cm^-1 channel from AIRS for the past 11 years. This channel is in the clearest part of the 11 micron atmospheric window. Substantial trends are seen for land and ocean, which in the case of the 1%tile (cold) extremes are related to the current shift of deep convection from ocean to land. Changes are also seen in the 99%tile for day tropical land, but their interpretation is at present unclear. We also see dramatic changes for the mean and 99%tile of the North Polar area. The trends are an order of magnitude larger than the instrument trend of about 3 mK/year. We use the statistical distribution from the past year derived from AIRS to evaluate the accuracy of continuing the trends established with AIRS with CrIS data. We minimize the concern about differences in the spectral response functions by limiting the analysis to the channel at 900 cm^-1. While the two instruments agree within 100 mK for the global day/night land/ocean mean values, there are significant differences when evaluating the 1% and 99%tiles. We see a consistent warm bias in the CrIS data relative to AIRS for the 1%tile (extremely cold, cloudy) data in the tropical zone, particularly for tropical land, but the bias is not day/night land/ocean consistent.
At this point the difference appears to be due to differences in the radiometric response of AIRS and CrIS to differences in the day/night land/ocean cloud types. Unless the effect can be mitigated by a future reprocessing of the CrIS data, it will significantly complicate the concatenation of the AIRS and CrIS data records for the continuation of trends in extreme values.

  18. A large, benign prostatic cyst presented with an extremely high serum prostate-specific antigen level.

    PubMed

    Chen, Han-Kuang; Pemberton, Richard

    2016-01-08

    We report a case of a patient who presented with an extremely high serum prostate-specific antigen (PSA) level and underwent radical prostatectomy for presumed prostate cancer. Surprisingly, the whole-mount prostatectomy specimen showed only small-volume, organ-confined prostate adenocarcinoma and a large, benign intraprostatic cyst, which was thought to be responsible for the PSA elevation. 2016 BMJ Publishing Group Ltd.

  19. Evidence of population resistance to extreme low flows in a fluvial-dependent fish species

    USGS Publications Warehouse

    Katz, Rachel A.; Freeman, Mary C.

    2015-01-01

    Extreme low streamflows are natural disturbances to aquatic populations. Species in naturally intermittent streams display adaptations that enhance persistence during extreme events; however, the fate of populations in perennial streams during unprecedented low-flow periods is not well understood. Biota requiring swift-flowing habitats may be especially vulnerable to flow reductions. We estimated the abundance and local survival of a native fluvial-dependent fish species (Etheostoma inscriptum) across 5 years encompassing historic low flows in a sixth-order southeastern USA perennial river. Based on capture-mark-recapture data, the study shoal may have acted as a refuge during severe drought, with increased young-of-the-year (YOY) recruitment and occasionally high adult immigration. Contrary to expectations, summer and autumn survival rates (30 days) were not strongly depressed during low-flow periods, despite 25%-80% reductions in monthly discharge. Instead, YOY survival increased with lower minimum discharge and in response to small rain events that increased low-flow variability. Age-1+ fish showed the opposite pattern, with survival decreasing in response to increasing low-flow variability. Results from this population dynamics study of a small fish in a perennial river suggest that fluvial-dependent species can be resistant to extreme flow reductions through enhanced YOY recruitment and high survival.

  20. Printing Proteins as Microarrays for High-Throughput Function Determination

    NASA Astrophysics Data System (ADS)

    MacBeath, Gavin; Schreiber, Stuart L.

    2000-09-01

    Systematic efforts are currently under way to construct defined sets of cloned genes for high-throughput expression and purification of recombinant proteins. To facilitate subsequent studies of protein function, we have developed miniaturized assays that accommodate extremely low sample volumes and enable the rapid, simultaneous processing of thousands of proteins. A high-precision robot designed to manufacture complementary DNA microarrays was used to spot proteins onto chemically derivatized glass slides at extremely high spatial densities. The proteins attached covalently to the slide surface yet retained their ability to interact specifically with other proteins, or with small molecules, in solution. Three applications for protein microarrays were demonstrated: screening for protein-protein interactions, identifying the substrates of protein kinases, and identifying the protein targets of small molecules.

  1. Forest bat population dynamics over 14 years at a climate refuge: Effects of timber harvesting and weather extremes

    PubMed Central

    Chidel, Mark; Law, Peter R.

    2018-01-01

    Long-term data are needed to explore the interaction of weather extremes with habitat alteration; in particular, can ‘refugia’ buffer population dynamics against climate change, and are they robust to disturbances such as timber harvesting? Because forest bats are good indicators of ecosystem health, we used 14 years (1999–2012) of mark-recapture data from a suite of small tree-hollow roosting bats to estimate survival, abundance and body condition in harvested and unharvested forest and over extreme El Niño and La Niña weather events in southeastern Australia. Trapping was replicated within an experimental forest, located in a climate refuge, with different timber harvesting treatments. We trapped foraging bats and banded 3043 with a 32% retrap rate. Mark-recapture analyses allowed for dependence of survival on time, species, sex, logging treatment and for transients. A large portion of the population remained resident, with a maximum time to recapture of nine years. The effect of logging history (unlogged vs 16–30 years post-logging regrowth) on apparent survival was minor and species specific, with no detectable effect for two species, a positive effect for one and a negative effect for the other. There was no effect of logging history on abundance or body condition for any of these species. Apparent survival of residents was not strongly influenced by weather variation (except for the smallest species), unlike previous studies outside of refugia. Despite annual variation in abundance and body condition across the 14 years of the study, no relationship with extreme weather was evident. The location of our study area in a climate refuge potentially buffered bat population dynamics from extreme weather. These results support the value of climate refugia in mitigating climate change impacts, though the lack of an external control highlights the need for further studies on the functioning of climate refugia.
Relatively stable population dynamics were not compromised by timber harvesting, suggesting ecologically sustainable harvesting may be compatible with climate refugia. PMID:29444115

  2. Maintaining Small Business Support in Times of Increased Army National Guard Utilization An Impending Crisis

    DTIC Science & Technology

    2006-02-13

business clientele, additional costs to hire a manager or replacement workers, complete loss of family income, and, in extreme situations, bankruptcy... (USAWC Strategy Research Project; report period 2005-2006)

  3. Evolution caused by extreme events.

    PubMed

    Grant, Peter R; Grant, B Rosemary; Huey, Raymond B; Johnson, Marc T J; Knoll, Andrew H; Schmitt, Johanna

    2017-06-19

Extreme events can be a major driver of evolutionary change over geological and contemporary timescales. Outstanding examples are evolutionary diversification following mass extinctions caused by extreme volcanism or asteroid impact. The evolution of organisms in contemporary time is typically viewed as a gradual and incremental process that results from genetic change, environmental perturbation or both. However, contemporary environments occasionally experience strong perturbations such as heat waves, floods, hurricanes, droughts and pest outbreaks. These extreme events set up strong selection pressures on organisms, and are small-scale analogues of the dramatic changes documented in the fossil record. Because extreme events are rare, almost by definition, they are difficult to study. So far, most attention has been given to their ecological rather than to their evolutionary consequences. We review several case studies of contemporary evolution in response to two types of extreme environmental perturbations, episodic (pulse) or prolonged (press). Evolution is most likely to occur when extreme events alter community composition. We encourage investigators to be prepared for evolutionary change in response to rare events during long-term field studies. This article is part of the themed issue 'Behavioural, ecological and evolutionary responses to extreme climatic events'. © 2017 The Author(s).

  4. Storms or cold fronts: what is really responsible for the extreme waves regime in the Colombian Caribbean coastal region?

    NASA Astrophysics Data System (ADS)

    Otero, L. J.; Ortiz-Royero, J. C.; Ruiz-Merchan, J. K.; Higgins, A. E.; Henriquez, S. A.

    2016-02-01

The aim of this study is to determine the contribution and importance of cold fronts and storms to extreme waves in different areas of the Colombian Caribbean, in an attempt to establish the extent of the flood threat to which these coastal populations are exposed. Furthermore, the study seeks to establish the actions for which coastal engineering constructions should be designed. In the calculation of maritime constructions, the most important parameter is the wave height. For this reason, it is necessary to establish the design wave height that a coastal engineering structure should withstand. This wave height varies according to the return period considered. The significant wave height values for the areas focused on in the study were calculated in accordance with Gumbel's extreme value methodology. The methodology was evaluated using data from the reanalysis of the spectral National Oceanic and Atmospheric Administration (NOAA) WAVEWATCH III® (WW3) model for 15 points along the 1600 km of the Colombian Caribbean coastline (continental and insular) between the years 1979 and 2009. The results demonstrated that the extreme waves caused by tropical cyclones and those caused by cold fronts have different effects along the Colombian Caribbean coast. Storms and hurricanes are of greater importance in the Guajira Peninsula (Alta Guajira). In the central area (consisting of Baja Guajira and the cities of Santa Marta, Barranquilla, and Cartagena), the strong impact of cold fronts on extreme waves is evident. However, in the southern region of the Colombian Caribbean coast (ranging from the Gulf of Morrosquillo to the Gulf of Urabá), the extreme values of wave heights are lower than in the previously mentioned regions, despite being dominated mainly by the passage of cold fronts. Extreme waves in the San Andrés and Providencia insular region present a different dynamic from that in the continental area due to their geographic location. The wave heights in the extreme regime are similar in magnitude to those found in Alta Guajira, but the extreme waves associated with the passage of cold fronts in this region have lower return periods than those associated with the hurricane season.
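
    The Gumbel return-level calculation described in this abstract can be sketched in a few lines of Python using `scipy.stats.gumbel_r`; the wave heights below are synthetic stand-ins for the WW3 reanalysis annual maxima, not the study's data:

```python
import numpy as np
from scipy.stats import gumbel_r

rng = np.random.default_rng(0)
# Hypothetical annual-maximum significant wave heights (m) for 31 years (1979-2009)
annual_max_hs = gumbel_r.rvs(loc=3.0, scale=0.5, size=31, random_state=rng)

# Fit the Gumbel distribution to the annual maxima by maximum likelihood
loc, scale = gumbel_r.fit(annual_max_hs)

def return_level(T):
    """Design wave height exceeded on average once every T years."""
    return gumbel_r.ppf(1.0 - 1.0 / T, loc=loc, scale=scale)

for T in (10, 50, 100):
    print(f"{T:>3}-yr return level: {return_level(T):.2f} m")
```

    The design height grows roughly logarithmically with the return period, which is why the choice of return period dominates the sizing of coastal structures.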

  5. Assessment of climate change downscaling and non-stationarity on the spatial pattern of a mangrove ecosystem in an arid coastal region of southern Iran

    NASA Astrophysics Data System (ADS)

    Etemadi, Halimeh; Samadi, S. Zahra; Sharifikia, Mohammad; Smoak, Joseph M.

    2016-10-01

Mangrove wetlands exist in the transition zone between terrestrial and marine environments and have remarkable ecological and socio-economic value. This study uses climate change downscaling to address the question of non-stationarity influences on mangrove variations (expansion and contraction) within an arid coastal region. Our two-step approach includes downscaling models and uncertainty assessment, followed by a non-stationary and trend procedure using the Extreme Value Analysis (extRemes code). The Long Ashton Research Station Weather Generator (LARS-WG) model, along with two different general circulation models (GCMs) (MIHR and HadCM3), was used to downscale climatic variables during current (1968-2011) and future (2011-2030, 2045-2065, and 2080-2099) periods. Parametric and non-parametric bootstrapping uncertainty tests demonstrated that the LARS-WG model skillfully downscaled climatic variables at the 95 % significance level. Downscaling results using the MIHR model show that minimum and maximum temperatures will increase in the future (2011-2030, 2045-2065, and 2080-2099) during winter and summer in a range of +4.21 and +4.7 °C, and +3.62 and +3.55 °C, respectively. HadCM3 analysis also revealed an increase in minimum (˜+3.03 °C) and maximum (˜+3.3 °C) temperatures during wet and dry seasons. In addition, we examined how much mangrove area has changed during the past decades and, thus, whether climate change non-stationarity impacts mangrove ecosystems. Our results using remote sensing techniques and the non-parametric Mann-Whitney two-sample test indicated a sharp decline in mangrove area during the 1972, 1987, and 1997 periods ( p value = 0.002). Non-stationary assessment using the generalized extreme value (GEV) distribution, including mangrove area as a covariate, further indicated that the null hypothesis of a stationary climate (no trend) should be rejected due to the very low p values for precipitation ( p value = 0.0027), minimum ( p value = 0.000000029) and maximum ( p value = 0.00016) temperatures. Based on the non-stationary analysis and an upward trend in downscaled temperature extremes, climate change may control mangrove development in the future.
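
    A non-stationarity test of the kind this abstract describes can be sketched as a likelihood-ratio comparison between a stationary GEV fit and one whose location parameter depends linearly on a covariate. The study used the R extRemes package; the minimal sketch below uses scipy instead, with synthetic annual maxima and time as the covariate (both are illustrative assumptions, not the study's data or code):

```python
import numpy as np
from scipy.stats import genextreme, chi2
from scipy.optimize import minimize

rng = np.random.default_rng(1)
n = 44  # hypothetical annual maxima, one per year 1968-2011
t = np.arange(n, dtype=float)
x = genextreme.rvs(c=-0.1, loc=30.0 + 0.05 * t, scale=1.0, random_state=rng)

# Stationary fit (note: scipy's shape c is the negative of the usual GEV xi)
c0, loc0, scale0 = genextreme.fit(x)
ll_stat = genextreme.logpdf(x, c0, loc0, scale0).sum()

# Non-stationary fit: location varies linearly with the covariate t
def nll(p):
    mu0, mu1, log_scale, c = p
    return -genextreme.logpdf(x, c, mu0 + mu1 * t, np.exp(log_scale)).sum()

res = minimize(nll, x0=[loc0, 0.0, np.log(scale0), c0], method="Nelder-Mead")
ll_nonstat = -res.fun

# Likelihood-ratio test: one extra parameter -> chi-square with 1 d.o.f.
lr = 2.0 * (ll_nonstat - ll_stat)
p_value = chi2.sf(lr, df=1)
print(f"trend in location: {res.x[1]:.3f} per year, LR p-value = {p_value:.4g}")
```

    A small p-value rejects the stationary (no-trend) null, which is the same logic behind the very low p-values reported for the downscaled temperature series.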

  6. Randolph AFB, San Antonio, Texas. Revised Uniform Summary of Surface Weather Observations (RUSSWO). Parts A-F.

    DTIC Science & Technology

    1982-02-08

is printed in any year-month block when the extreme value is based on an incomplete month (at least one day missing for the month). When a month has...means, standard deviations, and total number of valid observations for each month and annual (all months). An asterisk (*) is printed in each data block...becomes the extreme or monthly total in any of these tables it is printed as "TRACE." Values for means and standard

  7. Optical rogue-wave-like extreme value fluctuations in fiber Raman amplifiers.

    PubMed

    Hammani, Kamal; Finot, Christophe; Dudley, John M; Millot, Guy

    2008-10-13

    We report experimental observation and characterization of rogue wave-like extreme value statistics arising from pump-signal noise transfer in a fiber Raman amplifier. Specifically, by exploiting Raman amplification with an incoherent pump, the amplified signal is shown to develop a series of temporal intensity spikes whose peak power follows a power-law probability distribution. The results are interpreted using a numerical model of the Raman gain process using coupled nonlinear Schrödinger equations, and the numerical model predicts results in good agreement with experiment.

  8. Extreme value problems without calculus: a good link with geometry and elementary maths

    NASA Astrophysics Data System (ADS)

    Ganci, Salvatore

    2016-11-01

    Some classical examples of problem solving, where an extreme value condition is required, are here considered and/or revisited. The search for non-calculus solutions appears pedagogically useful and intriguing as shown through a rich literature. A teacher, who teaches both maths and physics, (as happens in Italian High schools) can find in these kinds of problems a mind stimulating exercise compared with the standard solution obtained by the differential calculus. A good link between the geometric and analytical explanations is so established.

  9. Random walkers with extreme value memory: modelling the peak-end rule

    NASA Astrophysics Data System (ADS)

    Harris, Rosemary J.

    2015-05-01

    Motivated by the psychological literature on the ‘peak-end rule’ for remembered experience, we perform an analysis within a random walk framework of a discrete choice model where agents’ future choices depend on the peak memory of their past experiences. In particular, we use this approach to investigate whether increased noise/disruption always leads to more switching between decisions. Here extreme value theory illuminates different classes of dynamics indicating that the long-time behaviour is dependent on the scale used for reflection; this could have implications, for example, in questionnaire design.

  10. Production of High-Value Nanoparticles via Biogenic Processes Using Aquacultural and Horticultural Food Waste

    PubMed Central

    Ghosh, Purabi R.; Fawcett, Derek; Sharma, Shashi B.; Poinern, Gerrard E. J.

    2017-01-01

The quantities of organic waste produced globally by aquaculture and horticulture are extremely large and offer an attractive renewable source of biomolecules and bioactive compounds. The availability of such large and diverse sources of waste materials creates a unique opportunity to develop new recycling and food waste utilisation strategies. The aim of this review is to report the current status of research in the emerging field of producing high-value nanoparticles from food waste. Eco-friendly biogenic processes are quite rapid, and are usually carried out at normal room temperature and pressure. These alternative clean technologies do not rely on the use of the toxic chemicals and solvents commonly associated with traditional nanoparticle manufacturing processes. The relatively small number of research articles in the field has been surveyed and evaluated. Among the diversity of waste types, promising candidates and their ability to produce various high-value nanoparticles are discussed. Experimental parameters, nanoparticle characteristics and potential uses of nanoparticles in pharmaceutical and biomedical applications are discussed. In spite of the advantages, there are a number of challenges, including nanoparticle reproducibility and understanding the formation mechanisms between different food waste products. Thus, there is considerable scope and opportunity for further research in this emerging field. PMID:28773212

  11. The multiple facets of Peto's paradox: a life-history model for the evolution of cancer suppression

    PubMed Central

    Brown, Joel S.; Cunningham, Jessica J.; Gatenby, Robert A.

    2015-01-01

Large animals should have higher lifetime probabilities of cancer than small animals because each cell division carries an attendant risk of mutating towards a tumour lineage. However, this is not observed—a (Peto's) paradox that suggests large and/or long-lived species have evolved effective cancer suppression mechanisms. Using the Euler–Lotka population model, we demonstrate the evolutionary value of cancer suppression as determined by the ‘cost’ (decreased fecundity) of suppression versus the ‘cost’ of cancer (reduced survivorship). Body size per se will not select for sufficient cancer suppression to explain the paradox. Rather, cancer suppression should be most extreme when the probability of non-cancer death decreases with age (e.g. alligators), maturation is delayed, fecundity rates are low and fecundity increases with age. Thus, the value of cancer suppression is predicted to be lowest in the vole (short lifespan, high fecundity) and highest in the naked mole rat (long lived with late female sexual maturity). The life history of pre-industrial humans likely selected for quite low levels of cancer suppression. In modern humans that live much longer, this level results in unusually high lifetime cancer risks. The model predicts a lifetime risk of 49% compared with the current empirical value of 43%. PMID:26056365

  12. Extreme climatic events change the dynamics and invasibility of semi-arid annual plant communities.

    PubMed

    Jiménez, Milagros A; Jaksic, Fabian M; Armesto, Juan J; Gaxiola, Aurora; Meserve, Peter L; Kelt, Douglas A; Gutiérrez, Julio R

    2011-12-01

    Extreme climatic events represent disturbances that change the availability of resources. We studied their effects on annual plant assemblages in a semi-arid ecosystem in north-central Chile. We analysed 130 years of precipitation data using generalised extreme-value distribution to determine extreme events, and multivariate techniques to analyse 20 years of plant cover data of 34 native and 11 exotic species. Extreme drought resets the dynamics of the system and renders it susceptible to invasion. On the other hand, by favouring native annuals, moderately wet events change species composition and allow the community to be resilient to extreme drought. The probability of extreme drought has doubled over the last 50 years. Therefore, investigations on the interaction of climate change and biological invasions are relevant to determine the potential for future effects on the dynamics of semi-arid annual plant communities. 2011 Blackwell Publishing Ltd/CNRS.

  13. An operational-oriented approach to the assessment of low probability seismic ground motions for critical infrastructures

    NASA Astrophysics Data System (ADS)

    Garcia-Fernandez, Mariano; Assatourians, Karen; Jimenez, Maria-Jose

    2018-01-01

Extreme natural hazard events have the potential to cause significant disruption to critical infrastructure (CI) networks. Among them, earthquakes represent a major threat as sudden-onset events with limited, if any, capability of forecast, and high damage potential. In recent years, the increased exposure of interdependent systems has heightened concern, motivating the need for a framework for the management of these increased hazards. The seismic performance level and resilience of existing non-nuclear CIs can be analyzed by identifying the ground motion input values leading to failure of selected key elements. Main interest focuses on the ground motions exceeding the original design values, which should correspond to low probability occurrence. A seismic hazard methodology has been specifically developed to consider low-probability ground motions affecting elongated CI networks. The approach is based on Monte Carlo simulation, which allows for building long-duration synthetic earthquake catalogs to derive low-probability amplitudes. This approach does not affect the mean hazard values and allows obtaining a representation of maximum amplitudes that follow a general extreme-value distribution. This facilitates the analysis of the occurrence of extremes, i.e., very low probability of exceedance from unlikely combinations, for the development of, e.g., stress tests, among other applications. Following this methodology, extreme ground-motion scenarios have been developed for selected combinations of modeling inputs including seismic activity models (source model and magnitude-recurrence relationship), ground motion prediction equations (GMPE), hazard levels, and fractiles of extreme ground motion. The different results provide an overview of the effects of different hazard modeling inputs on the generated extreme motion hazard scenarios. This approach to seismic hazard is at the core of the risk analysis procedure developed and applied to European CI transport networks within the framework of the European-funded INFRARISK project. Such an operational seismic hazard framework can be used to provide timely insight for informed risk-management and regulatory decisions on the required level of detail or on the adoption of measures, the cost of which can be balanced against the benefits of the measures in question.
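
    The Monte Carlo idea (a long synthetic catalog yields empirical low-probability amplitudes) can be illustrated with a toy model. The Gutenberg-Richter parameters and the attenuation relation below are invented stand-ins, not the project's actual source model or GMPE:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical source model: Poisson rate of M>=4.5 events, Gutenberg-Richter b-value
rate_per_year, b, m_min, m_max = 2.0, 1.0, 4.5, 7.5
beta = b * np.log(10.0)

def sample_magnitudes(n):
    # Doubly truncated exponential (truncated Gutenberg-Richter) magnitudes
    u = rng.random(n)
    return m_min - np.log(1.0 - u * (1.0 - np.exp(-beta * (m_max - m_min)))) / beta

def pga(m, r_km, sigma=0.6):
    # Toy attenuation relation (stand-in for a real GMPE) with lognormal scatter
    log_a = -1.5 + 0.5 * m - 1.3 * np.log(r_km + 10.0) + sigma * rng.standard_normal(m.size)
    return np.exp(log_a)

years = 100_000  # long synthetic catalog -> empirical low-probability amplitudes
n_events = rng.poisson(rate_per_year * years)
m = sample_magnitudes(n_events)
r = rng.uniform(5.0, 150.0, n_events)        # site-to-source distances, km
year_of = rng.integers(0, years, n_events)   # assign each event to a catalog year

a = pga(m, r)
annual_max = np.zeros(years)
np.maximum.at(annual_max, year_of, a)        # annual maximum amplitude per year

# Amplitude with 1e-4 annual exceedance probability (10,000-yr return period)
extreme = np.quantile(annual_max, 1.0 - 1e-4)
print(f"~10,000-yr PGA (toy units): {extreme:.3f} g")
```

    Because the catalog is much longer than the return period of interest, the extreme fractile is read directly from the empirical distribution of annual maxima, without affecting the mean hazard.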

  14. A Conservative Inverse Normal Test Procedure for Combining P-Values in Integrative Research.

    ERIC Educational Resources Information Center

    Saner, Hilary

    1994-01-01

    The use of p-values in combining results of studies often involves studies that are potentially aberrant. This paper proposes a combined test that permits trimming some of the extreme p-values. The trimmed statistic is based on an inverse cumulative normal transformation of the ordered p-values. (SLD)
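
    The trimming idea can be sketched as follows; this illustrates only the mechanics, since after trimming the statistic is no longer standard normal under the null, and the paper's conservative critical values (which we do not reproduce here) account for that:

```python
import numpy as np
from scipy.stats import norm

def trimmed_stouffer(pvalues, g=1):
    """Inverse-normal (Stouffer) combination after trimming the g smallest
    and g largest transformed p-values, to guard against aberrant studies."""
    z = np.sort(norm.ppf(np.asarray(pvalues, dtype=float)))
    z_trim = z[g:len(z) - g] if g > 0 else z
    stat = z_trim.sum() / np.sqrt(len(z_trim))
    # Naive normal reference only; a conservative test needs adjusted critical values
    return stat, norm.cdf(stat)

stat, p_comb = trimmed_stouffer([0.001, 0.02, 0.03, 0.04, 0.9], g=1)
print(f"trimmed statistic = {stat:.3f}, naive combined p = {p_comb:.4f}")
```

    Here the most extreme study (p = 0.001) and the discordant one (p = 0.9) are both dropped before combining, so neither can dominate the result.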

  15. Recent work on network application layer: MioNet, the virtual workplace for small businesses

    NASA Astrophysics Data System (ADS)

    Hesselink, Lambertus; Rizal, Dharmarus; Bjornson, Eric; Miller, Brian; Chan, Keith

    2005-11-01

Small businesses must be extremely efficient and smartly leverage their resources, suppliers, and partners to successfully compete with larger firms. A successful small business requires a set of companies with interlocking business relationships that are dynamic and needs-based. There has been no software solution that creates a secure and flexible way to efficiently connect the computer-based employees and partners of small businesses. In this invited paper, we discuss MioNet, a secure and powerful data management platform that may provide millions of small businesses with a virtual workplace and help them to succeed.

  16. Slice sampling technique in Bayesian extreme of gold price modelling

    NASA Astrophysics Data System (ADS)

    Rostami, Mohammad; Adam, Mohd Bakri; Ibrahim, Noor Akma; Yahya, Mohamed Hisham

    2013-09-01

In this paper, a simulation study of Bayesian extreme values using Markov chain Monte Carlo via the slice sampling algorithm is implemented. We compared the accuracy of slice sampling with other methods for a Gumbel model. This study revealed that the slice sampling algorithm offers more accurate and closer estimates with lower RMSE than the other methods. Finally, we employed this procedure to estimate the parameters of extreme Malaysian gold prices from 2000 to 2011.
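
    A univariate slice sampler is short enough to sketch directly. The example below samples the posterior of a Gumbel location parameter with the scale assumed known and a flat prior, using synthetic data rather than the gold-price series; the stepping-out/shrinkage scheme follows the standard formulation of the algorithm:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical data from a Gumbel(mu=2.0, beta=0.5) model (inverse-CDF sampling)
data = 2.0 - 0.5 * np.log(-np.log(rng.random(200)))

def log_post(mu, beta=0.5):
    # Gumbel log-likelihood with known scale, flat prior on mu
    z = (data - mu) / beta
    return np.sum(-z - np.exp(-z)) - len(data) * np.log(beta)

def slice_sample(logf, x0, n, w=1.0):
    """Univariate slice sampler with stepping-out and shrinkage."""
    xs, x = [], x0
    for _ in range(n):
        y = logf(x) + np.log(rng.random())   # auxiliary height defines the slice
        lo = x - w * rng.random()
        hi = lo + w
        while logf(lo) > y:                  # step out until outside the slice
            lo -= w
        while logf(hi) > y:
            hi += w
        while True:                          # shrink bracket until a point is accepted
            xp = rng.uniform(lo, hi)
            if logf(xp) > y:
                x = xp
                break
            if xp < x:
                lo = xp
            else:
                hi = xp
        xs.append(x)
    return np.array(xs)

samples = slice_sample(log_post, x0=float(np.mean(data)), n=2000)
posterior = samples[500:]  # discard burn-in
print(f"posterior mean of mu: {posterior.mean():.3f}")
```

    Unlike Metropolis-Hastings, slice sampling needs no proposal-scale tuning, which is part of its appeal for automated extreme-value fitting.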

  17. Characteristics of a 30-cm thruster operated with small hole accelerator grid ion optics

    NASA Technical Reports Server (NTRS)

    Vahrenkamp, R. P.

    1976-01-01

    Small hole accelerator grid ion optical systems have been tested as a possible means of improving 30-cm ion thruster performance. The effects of small hole grids on the critical aspects of thruster operation including discharge chamber performance, doubly-charged ion concentration, effluent beam characteristics, and plasma properties have been evaluated. In general, small hole accelerator grids are beneficial in improving thruster performance while maintaining low double ion ratios. However, extremely small accelerator aperture diameters tend to degrade beam divergence characteristics. A quantitative discussion of these advantages and disadvantages of small hole accelerator grids, as well as resulting variations in thruster operation characteristics, is presented.

  18. Review of the Need for a Large-scale Test Facility for Research on the Effects of Extreme Winds on Structures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    R. G. Little

    1999-03-01

The Idaho National Engineering and Environmental Laboratory (INEEL), through the US Department of Energy (DOE), has proposed that a large-scale wind test facility (LSWTF) be constructed to study, in full scale, the behavior of low-rise structures under simulated extreme wind conditions. To determine the need for, and potential benefits of, such a facility, the Idaho Operations Office of the DOE requested that the National Research Council (NRC) perform an independent assessment of the role and potential value of an LSWTF in the overall context of wind engineering research. The NRC established the Committee to Review the Need for a Large-scale Test Facility for Research on the Effects of Extreme Winds on Structures, under the auspices of the Board on Infrastructure and the Constructed Environment, to perform this assessment. This report conveys the results of the committee's deliberations as well as its findings and recommendations. Data developed at large scale would enhance the understanding of how structures, particularly light-frame structures, are affected by extreme winds (e.g., hurricanes, tornadoes, severe thunderstorms, and other events). With a large-scale wind test facility, full-sized structures, such as site-built or manufactured housing and small commercial or industrial buildings, could be tested under a range of wind conditions in a controlled, repeatable environment. At this time, the US has no facility specifically constructed for this purpose. During the course of this study, the committee was confronted by three difficult questions: (1) does the lack of a facility equate to a need for the facility? (2) is need alone sufficient justification for the construction of a facility? and (3) would the benefits derived from information produced in an LSWTF justify the costs of producing that information? The committee's evaluation of the need and justification for an LSWTF was shaped by these questions.

  19. Nitrogen cycling in an extreme hyperarid environment inferred from δ(15)N analyses of plants, soils and herbivore diet.

    PubMed

    Díaz, Francisca P; Frugone, Matías; Gutiérrez, Rodrigo A; Latorre, Claudio

    2016-03-09

    Climate controls on the nitrogen cycle are suggested by the negative correlation between precipitation and δ(15)N values across different ecosystems. For arid ecosystems this is unclear, as water limitation among other factors can confound this relationship. We measured herbivore feces, foliar and soil δ(15)N and δ(13)C values and chemically characterized soils (pH and elemental composition) along an elevational/climatic gradient in the Atacama Desert, northern Chile. Although very positive δ(15)N values span the entire gradient, soil δ(15)N values show a positive correlation with aridity as expected. In contrast, foliar δ(15)N values and herbivore feces show a hump-shaped relationship with elevation, suggesting that plants are using a different N source, possibly of biotic origin. Thus at the extreme limits of plant life, biotic interactions may be just as important as abiotic processes, such as climate in explaining ecosystem δ(15)N values.

  20. Nitrogen cycling in an extreme hyperarid environment inferred from δ15N analyses of plants, soils and herbivore diet

    NASA Astrophysics Data System (ADS)

    Díaz, Francisca P.; Frugone, Matías; Gutiérrez, Rodrigo A.; Latorre, Claudio

    2016-03-01

    Climate controls on the nitrogen cycle are suggested by the negative correlation between precipitation and δ15N values across different ecosystems. For arid ecosystems this is unclear, as water limitation among other factors can confound this relationship. We measured herbivore feces, foliar and soil δ15N and δ13C values and chemically characterized soils (pH and elemental composition) along an elevational/climatic gradient in the Atacama Desert, northern Chile. Although very positive δ15N values span the entire gradient, soil δ15N values show a positive correlation with aridity as expected. In contrast, foliar δ15N values and herbivore feces show a hump-shaped relationship with elevation, suggesting that plants are using a different N source, possibly of biotic origin. Thus at the extreme limits of plant life, biotic interactions may be just as important as abiotic processes, such as climate in explaining ecosystem δ15N values.
