Extreme-value dependence: An application to exchange rate markets
NASA Astrophysics Data System (ADS)
Fernandez, Viviana
2007-04-01
Extreme value theory (EVT) focuses on modeling the tail behavior of a loss distribution using only extreme values rather than the whole data set. For a sample of 10 countries with dirty/free float regimes, we investigate whether paired currencies exhibit a pattern of asymptotic dependence, that is, whether an extremely large appreciation or depreciation in the nominal exchange rate of one country might transmit to another. In general, after controlling for volatility clustering and inertia in returns, we do not find evidence of extreme-value dependence between paired exchange rates. However, for asymptotically independent paired returns, we find that tail dependency of exchange rates is stronger under large appreciations than under large depreciations.
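As an illustration of the kind of tail-dependence diagnostic this abstract describes, below is a minimal sketch of the empirical chi coefficient computed on rank-transformed series; the data, the threshold u=0.95, and the function name are hypothetical, not taken from the paper.

```python
import numpy as np

def chi_coefficient(x, y, u=0.95):
    """Empirical upper-tail dependence chi(u) = P(F_Y(Y) > u | F_X(X) > u).

    Values near 0 at high thresholds suggest asymptotic independence;
    values near 1 suggest strong extremal dependence between the series.
    """
    n = len(x)
    # Rank-transform each margin to approximately uniform scores.
    ux = np.argsort(np.argsort(x)) / (n + 1.0)
    uy = np.argsort(np.argsort(y)) / (n + 1.0)
    joint = np.mean((ux > u) & (uy > u))
    marg = np.mean(ux > u)
    return joint / marg if marg > 0 else 0.0

rng = np.random.default_rng(0)
# Independent "returns": chi should be near 0 at a high threshold.
a, b = rng.normal(size=5000), rng.normal(size=5000)
# Perfectly dependent "returns": chi is exactly 1.
c = a.copy()
print(chi_coefficient(a, b), chi_coefficient(a, c))
```

In practice one would first filter each series for volatility clustering (as the paper does) before rank-transforming the residuals.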
Extreme Value Analysis of hydro meteorological extremes in the ClimEx Large-Ensemble
NASA Astrophysics Data System (ADS)
Wood, R. R.; Martel, J. L.; Willkofer, F.; von Trentini, F.; Schmid, F. J.; Leduc, M.; Frigon, A.; Ludwig, R.
2017-12-01
Many studies show an increase in the magnitude and frequency of hydrological extreme events in the course of climate change. However, the contribution of natural variability to the magnitude and frequency of hydrological extreme events is not yet settled. A reliable estimate of extreme events is of great interest for water management and public safety. In the course of the ClimEx project (www.climex-project.org), a new single-model large ensemble was created by dynamically downscaling the CanESM2 large ensemble with the Canadian Regional Climate Model version 5 (CRCM5) for a European domain and a northeastern North American domain. The ClimEx 50-member large ensemble (CRCM5 driven by the CanESM2 large ensemble) makes a thorough analysis of natural variability in extreme events possible. Are current extreme value statistical methods able to account for natural variability? How large is the natural variability of, for example, a 1-in-100-year return period derived from a 50-member large ensemble for Europe and northeastern North America? We address these questions by applying various generalized extreme value (GEV) distributions to the ClimEx large ensemble. Return levels (5-, 10-, 20-, 30-, 60- and 100-year) based on time series of various lengths (20, 30, 50, 100 and 1500 years) are analyzed for the maximum one-day precipitation (RX1d), the maximum three-hourly precipitation (RX3h), and streamflow for selected catchments in Europe. The long combined time series of the ClimEx ensemble (7500 years) allows us to give a first reliable estimate of the magnitude and frequency of certain extreme events.
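The GEV-based return-level estimation described above can be sketched as follows, using synthetic annual maxima in place of the ClimEx data; note that `scipy.stats.genextreme` parameterizes the shape as c = -xi relative to the usual climate-science convention.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Synthetic "annual maximum precipitation" sample standing in for one
# ensemble member; a Gumbel parent is a reasonable toy choice.
annual_max = stats.gumbel_r.rvs(loc=40.0, scale=8.0, size=150, random_state=rng)

# Fit the three-parameter GEV by maximum likelihood.
c, loc, scale = stats.genextreme.fit(annual_max)

def return_level(T):
    """T-year return level = quantile with exceedance probability 1/T."""
    return stats.genextreme.ppf(1.0 - 1.0 / T, c, loc=loc, scale=scale)

rl100 = return_level(100)
print(f"shape={c:.3f}, 100-year return level={rl100:.1f}")
```

Repeating this fit across ensemble members (and across time-series lengths) gives the spread attributable to natural variability that the abstract asks about.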
NASA Astrophysics Data System (ADS)
Otto, F. E. L.; Mitchell, D.; Sippel, S.; Black, M. T.; Dittus, A. J.; Harrington, L. J.; Mohd Saleh, N. H.
2014-12-01
A shift in the distribution of socially relevant climate variables, such as daily minimum winter temperatures and daily precipitation extremes, has been attributed to anthropogenic climate change for various mid-latitude regions. However, while there are many process-based arguments also suggesting a change in the shape of these distributions, attribution studies demonstrating this have not yet been undertaken. Here we use a very large initial-condition ensemble of ~40,000 members simulating the European winter 2013/2014, run on the distributed computing infrastructure of the weather@home project. Two separate scenarios are used: 1. current climate conditions, and 2. a counterfactual scenario of a "world that might have been" without anthropogenic forcing. Focusing specifically on extreme events, we assess how the estimated parameters of the Generalized Extreme Value (GEV) distribution vary depending on variable type, sampling frequency (daily, monthly, …) and geographical region. We find that the location parameter changes for most variables but, depending on the region and variables, we also find significant changes in the scale and shape parameters. The very large ensemble furthermore allows us to assess whether such findings in the fitted GEV distributions are consistent with an empirical analysis of the model data, and whether the most extreme data still follow a known underlying distribution that in a small sample might otherwise be dismissed as an outlier. The ~40,000-member ensemble is simulated using 12 different SST patterns (1 'observed', and 11 best guesses of SSTs with no anthropogenic warming). The range in SSTs, along with the corresponding changes in the NAO and high-latitude blocking, informs on the dynamics governing some of these extreme events.
While strong teleconnection patterns are not found in this particular experiment, the high number of simulated extreme events allows for a more thorough analysis of the dynamics than has been performed before. Combining extreme value theory with very large ensemble simulations thus allows us to understand the dynamics of changes in extreme events in ways not possible with extreme value theory alone, and also shows in which cases statistics combined with smaller ensembles give results as valid as very large initial-condition ensembles.
Shim, Je-Myung; Kwon, Hae-Yeon; Kim, Ha-Roo; Kim, Bo-In; Jung, Ju-Hyeon
2013-12-01
[Purpose] The aim of this study was to assess the effect of Nordic pole walking on the electromyographic activities of upper extremity and lower extremity muscles. [Subjects and Methods] The subjects were randomly divided into two groups as follows: a group walking without Nordic poles (n=13) and a group walking with Nordic poles (n=13). EMG data were collected while the subjects walked on a treadmill for 30 minutes, measured from one heel strike to the next. [Results] Both the average and maximum values of upper extremity muscle activity increased in the group that used Nordic poles compared with the group that did not, and the differences were statistically significant. There was an increase in the average muscle activity of the latissimus dorsi that was not statistically significant, although the increase in its maximum value was statistically significant. The average and maximum values of lower extremity muscle activity did not differ greatly between the groups, and the differences were not statistically significant. [Conclusion] The use of Nordic poles increased muscle activity of the upper extremity compared with regular walking but did not affect the lower extremity.
Extreme Mean and Its Applications
NASA Technical Reports Server (NTRS)
Swaroop, R.; Brownlow, J. D.
1979-01-01
Extreme value statistics obtained from normally distributed data are considered. An extreme mean is defined as the mean of a p-th probability truncated normal distribution. An unbiased estimate of this extreme mean and its large-sample distribution are derived. The distribution of this estimate, even for very large samples, is found to be nonnormal. Further, as the sample size increases, the variance of the unbiased estimate converges to the Cramer-Rao lower bound. The computer program used to obtain the density and distribution functions of the standardized unbiased estimate, and the confidence intervals of the extreme mean for any data, is included for ready application. An example demonstrates the usefulness of the extreme mean.
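For the standard upper-tail truncated-normal mean (which may differ in detail from the paper's exact definition), a short sketch:

```python
from scipy import stats

def extreme_mean(p, mu=0.0, sigma=1.0):
    """Mean of a normal distribution truncated to its upper (1 - p) tail,
    i.e. E[X | X > z_p] where z_p is the p-th quantile.  For the standard
    normal this is the inverse Mills ratio phi(z_p) / (1 - p)."""
    z = stats.norm.ppf(p)
    return mu + sigma * stats.norm.pdf(z) / (1.0 - p)

# Example: mean of the top 5% of a standard normal (about 2.06).
print(extreme_mean(0.95))
```

The closed form makes it easy to check a sample-based estimate: average the observations above the empirical p-quantile and compare.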
Dynamical systems proxies of atmospheric predictability and mid-latitude extremes
NASA Astrophysics Data System (ADS)
Messori, Gabriele; Faranda, Davide; Caballero, Rodrigo; Yiou, Pascal
2017-04-01
Extreme weather occurrences carry enormous social and economic costs and routinely garner widespread scientific and media coverage. Many extremes (e.g., storms, heatwaves, cold spells, heavy precipitation) are tied to specific patterns of midlatitude atmospheric circulation. The ability to identify these patterns and use them to enhance the predictability of the extremes is therefore of crucial societal and economic value. We propose a novel predictability pathway for extreme events, building upon recent advances in dynamical systems theory. We use two simple dynamical systems metrics, local dimension and persistence, to identify sets of similar large-scale atmospheric flow patterns that present a coherent temporal evolution. When these patterns correspond to weather extremes, they therefore afford particularly good forward predictability. We specifically test this technique on European winter temperatures, whose variability largely depends on the atmospheric circulation in the North Atlantic region. We find that our dynamical systems approach provides predictability of large-scale temperature extremes up to one week in advance.
Causes of Glacier Melt Extremes in the Alps Since 1949
NASA Astrophysics Data System (ADS)
Thibert, E.; Dkengne Sielenou, P.; Vionnet, V.; Eckert, N.; Vincent, C.
2018-01-01
Recent record-breaking glacier melt values are attributable to peculiar extreme events and long-term warming trends that shift averages upward. Analyzing one of the world's longest mass balance series with extreme value statistics, we show that detrending melt anomalies makes it possible to disentangle these effects, leading to a fairer evaluation of the return period of melt extreme values such as 2003, and to characterize them by a more realistic bounded behavior. Using surface energy balance simulations, we show that three independent drivers control melt: global radiation, latent heat, and the amount of snow at the beginning of the melting season. Extremes are governed by large deviations in global radiation combined with sensible heat. Long-term trends are driven by the lengthening of melt duration due to earlier and longer-lasting melting of ice along with melt intensification caused by trends in long-wave irradiance and latent heat due to higher air moisture.
A Generalized Framework for Non-Stationary Extreme Value Analysis
NASA Astrophysics Data System (ADS)
Ragno, E.; Cheng, L.; Sadegh, M.; AghaKouchak, A.
2017-12-01
Empirical trends in climate variables, including precipitation, temperature, and snow-water equivalent at regional to continental scales, are evidence of changes in climate over time. Evolving climate conditions and human activity-related factors such as urbanization and population growth can exert further changes in weather and climate extremes. As a result, the scientific community faces an increasing demand for updated appraisals of time-varying climate extremes. The purpose of this study is to offer a robust and flexible statistical tool for non-stationary extreme value analysis that can better characterize the severity and likelihood of extreme climatic variables, which is critical to ensuring a more resilient environment in a changing climate. Following the positive feedback on the first version of the Non-Stationary Extreme Value Analysis (NEVA) toolbox by Cheng et al. (2014), we present an improved version, NEVA2.0. The upgraded version builds upon a newly developed hybrid-evolution Markov Chain Monte Carlo (MCMC) approach for numerical parameter estimation and uncertainty assessment. This addition leads to more robust uncertainty estimates of return levels, return periods, and risks of climatic extremes under both stationary and non-stationary assumptions. Moreover, NEVA2.0 is flexible in incorporating any user-specified covariate beyond the default time covariate (e.g., CO2 emissions, large-scale climatic oscillation patterns). This new feature allows users to examine non-stationarity of extremes induced by physical conditions that underlie the extreme events (e.g., antecedent soil moisture deficit, large-scale climatic teleconnections, urbanization). In addition, the new version offers an option to generate stationary and/or non-stationary rainfall Intensity-Duration-Frequency (IDF) curves that are widely used for risk assessment and infrastructure design.
Finally, a Graphical User Interface (GUI) of the package is provided, making NEVA accessible to a broader audience.
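A minimal sketch of the core of non-stationary GEV fitting of the kind NEVA performs, with a linear time covariate in the location parameter; the synthetic data and plain maximum-likelihood estimation (rather than NEVA's Bayesian MCMC) are illustrative simplifications, not the toolbox's actual implementation.

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(1)
years = np.arange(60)
# Synthetic annual maxima with a linear trend in the location parameter.
data = stats.gumbel_r.rvs(loc=30 + 0.2 * years, scale=5, size=60,
                          random_state=rng)

def neg_log_lik(theta):
    """Negative log-likelihood of a GEV with time-varying location."""
    mu0, mu1, log_scale, c = theta
    mu = mu0 + mu1 * years  # location as a linear function of the covariate
    return -np.sum(stats.genextreme.logpdf(data, c, loc=mu,
                                           scale=np.exp(log_scale)))

res = optimize.minimize(neg_log_lik, x0=[30.0, 0.0, np.log(5.0), 0.1],
                        method="Nelder-Mead", options={"maxiter": 2000})
mu0, mu1, log_scale, c = res.x
print(f"fitted trend in location: {mu1:.3f} per year")
```

Swapping `years` for any other covariate (CO2, a teleconnection index, soil moisture) gives the covariate-driven non-stationarity the abstract describes.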
Calculating p-values and their significances with the Energy Test for large datasets
NASA Astrophysics Data System (ADS)
Barter, W.; Burr, C.; Parkes, C.
2018-04-01
The energy test method is a multi-dimensional test of whether two samples are consistent with arising from the same underlying population, through the calculation of a single test statistic (called the T-value). The method has recently been used in particle physics to search for samples that differ due to CP violation. The generalised extreme value function has previously been used to describe the distribution of T-values under the null hypothesis that the two samples are drawn from the same underlying population. We show that, in a simple test case, the distribution is not sufficiently well described by the generalised extreme value function. We present a new method, where the distribution of T-values under the null hypothesis when comparing two large samples can be found by scaling the distribution found when comparing small samples drawn from the same population. This method can then be used to quickly calculate the p-values associated with the results of the test.
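A toy version of the energy-test T-value for one-dimensional samples might look as follows; the Gaussian weighting function and sigma = 0.5 are illustrative choices, not the paper's configuration.

```python
import numpy as np

def energy_T(x, y, sigma=0.5):
    """Energy-test statistic T for two samples, using a Gaussian
    weighting function psi(d) = exp(-d^2 / (2 sigma^2)).  T fluctuates
    near 0 under the null hypothesis and grows when the samples differ."""
    def mean_psi(a, b):
        d2 = (a[:, None] - b[None, :]) ** 2
        return np.exp(-d2 / (2 * sigma ** 2)).mean()
    # Within-sample terms minus the cross-sample term.
    return 0.5 * mean_psi(x, x) + 0.5 * mean_psi(y, y) - mean_psi(x, y)

rng = np.random.default_rng(7)
same = energy_T(rng.normal(size=300), rng.normal(size=300))
diff = energy_T(rng.normal(size=300), rng.normal(1.5, 1.0, size=300))
print(same, diff)
```

The null distribution of T, whose large-sample scaling is the subject of the paper, would be obtained by permuting sample labels and recomputing T many times.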
How Do Microphysical Processes Influence Large-Scale Precipitation Variability and Extremes?
Hagos, Samson; Ruby Leung, L.; Zhao, Chun; ...
2018-02-10
Convection permitting simulations using the Model for Prediction Across Scales-Atmosphere (MPAS-A) are used to examine how microphysical processes affect large-scale precipitation variability and extremes. An episode of the Madden-Julian Oscillation is simulated using MPAS-A with a refined region at 4-km grid spacing over the Indian Ocean. It is shown that cloud microphysical processes regulate the precipitable water (PW) statistics. Because of the non-linear relationship between precipitation and PW, PW exceeding a certain critical value (PWcr) contributes disproportionately to precipitation variability. However, the frequency of PW exceeding PWcr decreases rapidly with PW, so changes in microphysical processes that shift the column PW statistics relative to PWcr even slightly have large impacts on precipitation variability. Furthermore, precipitation variance and extreme precipitation frequency are approximately linearly related to the difference between the mean and critical PW values. Thus observed precipitation statistics could be used to directly constrain model microphysical parameters, as this study demonstrates using radar observations from the DYNAMO field campaign.
NASA Astrophysics Data System (ADS)
Marani, M.; Zorzetto, E.; Hosseini, S. R.; Miniussi, A.; Scaioni, M.
2017-12-01
The Generalized Extreme Value (GEV) distribution is widely adopted irrespective of the properties of the stochastic process generating the extreme events. However, the GEV presents several limitations, both theoretical (asymptotic validity for a large number of events per year, or the hypothesis of Poisson occurrences of Generalized Pareto events) and practical (fitting uses just yearly maxima or a few values above a high threshold). Here we describe the Metastatistical Extreme Value Distribution (MEVD, Marani & Ignaccolo, 2015), which relaxes the asymptotic or Poisson/GPD assumptions and makes use of all available observations. We then illustrate the flexibility of the MEVD by applying it to daily precipitation, hurricane intensity, and storm surge magnitude. Application to daily rainfall from a global raingauge network shows that MEVD estimates are 50% more accurate than those from the GEV when the recurrence interval of interest is much greater than the observational period. This makes the MEVD suited for application to satellite rainfall observations (~20 years in length). Use of the MEVD on TRMM data yields extreme event patterns that are in better agreement with surface observations than corresponding GEV estimates. Applied to the HURDAT2 Atlantic hurricane intensity dataset, the MEVD significantly outperforms GEV estimates of extreme hurricanes. Interestingly, the Generalized Pareto distribution used for "ordinary" hurricane intensity points to the existence of a maximum limit wind speed that is significantly smaller than corresponding physically based estimates. Finally, we applied the MEVD approach to water levels generated by tidal fluctuations and storm surges at a set of coastal sites spanning different storm-surge regimes. The MEVD yields accurate estimates of large quantiles and inferences on the tail thickness (fat vs. thin) of the underlying distribution of "ordinary" surges.
In summary, the MEVD approach presents a number of theoretical and practical advantages, and outperforms traditional approaches in several applications. We conclude that the MEVD is a significant contribution to further generalize extreme value theory, with implications for a broad range of Earth Sciences.
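The MEVD construction described above can be sketched as follows, assuming (as is common in MEVD applications to rainfall) Weibull-distributed "ordinary" daily events; the synthetic record and parameter values are illustrative, not fitted to any real station.

```python
import numpy as np
from scipy import stats
from scipy.optimize import brentq

rng = np.random.default_rng(3)

# Synthetic record: 30 "years" of wet-day rainfall amounts, the
# "ordinary events" of the MEVD, drawn from a Weibull distribution.
years = []
for _ in range(30):
    n_wet = rng.integers(80, 140)
    years.append(stats.weibull_min.rvs(0.8, scale=10, size=n_wet,
                                       random_state=rng))

# Fit a Weibull to each year's ordinary events (n_j, c_j, loc_j, scale_j).
fits = [(y.size,) + stats.weibull_min.fit(y, floc=0) for y in years]

def mev_cdf(x):
    """Metastatistical Extreme Value CDF of the annual maximum:
    the average over years of F_j(x) ** n_j."""
    return np.mean([stats.weibull_min.cdf(x, c, loc, s) ** n
                    for n, c, loc, s in fits])

# 100-year daily rainfall: solve F(x) = 1 - 1/100 numerically.
rl100 = brentq(lambda x: mev_cdf(x) - 0.99, 1.0, 1000.0)
print(f"MEVD 100-year daily rainfall ~ {rl100:.0f}")
```

Unlike a GEV fit to 30 annual maxima, this uses every wet-day observation, which is the source of the accuracy gain the abstract reports.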
Long-term statistics of extreme tsunami height at Crescent City
NASA Astrophysics Data System (ADS)
Dong, Sheng; Zhai, Jinjin; Tao, Shanshan
2017-06-01
Historically, Crescent City is one of the communities most vulnerable to tsunamis along the west coast of the United States, largely owing to its offshore geography. Trans-ocean tsunamis usually produce large wave runup at Crescent Harbor, resulting in catastrophic damage, property loss, and loss of life. How to determine the return values of tsunami height using relatively short-term observation data is of great significance for assessing tsunami hazards and improving engineering design along the coast of Crescent City. In the present study, the extreme tsunami heights observed along the coast of Crescent City from 1938 to 2015 are fitted using six different probability distributions, namely the Gumbel distribution, the Weibull distribution, the maximum entropy distribution, the lognormal distribution, the generalized extreme value distribution, and the generalized Pareto distribution. The maximum likelihood method is applied to estimate the parameters of all of the above distributions. Both the Kolmogorov-Smirnov test and the root mean square error method are utilized for goodness-of-fit testing, and the better-fitting distribution is selected. Assuming that the number of tsunami occurrences in each year follows the Poisson distribution, the Poisson compound extreme value distribution can be used to fit the annual maximum tsunami amplitude, and the point and interval estimates of return tsunami heights are then calculated for structural design. The results show that the Poisson compound extreme value distribution fits tsunami heights very well and is suitable for determining return tsunami heights for coastal disaster prevention.
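The Poisson compound annual-maximum distribution used above has a simple closed form; the event-height distribution and occurrence rate below are illustrative stand-ins, not the fitted Crescent City values.

```python
import math

def poisson_compound_cdf(x, lam, event_cdf):
    """Annual-maximum CDF when event counts are Poisson(lam) and event
    magnitudes are iid with CDF event_cdf:
        F_ann(x) = sum_k P(N = k) * F(x)**k = exp(-lam * (1 - F(x)))
    (the k = 0 term covers years with no event at all)."""
    return math.exp(-lam * (1.0 - event_cdf(x)))

# Toy event-height distribution (exponential, mean 0.5 m) and an
# average of 2 observed tsunamis per year -- illustrative numbers only.
event_cdf = lambda h: 1.0 - math.exp(-h / 0.5)
print(poisson_compound_cdf(2.0, 2.0, event_cdf))
```

The T-year return height is then the x solving F_ann(x) = 1 - 1/T, with any fitted event distribution plugged in for `event_cdf`.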
Generalized extreme gust wind speeds distributions
Cheng, E.; Yeung, C.
2002-01-01
Since summer 1996, US wind engineers have used the extreme gust (or 3-s gust) as the basic wind speed to quantify the destruction of extreme winds. To better understand these destructive wind forces, it is important to know the appropriate representations of extreme gust wind speeds. The purpose of this study is therefore to determine the most suitable extreme value distributions for the annual extreme gust wind speeds recorded in large selected areas. To achieve this objective, we use the generalized Pareto distribution as the diagnostic tool for determining the type of extreme gust wind speed distribution. The three-parameter generalized extreme value distribution function is thus reduced to either the Type I Gumbel, Type II Frechet, or Type III reverse Weibull distribution function for the annual extreme gust wind speeds recorded at a specific site. With consideration of the quality and homogeneity of gust wind data collected at more than 750 weather stations throughout the United States, annual extreme gust wind speeds at 143 selected stations in the contiguous United States were used in the study.
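The diagnostic idea of reducing the three-parameter GEV family to one of its three types via the sign of a fitted generalized Pareto shape parameter can be sketched as follows, on synthetic exceedances rather than the station gust records; the 0.05 classification tolerance is an illustrative choice.

```python
import numpy as np
from scipy import stats

def classify_tail(exceedances):
    """Fit a Generalized Pareto distribution to threshold exceedances
    and classify the implied extreme-value type by the sign of the
    shape parameter xi."""
    xi, loc, scale = stats.genpareto.fit(exceedances, floc=0)
    if xi > 0.05:
        kind = "Type II (Frechet, heavy tail)"
    elif xi < -0.05:
        kind = "Type III (reverse Weibull, bounded tail)"
    else:
        kind = "Type I (Gumbel, light tail)"
    return xi, kind

rng = np.random.default_rng(11)
# Heavy-tailed exceedances (true xi = 0.3) vs. bounded ones (xi = -0.3).
heavy = stats.genpareto.rvs(0.3, scale=5.0, size=2000, random_state=rng)
bounded = stats.genpareto.rvs(-0.3, scale=5.0, size=2000, random_state=rng)
print(classify_tail(heavy))
print(classify_tail(bounded))
```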
Climate and its change over the Tibetan Plateau and its Surroundings in 1963-2015
NASA Astrophysics Data System (ADS)
Ding, J.; Cuo, L.
2017-12-01
The Tibetan Plateau and its surroundings (TPS, 23°-43°N, 73°-106°E) lie in the southwest of China and include the Tibet Autonomous Region, Qinghai Province, southern Xinjiang Uygur Autonomous Region, part of Gansu Province, western Sichuan Province, and northern Yunnan Province. The region is of strategic importance for water resources because it is the headwater region of ten large rivers that support a population of more than 1.6 billion. In this study, we use daily maximum and minimum temperature, precipitation, and wind speed for 1963-2015, obtained from the Climate Data Center of the China Meteorological Administration and the Qinghai Meteorological Bureau, to investigate extreme climate conditions and their changes over the TPS. The extreme events are selected based on annual extreme values and percentiles. The annual extreme value approach produces one value each year for every variable, which enables us to examine the magnitude of extreme events, whereas the percentile approach selects extreme values by setting the 95th percentile as the threshold for maximum temperature, precipitation, and wind speed, and the 5th percentile for minimum temperature. The percentile approach enables us to investigate not only the magnitude but also the frequency of the extreme events. In addition, Mann-Kendall trend and change-point (mutation) analyses were applied to analyze the changes in mean and extreme conditions. The results will help us understand more about extreme events during the past five decades on the TPS and will provide valuable information for the upcoming IPCC reports on climate change.
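A minimal Mann-Kendall trend test of the kind applied above might be sketched as follows (normal approximation, no tie correction); the synthetic series is illustrative.

```python
import numpy as np
from scipy import stats

def mann_kendall(x):
    """Mann-Kendall trend test: S statistic, normal-approximation Z
    (with continuity correction), and a two-sided p-value.
    Tie correction is omitted for brevity."""
    x = np.asarray(x)
    n = len(x)
    # S = sum over all pairs i < j of sign(x_j - x_i).
    s = np.sum(np.sign(x[None, :] - x[:, None])[np.triu_indices(n, 1)])
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    p = 2 * stats.norm.sf(abs(z))
    return s, z, p

rng = np.random.default_rng(5)
# Synthetic annual series with an upward trend of 0.1 per year.
trend_series = np.arange(50) * 0.1 + rng.normal(size=50)
s, z, p = mann_kendall(trend_series)
print(f"S={s}, Z={z:.2f}, p={p:.4f}")
```

Being rank-based, the test makes no distributional assumption, which is why it is a standard choice for climate extreme indices.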
Wind and wave extremes over the world oceans from very large ensembles
NASA Astrophysics Data System (ADS)
Breivik, Øyvind; Aarnes, Ole Johan; Abdalla, Saleh; Bidlot, Jean-Raymond; Janssen, Peter A. E. M.
2014-07-01
Global return values of marine wind speed and significant wave height are estimated from very large aggregates of archived ensemble forecasts at +240 h lead time. Long lead time ensures that the forecasts represent independent draws from the model climate. Compared with ERA-Interim, a reanalysis, the ensemble yields higher return estimates for both wind speed and significant wave height. Confidence intervals are much tighter due to the large size of the data set. The period (9 years) is short enough to be considered stationary even with climate change. Furthermore, the ensemble is large enough for nonparametric 100 year return estimates to be made from order statistics. These direct return estimates compare well with extreme value estimates outside areas with tropical cyclones. Like any method employing modeled fields, it is sensitive to tail biases in the numerical model, but we find that the biases are moderate outside areas with tropical cyclones.
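The nonparametric return-value estimate from order statistics described above can be sketched as follows, on a toy stand-in for the pooled ensemble; the Gumbel parameters are illustrative.

```python
import numpy as np

def direct_return_value(samples, years_per_sample, T):
    """Nonparametric T-year return value from pooled independent
    annual-maximum samples totalling len(samples) * years_per_sample
    equivalent years: the value expected to be exceeded
    total_years / T times in the pooled record."""
    total_years = len(samples) * years_per_sample
    if total_years < T:
        raise ValueError("not enough equivalent years for this T")
    k = int(total_years / T)  # expected number of exceedances
    return np.sort(samples)[-k]

rng = np.random.default_rng(9)
# Toy stand-in for a large ensemble: 5000 equivalent years of annual
# maximum wind speed (Gumbel, illustrative parameters).
ens = 20.0 + 4.0 * rng.gumbel(size=5000)
print(direct_return_value(ens, 1, 100))
```

With enough equivalent years, this order-statistic estimate needs no distributional fit at all, which is the point the abstract makes about the 100-year level.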
Rainfall extremes from TRMM data and the Metastatistical Extreme Value Distribution
NASA Astrophysics Data System (ADS)
Zorzetto, Enrico; Marani, Marco
2017-04-01
A reliable quantification of the probability of weather extremes occurrence is essential for designing resilient water infrastructures and hazard mitigation measures. However, it is increasingly clear that the presence of inter-annual climatic fluctuations determines a substantial long-term variability in the frequency of occurrence of extreme events. This circumstance questions the foundation of the traditional extreme value theory, hinged on stationary Poisson processes or on asymptotic assumptions to derive the Generalized Extreme Value (GEV) distribution. We illustrate here, with application to daily rainfall, a new approach to extreme value analysis, the Metastatistical Extreme Value Distribution (MEVD). The MEVD relaxes the above assumptions and is based on the whole distribution of daily rainfall events, thus allowing optimal use of all available observations. Using a global dataset of rain gauge observations, we show that the MEVD significantly outperforms the Generalized Extreme Value distribution, particularly for long average recurrence intervals and when small samples are available. The latter property suggests MEVD to be particularly suited for applications to satellite rainfall estimates, which only cover two decades, thus making extreme value estimation extremely challenging. Here we apply MEVD to the TRMM TMPA 3B42 product, an 18-year dataset of remotely-sensed daily rainfall providing a quasi-global coverage. Our analyses yield a global scale mapping of daily rainfall extremes and of their distributional tail properties, bridging the existing large gaps in ground-based networks. Finally, we illustrate how our global-scale analysis can provide insight into how properties of local rainfall regimes affect tail estimation uncertainty when using the GEV or MEVD approach. We find a dependence of the estimation uncertainty, for both the GEV- and MEV-based approaches, on the average annual number and on the inter-annual variability of rainy days. 
In particular, estimation uncertainty decreases 1) as the mean annual number of wet days increases, and 2) as the variability in the number of rainy days, expressed by its coefficient of variation, decreases. We tentatively explain this behavior in terms of the assumptions underlying the two approaches.
Investigating NARCCAP Precipitation Extremes via Bivariate Extreme Value Theory (Invited)
NASA Astrophysics Data System (ADS)
Weller, G. B.; Cooley, D. S.; Sain, S. R.; Bukovsky, M. S.; Mearns, L. O.
2013-12-01
We introduce methodology from statistical extreme value theory to examine the ability of reanalysis-driven regional climate models to simulate past daily precipitation extremes. Going beyond a comparison of summary statistics such as 20-year return values, we study whether the most extreme precipitation events produced by climate model simulations exhibit correspondence to the most extreme events seen in observational records. The extent of this correspondence is formulated via the statistical concept of tail dependence. We examine several case studies of extreme precipitation events simulated by the six models of the North American Regional Climate Change Assessment Program (NARCCAP) driven by NCEP reanalysis. It is found that the NARCCAP models generally reproduce daily winter precipitation extremes along the Pacific coast quite well; in contrast, simulation of past daily summer precipitation extremes in a central US region is poor. Some differences in the strength of extremal correspondence are seen in the central region between models which employ spectral nudging and those which do not. We demonstrate how these techniques may be used to draw a link between extreme precipitation events and large-scale atmospheric drivers, as well as to downscale extreme precipitation simulated by a future run of a regional climate model. Specifically, we examine potential future changes in the nature of extreme precipitation along the Pacific coast produced by the pineapple express (PE) phenomenon. A link between extreme precipitation events and a "PE Index" derived from North Pacific sea-surface pressure fields is found. This link is used to study PE-influenced extreme precipitation produced by a future-scenario climate model run.
Heavy Tail Behavior of Rainfall Extremes across Germany
NASA Astrophysics Data System (ADS)
Castellarin, A.; Kreibich, H.; Vorogushyn, S.; Merz, B.
2017-12-01
Distributions are termed heavy-tailed if extreme values are more likely than would be predicted by probability distributions with exponential asymptotic behavior. Heavy-tail behavior often leads to surprise, because historical observations can be a poor guide to the future. Heavy-tail behavior seems to be widespread for hydro-meteorological extremes, such as extreme rainfall and flood events, yet to date there have been only vague hints as to the conditions under which these extremes show heavy-tail behavior. We use an observational data set consisting of 11 climate variables at 1,440 stations across Germany. This homogenized, gap-free data set covers 110 years (1901-2010) at daily resolution. We estimate the upper-tail behavior, including its uncertainty interval, of daily precipitation extremes for the 1,440 stations at the annual and seasonal time scales. Different tail indicators are tested, including the shape parameter of the Generalized Extreme Value distribution, the upper tail ratio, and the obesity index. In a further step, we explore the extent to which the tail behavior can be explained by geographical and climate factors. A large number of characteristics is derived, such as station elevation, degree of continentality, aridity, measures quantifying the variability of humidity and wind velocity, and the event-triggering large-scale atmospheric situation. The link between the upper-tail behavior and these characteristics is investigated via data mining methods capable of detecting non-linear relationships in large data sets. This exceptionally rich observational data set, in terms of number of stations, length of time series, and number of explanatory variables, allows insights into upper-tail behavior that are rarely possible with typical observational data sets.
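Of the tail indicators listed above, the obesity index has a particularly compact resampling estimator; below is a sketch on synthetic data (the sample sizes and distributions are illustrative).

```python
import numpy as np

def obesity_index(x, n_draws=20000, rng=None):
    """Obesity index: repeatedly draw four values with replacement,
    sort them, and estimate P(largest + smallest > sum of the two
    middle values).  It is 0.5 for a uniform (thin-tailed) parent and
    approaches 1 as the tail gets heavier."""
    rng = rng or np.random.default_rng(0)
    s = np.sort(rng.choice(x, size=(n_draws, 4), replace=True), axis=1)
    return np.mean(s[:, 3] + s[:, 0] > s[:, 1] + s[:, 2])

rng = np.random.default_rng(2)
light = rng.uniform(size=5000)      # obesity index ~ 0.5
heavy = rng.pareto(1.0, size=5000)  # heavy tail, index closer to 1
print(obesity_index(light), obesity_index(heavy))
```

Unlike a fitted GEV shape parameter, this indicator needs no parametric assumption, which is one reason to test several indicators side by side as the study does.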
400 Years of summer hydroclimate from stable isotopes in Iberian trees
NASA Astrophysics Data System (ADS)
Andreu-Hayles, Laia; Ummenhofer, Caroline C.; Barriendos, Mariano; Schleser, Gerhard H.; Helle, Gerhard; Leuenberger, Markus; Gutiérrez, Emilia; Cook, Edward R.
2017-07-01
Tree rings are natural archives that annually record distinct types of past climate variability depending on the parameters measured. Here, we use ring-width and stable isotopes in cellulose of trees from the northwestern Iberian Peninsula (IP) to understand regional summer hydroclimate over the last 400 years and the associated atmospheric patterns. Correlations between tree rings and climate data demonstrate that isotope signatures in the targeted Iberian pine forests are very sensitive to water availability during the summer period, and are mainly controlled by stomatal conductance. Non-linear methods based on extreme events analysis allow for capturing distinct seasonal climatic variability recorded by tree-ring parameters and asymmetric signals of the associated atmospheric features. Moreover, years with extreme high (low) values in the tree-ring records were characterised by coherent large-scale atmospheric circulation patterns with reduced (enhanced) moisture transport onto the northwestern IP. These analyses of extremes revealed that high/low proxy values do not necessarily correspond to mirror images in the atmospheric anomaly patterns, suggesting different drivers of these patterns and the corresponding signature recorded in the proxies. Regional hydroclimate features across the broader IP and western Europe during extreme wet/dry summers detected by the northwestern IP trees compare favourably to independent multicentury sea level pressure and drought reconstructions for Europe. Historical records also validate our findings that attribute non-linear moisture signals recorded by extreme tree-ring values to distinct large-scale atmospheric patterns and allow for 400-year reconstructions of the frequency of occurrence of extreme conditions in late spring and summer hydroclimate.
Uncertainty in determining extreme precipitation thresholds
NASA Astrophysics Data System (ADS)
Liu, Bingjun; Chen, Junfan; Chen, Xiaohong; Lian, Yanqing; Wu, Lili
2013-10-01
Extreme precipitation events are rare and occur mostly on a relatively small, local scale, which makes it difficult to set thresholds for extreme precipitation in a large basin. Based on long-term daily precipitation data from 62 observation stations in the Pearl River Basin, this study assessed the applicability of the non-parametric, parametric, and detrended fluctuation analysis (DFA) methods for determining extreme precipitation thresholds (EPTs) and the certainty of the EPTs obtained from each method. Analyses from this study show that the non-parametric absolute critical value method is easy to use but unable to reflect differences in the spatial distribution of rainfall. The non-parametric percentile method can account for the spatial distribution of precipitation, but its threshold value is sensitive to the size of the rainfall data series and to the choice of percentile, which makes it difficult to determine reasonable threshold values for a large basin. The parametric method can provide the aptest description of extreme precipitation by fitting extreme precipitation distributions with probability distribution functions; however, the choice of probability distribution function, the goodness-of-fit tests, and the size of the rainfall data series can greatly affect the fitting accuracy. In contrast to the non-parametric and parametric methods, which cannot provide EPTs with certainty, the DFA method, although computationally demanding, has proven to be the most appropriate method, providing a unique set of EPTs for a large basin with uneven spatio-temporal precipitation distribution.
The consistency of the spatial distribution of the DFA-based thresholds with the annual average precipitation, the coefficient of variation (CV), and the coefficient of skewness (CS) of daily precipitation further shows that the EPTs determined by the DFA method are reasonable and applicable for the Pearl River Basin.
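As a concrete illustration of the percentile method's sensitivity discussed above, the sketch below (synthetic data, not the Pearl River series) computes wet-day percentile thresholds and shows how the estimate shifts with the chosen percentile and the record length.

```python
# Illustrative sketch of the non-parametric percentile method for extreme
# precipitation thresholds (EPTs). Synthetic data; thresholds shift with both
# the chosen percentile and the length of the record.
import numpy as np

rng = np.random.default_rng(0)
daily = rng.gamma(shape=0.4, scale=10.0, size=50 * 365)  # 50 years, mm/day
wet = daily[daily >= 0.1]  # keep wet days only (>= 0.1 mm)

thresholds = {q: float(np.percentile(wet, q)) for q in (90, 95, 99)}
for q, t in thresholds.items():
    print(f"EPT at {q}th percentile: {t:.1f} mm")

# Sensitivity to sample size: same percentile, much shorter record.
short = wet[: len(wet) // 5]
print(f"99th percentile from a 10-yr subset: {np.percentile(short, 99):.1f} mm")
```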
NASA Astrophysics Data System (ADS)
Kergadallan, Xavier; Bernardara, Pietro; Benoit, Michel; Andreewsky, Marc; Weiss, Jérôme
2013-04-01
Estimating the probability of occurrence of extreme sea levels is a central issue for coastal protection. Return periods of sea level with wave set-up contribution are estimated here for one site: Cherbourg, France, in the English Channel. The methodology follows two steps: first, computation of the joint probability of simultaneous wave height and still sea level; second, interpretation of that joint probability to assess the sea level for a given return period. Two different approaches were evaluated to compute the joint probability of simultaneous wave height and still sea level: a multivariate extreme-value distribution of the logistic type, in which all components of the variables become large simultaneously, and a conditional approach for multivariate extreme values, in which only one component of the variables has to be large. Two different methods were applied to estimate the sea level with wave set-up contribution for a given return period: Monte Carlo simulation, which is more accurate but requires longer computation time, and classical ocean-engineering design contours of the inverse-FORM type, which are simpler and allow more complex estimation of the wave set-up part (e.g., wave propagation to the coast). We compare results from the two approaches and the two methods. To be able to use both the Monte Carlo simulation and the design-contour method, wave set-up is estimated with a simple empirical formula. We show the advantages of the conditional approach over the multivariate extreme-value approach when extreme sea levels occur when either the surge or the wave height is large. We discuss the validity of the ocean-engineering design-contour method, which is an alternative when the computation of sea levels is too complex for the Monte Carlo simulation method.
Extremes in ecology: Avoiding the misleading effects of sampling variation in summary analyses
Link, W.A.; Sauer, J.R.
1996-01-01
Surveys such as the North American Breeding Bird Survey (BBS) produce large collections of parameter estimates. One's natural inclination when confronted with lists of parameter estimates is to look for the extreme values: in the BBS, these correspond to the species that appear to have the greatest changes in population size through time. Unfortunately, extreme estimates are liable to correspond to the most poorly estimated parameters. Consequently, the most extreme parameters may not match up with the most extreme parameter estimates. Ranking parameter values on the basis of their estimates is a difficult statistical problem. We use data from the BBS and simulations to illustrate the potentially misleading effects of sampling variation on rankings of parameters. We describe empirical Bayes and constrained empirical Bayes procedures that provide partial solutions to the problem of ranking in the presence of sampling variation.
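The core phenomenon can be reproduced in a toy simulation. The sketch below is illustrative only: the shrinkage formula is the standard normal-normal empirical Bayes estimator, not necessarily the authors' exact procedure. It shows that the most extreme raw estimate tends to come from a unit with a large standard error, and that shrinkage pulls such estimates back toward the overall mean.

```python
# Toy demonstration: with unequal sampling variances, extreme raw estimates
# are often the most poorly estimated ones; empirical-Bayes shrinkage
# (standard normal-normal form, an assumption here) moderates them.
import numpy as np

rng = np.random.default_rng(1)
n = 200
true_trend = rng.normal(0.0, 1.0, n)      # true population trends
se = rng.uniform(0.2, 3.0, n)             # very unequal standard errors
est = true_trend + rng.normal(0.0, se)    # raw (noisy) estimates

# Shrink each estimate toward the grand mean by a factor that depends on its
# sampling variance relative to the estimated between-unit variance tau2.
tau2 = max(np.var(est) - np.mean(se**2), 1e-6)
shrunk = tau2 / (tau2 + se**2) * est + se**2 / (tau2 + se**2) * est.mean()

# The most extreme raw estimate usually has a large standard error.
print("SE of most extreme raw estimate:", se[np.argmax(np.abs(est))])
print("Mean SE across units:           ", se.mean())
```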
Extreme between-study homogeneity in meta-analyses could offer useful insights.
Ioannidis, John P A; Trikalinos, Thomas A; Zintzaras, Elias
2006-10-01
Meta-analyses are routinely evaluated for the presence of large between-study heterogeneity. We examined whether it is also important to probe whether there is extreme between-study homogeneity. We used heterogeneity tests with left-sided statistical significance for inference and developed a Monte Carlo simulation test for testing extreme homogeneity in risk ratios across studies, using the empirical distribution of the summary risk ratio and heterogeneity statistic. A left-sided P=0.01 threshold was set for claiming extreme homogeneity to minimize type I error. Among 11,803 meta-analyses with binary contrasts from the Cochrane Library, 143 (1.21%) had a left-sided P-value <0.01 for the asymptotic Q statistic and 1,004 (8.50%) had a left-sided P-value <0.10. The frequency of extreme between-study homogeneity did not depend on the number of studies in the meta-analyses. We identified examples where extreme between-study homogeneity (left-sided P-value <0.01) could result from various possibilities beyond chance. These included inappropriate statistical inference (asymptotic vs. Monte Carlo), use of a specific effect metric, correlated data or stratification using strong predictors of outcome, and biases and potential fraud. Extreme between-study homogeneity may provide useful insights about a meta-analysis and its constituent studies.
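A minimal sketch of the left-sided inference described above, using the asymptotic chi-square reference for Cochran's Q rather than the paper's Monte Carlo test, and hypothetical study data:

```python
# Left-sided homogeneity test sketch: Cochran's Q for log risk ratios, with
# "too similar" studies flagged when P(chi2 <= Q_obs) is very small.
# Hypothetical per-study effects; asymptotic reference, not Monte Carlo.
import numpy as np
from scipy.stats import chi2

log_rr = np.array([0.10, 0.11, 0.09, 0.10, 0.10, 0.11])  # log risk ratios
var = np.array([0.04, 0.05, 0.04, 0.06, 0.05, 0.04])     # their variances

w = 1.0 / var
pooled = np.sum(w * log_rr) / np.sum(w)
q = np.sum(w * (log_rr - pooled) ** 2)  # Cochran's Q
df = len(log_rr) - 1

left_p = chi2.cdf(q, df)  # small value -> extreme between-study homogeneity
print(f"Q = {q:.4f}, df = {df}, left-sided P = {left_p:.2e}")
```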
Mathematical aspects of assessing extreme events for the safety of nuclear plants
NASA Astrophysics Data System (ADS)
Potempski, Slawomir; Borysiewicz, Mieczyslaw
2015-04-01
In this paper a review of the mathematical methodologies applied to assessing the low frequencies of rare natural events, such as earthquakes, tsunamis, hurricanes, tornadoes, floods (in particular flash floods and storm surges), lightning and solar flares, is given from the perspective of the safety assessment of nuclear plants. The statistical methods are usually based on extreme value theory, which deals with the analysis of extreme deviations from the median (or the mean). In this respect, various mathematical tools are useful, such as the Fisher-Tippett-Gnedenko extreme value theorem, which leads to the possible choices of generalized extreme value distributions; the Pickands-Balkema-de Haan theorem for tail fitting; and methods related to large deviation theory. The most important stochastic distributions relevant to the statistical analysis of rare events are presented. This concerns, for example, the analysis of data on annual extreme values (maxima, the "Annual Maxima Series", or minima), of peak values exceeding given thresholds over some period of interest ("Peak Over Threshold"), and the estimation of the size of exceedances. Despite the lack of statistical data directly containing sufficient rare events, in some cases it is still possible to extract useful information from existing larger data sets, for example the data sets available on the web for floods, earthquakes or natural hazards in general. Some aspects of such data sets are also presented, taking into account their usefulness for the practical assessment of the risk to nuclear power plants from extreme weather conditions.
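The "Peak Over Threshold" idea mentioned above can be sketched briefly: excesses over a high threshold are fitted with a Generalized Pareto distribution (per the Pickands-Balkema-de Haan theorem) and used to extrapolate exceedance probabilities beyond the observed record. The snippet below uses synthetic data and scipy for illustration only.

```python
# Peak-Over-Threshold sketch: fit a Generalized Pareto distribution to
# excesses over a high threshold and extrapolate a rare exceedance
# probability. Synthetic hazard series, not plant-specific data.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(7)
series = rng.exponential(scale=1.0, size=50_000)  # stand-in hazard magnitudes

u = np.quantile(series, 0.99)            # high threshold
excesses = series[series > u] - u
c, loc, scale = genpareto.fit(excesses, floc=0.0)

# Probability of exceeding a level x well beyond the data:
# P(X > x) = P(X > u) * P(excess > x - u | X > u)
x = u + 5.0
p_exceed = (series > u).mean() * genpareto.sf(x - u, c, loc=0.0, scale=scale)
print(f"u = {u:.2f}, GPD shape = {c:.3f}, P(X > {x:.1f}) = {p_exceed:.2e}")
```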
Near-term probabilistic forecast of significant wildfire events for the Western United States
Haiganoush K. Preisler; Karin L. Riley; Crystal S. Stonesifer; Dave E. Calkin; Matt Jolly
2016-01-01
Fire danger and the potential for large fires in the United States (US) are currently indicated via several forecasted qualitative indices. However, landscape-level quantitative forecasts of the probability of a large fire are currently lacking. In this study, we present a framework for forecasting large fire occurrence - an extreme value event - and evaluating...
Measurements of storm-generated bottom stresses on the continental shelf.
Cacchione, D.A.; Drake, D.E.
1982-01-01
Large values of the bottom friction velocity, u*, and roughness length, z0, determined from burst-averaged speed data taken on the continental shelf in outer Norton Sound, Alaska, with the GEOPROBE tripod during a storm are correlated with extremely large values of the near-bottom concentration of total suspended particulate matter (TSM). The values obtained from the 'law of the wall' velocity-depth relationship are diminished substantially throughout the storm period when the turbulence-reducing effects of the vertical concentration gradient of TSM are considered. The values are compared to those obtained by other workers. -from Authors
NASA Astrophysics Data System (ADS)
Wintoft, Peter; Viljanen, Ari; Wik, Magnus
2016-05-01
High-frequency (≈ minutes) variability of ground magnetic fields is caused by ionospheric and magnetospheric processes driven by the changing solar wind. The varying magnetic fields induce electric fields that cause currents to flow in man-made conductors such as power grids and pipelines. Under extreme conditions these geomagnetically induced currents (GIC) may be harmful to power grids. Increasing our understanding of extreme events is thus important for solar-terrestrial science and space weather. In this work, the time derivative of measured local magnetic fields (|dBh/dt|) and computed electric fields (Eh) at 1-min resolution, for locations in Europe, have been analysed with extreme value analysis (EVA). The EVA results in an estimate of the generalized extreme value probability distribution, which is described by three parameters: location, width, and shape. The shape parameter controls the extreme behaviour. The stations cover geomagnetic latitudes from 40 to 70° N. All stations included in the study have contiguous coverage of 18 years or more with 1-min resolution data. As expected, the EVA shows that the higher-latitude stations have a higher probability of large |dBh/dt| and |Eh| compared to stations further south. However, the EVA also shows that the shape of the distribution changes with magnetic latitude. The high latitudes have distributions that fall off to zero faster than the low latitudes, and upward-bounded distributions cannot be ruled out. The transition occurs around 59-61° N magnetic latitude. Thus, the EVA shows that the observed series north of ≈ 60° N have already measured values that are close to the expected maximum values, while stations south of ≈ 60° N will measure larger values in the future.
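The role of the shape parameter described above can be made concrete. In the sketch below (synthetic maxima, not the magnetometer data), a fitted GEV with negative conventional shape ξ (scipy's c = -ξ > 0) implies a finite upper endpoint at loc + scale/c, which corresponds to the bounded-tail situation the abstract describes for the high-latitude stations.

```python
# Illustrative GEV fit with a bounded upper tail. scipy's genextreme shape c
# equals -xi in the usual GEV convention; the support is bounded above when
# c > 0, with upper endpoint loc + scale / c. Synthetic annual maxima.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(3)
# Maxima drawn from a bounded-tail GEV (c = 0.3, i.e. conventional xi = -0.3).
annual_max = genextreme.rvs(0.3, loc=100.0, scale=20.0, size=60,
                            random_state=rng)

c, loc, scale = genextreme.fit(annual_max)
if c > 0:  # conventional xi < 0: bounded upper tail
    endpoint = loc + scale / c
    print(f"estimated upper endpoint: {endpoint:.1f}")
    print(f"largest observed value:   {annual_max.max():.1f}")
else:
    print("fitted tail is unbounded (conventional xi >= 0)")
```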
NASA Astrophysics Data System (ADS)
Wen, Xian-Huan; Gómez-Hernández, J. Jaime
1998-03-01
The macrodispersion of an inert solute in a 2-D heterogeneous porous medium is estimated numerically in a series of fields of varying heterogeneity. Four different random function (RF) models are used to model log-transmissivity (ln T) spatial variability, and for each of these models, the ln T variance is varied from 0.1 to 2.0. The four RF models share the same univariate Gaussian histogram and the same isotropic covariance, but differ from one another in terms of the spatial connectivity patterns at extreme transmissivity values. More specifically, model A is a multivariate Gaussian model for which, by definition, extreme values (both high and low) are spatially uncorrelated. The other three models are non-multi-Gaussian: model B with high connectivity of high extreme values, model C with high connectivity of low extreme values, and model D with high connectivities of both high and low extreme values. Residence time distributions (RTDs) and macrodispersivities (longitudinal and transverse) are computed on ln T fields corresponding to the different RF models, for two different flow directions and at several scales. They are compared with each other, as well as with predicted values based on first-order analytical results. Numerically derived RTDs and macrodispersivities for the multi-Gaussian model are in good agreement with analytically derived values using first-order theories for log-transmissivity variance up to 2.0. The results from the non-multi-Gaussian models differ from each other and deviate markedly from the multi-Gaussian results even when the ln T variance is small. RTDs in non-multi-Gaussian realizations with high connectivity at high extreme values display earlier breakthrough than in multi-Gaussian realizations, whereas later breakthrough and longer tails are observed for RTDs from non-multi-Gaussian realizations with high connectivity at low extreme values.
Longitudinal macrodispersivities in the non-multi-Gaussian realizations are, in general, larger than in the multi-Gaussian ones, while transverse macrodispersivities in the non-multi-Gaussian realizations can be larger or smaller than in the multi-Gaussian ones depending on the type of connectivity at extreme values. Comparing the numerical results for different flow directions, it is confirmed that macrodispersivities in multi-Gaussian realizations with isotropic spatial correlation are not flow direction-dependent. Macrodispersivities in the non-multi-Gaussian realizations, however, are flow direction-dependent although the covariance of ln T is isotropic (the same for all four models). It is important to account for high connectivities at extreme transmissivity values, a likely situation in some geological formations. Some of the discrepancies between first-order-based analytical results and field-scale tracer test data may be due to the existence of highly connected paths of extreme conductivity values.
Ensemble-based evaluation of extreme water levels for the eastern Baltic Sea
NASA Astrophysics Data System (ADS)
Eelsalu, Maris; Soomere, Tarmo
2016-04-01
The risks and damages associated with coastal flooding, which naturally grow with the magnitude of extreme storm surges, are among the largest concerns of countries with extensive low-lying nearshore areas. The relevant risks are even more pronounced for semi-enclosed water bodies such as the Baltic Sea, where subtidal (weekly-scale) variations in the water volume of the sea substantially contribute to the water level and lead to a large spread in projections of future extreme water levels. We explore the options for using large ensembles of projections to more reliably evaluate return periods of extreme water levels. Single projections of the ensemble are constructed by fitting several sets of block maxima with various extreme value distributions. The ensemble is based on two simulated data sets produced at the Swedish Meteorological and Hydrological Institute. A hindcast by the Rossby Centre Ocean model is sampled with a resolution of 6 h, and a similar hindcast by the circulation model NEMO with a resolution of 1 h. As the annual maxima of water levels in the Baltic Sea are not always uncorrelated, we employ maxima both for calendar years and for stormy seasons. As the shape parameter of the Generalised Extreme Value distribution changes its sign and varies substantially in magnitude along the eastern coast of the Baltic Sea, the use of a single distribution for the entire coast is inappropriate. The ensemble involves projections based on the Generalised Extreme Value, Gumbel and Weibull distributions. The parameters of these distributions are evaluated in three different ways: by the maximum likelihood method and by the method of moments based on both biased and unbiased estimates. The total number of projections in the ensemble is 40. As some of the resulting estimates contain limited additional information, the members of pairs of projections that are highly correlated are assigned weights of 0.6.
A comparison of the ensemble-based projection of extreme water levels and their return periods with similar estimates derived from local observations reveals an interesting pattern of match and mismatch. The match is almost perfect at measurement sites where local effects (e.g., wave-induced set-up or local surge in very shallow areas that are not resolved by circulation models) do not contribute to the observed water levels. There is, however, substantial mismatch between projected and observed extreme values for most of the Estonian coast. The mismatch is largest for sections that are open to high waves and for several bays that are deeply cut into the mainland but open to the predominant strong wind directions. Detailed quantification of this mismatch eventually makes it possible to develop substantially improved estimates of extreme water levels in sections where local effects contribute considerably to the total water level.
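The ensemble construction can be sketched in miniature: fit several extreme-value distributions to the same block maxima, derive an N-year return level from each, and combine the members with weights. The snippet below uses two members, equal illustrative weights, and synthetic maxima; the study's actual ensemble has 40 weighted members and three fitting methods.

```python
# Miniature ensemble of extreme-value fits: each member is one distribution
# fitted to the same block maxima, yielding its own 100-yr return level.
# Synthetic annual maxima (cm); weights are illustrative, not the study's.
import numpy as np
from scipy.stats import genextreme, gumbel_r

rng = np.random.default_rng(11)
annual_max = gumbel_r.rvs(loc=80.0, scale=25.0, size=100, random_state=rng)

T = 100.0
p = 1.0 - 1.0 / T  # non-exceedance probability of the T-year level

members = {
    "GEV": float(genextreme.ppf(p, *genextreme.fit(annual_max))),
    "Gumbel": float(gumbel_r.ppf(p, *gumbel_r.fit(annual_max))),
}
weights = np.array([1.0, 1.0])  # equal weights for this sketch
levels = np.array(list(members.values()))
ensemble = float(np.average(levels, weights=weights))

print("members:", {k: round(v, 1) for k, v in members.items()})
print(f"weighted ensemble 100-yr level: {ensemble:.1f} cm")
```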
Spatial variation of statistical properties of extreme water levels along the eastern Baltic Sea
NASA Astrophysics Data System (ADS)
Pindsoo, Katri; Soomere, Tarmo; Rocha, Eugénio
2016-04-01
Most existing projections of future extreme water levels rely on the classic generalised extreme value distributions. The choice of a particular distribution is often made based on the absolute value of the shape parameter of the Generalised Extreme Value distribution. If this parameter is small, the Gumbel distribution is most appropriate, while in the opposite case the Weibull or Fréchet distribution could be used. We demonstrate that the alongshore variation in the statistical properties of numerically simulated high water levels along the eastern coast of the Baltic Sea is so large that the use of a single distribution for projections of extreme water levels is highly questionable. The analysis is based on two simulated data sets produced at the Swedish Meteorological and Hydrological Institute. The output of the Rossby Centre Ocean model is sampled with a resolution of 6 h and the output of the circulation model NEMO with a resolution of 1 h. As the maxima of water levels in subsequent years may be correlated in the Baltic Sea, we also employ maxima for stormy seasons. We provide a detailed analysis of the spatial variation of the parameters of the family of extreme value distributions along an approximately 600 km long coastal section from the north-western shore of Latvia in the Baltic Proper to the eastern Gulf of Finland. The parameters are evaluated using the maximum likelihood method and the method of moments. The analysis also covers the entire Gulf of Riga. The core parameter of this family of distributions, the shape parameter of the Generalised Extreme Value distribution, exhibits extensive variation in the study area. Its values, evaluated using the Hydrognomon software and the maximum likelihood method, vary from about -0.1 near the north-western coast of Latvia in the Baltic Proper up to about 0.05 in the eastern Gulf of Finland. This parameter is very close to zero near Tallinn in the western Gulf of Finland.
Thus, it is natural that the Gumbel distribution gives adequate projections of extreme water levels for the vicinity of Tallinn. More importantly, this feature indicates that the use of a single distribution for the projections of extreme water levels and their return periods for the entire Baltic Sea coast is inappropriate. The physical reason is the interplay of the complex shape of large subbasins (such as the Gulf of Riga and Gulf of Finland) of the sea and highly anisotropic wind regime. The 'impact' of this anisotropy on the statistics of water level is amplified by the overall anisotropy of the distributions of the frequency of occurrence of high and low water levels. The most important conjecture is that long-term behaviour of water level extremes in different coastal sections of the Baltic Sea may be fundamentally different.
GPS FOM Chimney Analysis using Generalized Extreme Value Distribution
NASA Technical Reports Server (NTRS)
Ott, Rick; Frisbee, Joe; Saha, Kanan
2004-01-01
An objective of a statistical analysis is often to estimate a limit value, such as a 3-sigma 95%-confidence upper limit, from a data sample. The generalized extreme value (GEV) distribution method can be profitably employed in many such situations. It is well known that, according to the Central Limit Theorem, the mean value of a large data set is normally distributed irrespective of the distribution of the data from which the mean is derived. In a somewhat similar fashion, the extreme value of a data set often has a distribution that can be formulated as a generalized extreme value distribution. In Space Shuttle entry with 3-string GPS navigation, the Figure Of Merit (FOM) value gives a measure of GPS navigated-state accuracy. A GPS navigated state with a FOM of 6 or higher is deemed unacceptable and is said to form a FOM chimney: a period of time during which the FOM value stays higher than 5. A longer period with FOM values of 6 or higher causes the navigated state to accumulate more error for lack of state updates. For an acceptable landing it is imperative that the state error remain low; hence, at low altitude during entry, GPS data with FOM greater than 5 must not last more than 138 seconds. To test GPS performance, many entry test cases were simulated at the Avionics Development Laboratory. Only high-value FOM chimneys are consequential. The extreme value statistical technique is applied to analyze high-value FOM chimneys. The maximum likelihood method is used to determine the parameters that characterize the GEV distribution, and the limit value statistics are then estimated.
Lisi Pei; Nathan Moore; Shiyuan Zhong; Lifeng Luo; David W. Hyndman; Warren E. Heilman; Zhiqiu Gao
2014-01-01
Extreme weather and climate events, especially short-term excessive drought and wet periods over agricultural areas, have received increased attention. The Southern Great Plains (SGP) is one of the largest agricultural regions in North America and features the underlying Ogallala-High Plains Aquifer system, which is of great economic value in large part due to production...
New Insights into the Estimation of Extreme Geomagnetic Storm Occurrences
NASA Astrophysics Data System (ADS)
Ruffenach, Alexis; Winter, Hugo; Lavraud, Benoit; Bernardara, Pietro
2017-04-01
Space weather events such as intense geomagnetic storms are major disturbances of the near-Earth environment that may lead to serious impacts on our modern society. As such, it is of great importance to estimate their probability, and in particular that of extreme events. One approach widely used in statistical science for estimating the probability of extreme events is Extreme Value Analysis (EVA). Using this rigorous statistical framework, estimations of the occurrence of extreme geomagnetic storms are performed here based on the most relevant global parameters related to geomagnetic storms: ground parameters (e.g. the geomagnetic Dst and aa indices) and space parameters related to the characteristics of Coronal Mass Ejections (CMEs) (velocity, southward magnetic field component, electric field). Using our fitted model, we estimate the annual probability of a Carrington-type event (Dst = -850 nT) to be on the order of 10⁻³, with a lower limit of the uncertainties on the return period of ~500 years. Our estimate is significantly higher than that of most past studies, which typically found a return period of a few hundred years at most. Thus, precautions are required when extrapolating intense values. Currently, the complexity of the processes and the length of the available data inevitably lead to significant uncertainties in return-period estimates for the occurrence of extreme geomagnetic storms. However, our application of extreme value models for extrapolating into the tail of the distribution provides a mathematically justified framework for the estimation of extreme return periods, thereby enabling more accurate estimates and reduced associated uncertainties.
NASA Astrophysics Data System (ADS)
Qian, Yu-Kun; Liang, Chang-Xia; Yuan, Zhuojian; Peng, Shiqiu; Wu, Junjie; Wang, Sihua
2016-05-01
Based on 25-year (1987-2011) tropical cyclone (TC) best-track data, a statistical study was carried out to investigate the basic features of upper-tropospheric TC-environment interactions over the western North Pacific. Interaction was defined as the absolute value of eddy momentum flux convergence (EFC) exceeding 10 m s⁻¹ d⁻¹. Based on this definition, it was found that 18% of all six-hourly TC samples experienced interaction. Extreme interaction cases showed that EFC can reach ~120 m s⁻¹ d⁻¹ during the extratropical-cyclone (EC) stage, an order of magnitude larger than reported in previous studies. Composite analysis showed that positive interactions are characterized by a double-jet flow pattern, rather than the traditional trough pattern, because it is the jets that bring in large EFC from the upper-level environment to the TC center. The role of the outflow jet is also enhanced by relatively low inertial stability, as compared to the inflow jet. Among several environmental factors, it was found that extremely large EFC is usually accompanied by high inertial stability, low SST and strong vertical wind shear (VWS). Thus, the positive effect of EFC is cancelled by their negative effects. Only those samples during the EC stage, whose intensities were less dependent on VWS and the underlying SST, could survive in extremely large EFC environments, or even re-intensify. For classical TCs (not in the EC stage), it was found that environments with a moderate EFC value, generally below ~25 m s⁻¹ d⁻¹, are more favorable for a TC's intensification than those with extremely large EFC.
Greenville, Aaron C; Wardle, Glenda M; Dickman, Chris R
2012-01-01
Extreme climatic events, such as flooding rains, extended decadal droughts and heat waves have been identified increasingly as important regulators of natural populations. Climate models predict that global warming will drive changes in rainfall and increase the frequency and severity of extreme events. Consequently, to anticipate how organisms will respond we need to document how changes in extremes of temperature and rainfall compare to trends in the mean values of these variables and over what spatial scales the patterns are consistent. Using the longest historical weather records available for central Australia – 100 years – and quantile regression methods, we investigate if extreme climate events have changed at similar rates to median events, if annual rainfall has increased in variability, and if the frequency of large rainfall events has increased over this period. Specifically, we compared local (individual weather stations) and regional (Simpson Desert) spatial scales, and quantified trends in median (50th quantile) and extreme weather values (5th, 10th, 90th, and 95th quantiles). We found that median and extreme annual minimum and maximum temperatures have increased at both spatial scales over the past century. Rainfall changes have been inconsistent across the Simpson Desert; individual weather stations showed increases in annual rainfall, increased frequency of large rainfall events or more prolonged droughts, depending on the location. In contrast to our prediction, we found no evidence that intra-annual rainfall had become more variable over time. Using long-term live-trapping records (22 years) of desert small mammals as a case study, we demonstrate that irruptive events are driven by extreme rainfalls (>95th quantile) and that increases in the magnitude and frequency of extreme rainfall events are likely to drive changes in the populations of these species through direct and indirect changes in predation pressure and wildfires. PMID:23170202
How to recover more value from small pine trees: Essential oils and resins
Vasant M. Kelkar; Brian W. Geils; Dennis R. Becker; Steven T. Overby; Daniel G. Neary
2006-01-01
In recent years, the young dense forests of northern Arizona have suffered extreme droughts, wildfires, and insect outbreaks. Improving forest health requires reducing forest density by cutting many small-diameter trees with the consequent production of large volumes of residual biomass. To offset the cost of handling this low-value timber, additional marketing options...
From Chebyshev to Bernstein: A Tour of Polynomials Small and Large
ERIC Educational Resources Information Center
Boelkins, Matthew; Miller, Jennifer; Vugteveen, Benjamin
2006-01-01
Consider the family of monic polynomials of degree n having zeros at -1 and +1 and all their other real zeros in between these two values. This article explores the size of these polynomials using the supremum of the absolute value on [-1, 1], showing that scaled Chebyshev and Bernstein polynomials give the extremes.
The Family in the Structure of Values of Young People
ERIC Educational Resources Information Center
Rean, A. A.
2018-01-01
Despite the fact that the family is extremely significant in the system of values of young people (in Russia), the number of divorces is increasing in this population group. Our analysis of this contradiction establishes that young people need to be specially prepared for family life. The paper presents the results of a large empirical study…
Exact extreme-value statistics at mixed-order transitions.
Bar, Amir; Majumdar, Satya N; Schehr, Grégory; Mukamel, David
2016-05-01
We study extreme-value statistics for spatially extended models exhibiting mixed-order phase transitions (MOT). These are phase transitions that exhibit features common to both first-order (discontinuity of the order parameter) and second-order (diverging correlation length) transitions. We consider here the truncated inverse-distance-squared Ising model, which is a prototypical model exhibiting MOT, and study analytically the extreme-value statistics of the domain lengths. The lengths of the domains are identically distributed random variables, except for the global constraint that their sum equals the total system size L. In addition, the number of such domains is also a fluctuating variable, not fixed. In the paramagnetic phase, we show that the distribution of the largest domain length l_{max} converges, in the large-L limit, to a Gumbel distribution. However, at the critical point (for a certain range of parameters) and in the ferromagnetic phase, we show that the fluctuations of l_{max} are governed by novel distributions, which we compute exactly. Our main analytical results are verified by numerical simulations.
NASA Astrophysics Data System (ADS)
Dhakal, N.; Jain, S.
2013-12-01
Rare and unusually large events (such as hurricanes and floods) can create unusual and interesting trends in statistics. The Generalized Extreme Value (GEV) distribution is usually used to describe extreme rainfall events statistically. A number of recent studies have shown that the frequency of extreme rainfall events has increased over the last century and that, as a result, the parameters of the GEV distribution have changed with time (non-stationarity). But what impact does a single unusually large rainfall event (e.g., Hurricane Irene) have on the GEV parameters and, consequently, on the level of risk or the return periods used in designing civil infrastructure? In other words, if such a large event occurs today, how will it influence the level of risk (estimated from past rainfall records) for civil infrastructure? To answer these questions, we performed a sensitivity analysis of the GEV distribution parameters and the return periods with respect to unusually large outlier events. Long-term precipitation records over the period 1981-2010 from 12 USHCN stations across the state of Maine were used for the analysis. For most of the stations, the addition of each outlier event caused an increase in the shape parameter and a large decrease in the corresponding return period. This is a key consideration for time-varying engineering design. Such isolated extreme weather events should be considered alongside traditional statistical methodology for extreme events when designing civil infrastructure (such as dams, bridges, and culverts). Such analysis is also useful for understanding the statistical uncertainty of projecting extreme events into the future.
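The sensitivity experiment described above can be sketched with scipy: fit a GEV to an annual-maximum series, append one outsized event, and refit. The data and the size of the outlier are invented for illustration; note that scipy's shape parameter `c` equals minus the conventional ξ, so a heavier tail shows up as a more negative `c`.

```python
# Sketch: how a single outsized annual maximum shifts the fitted GEV and the
# 100-year return level. Synthetic data; scipy convention: c = -xi.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(42)
# 30 years of synthetic annual-maximum daily rainfall (mm)
annmax = genextreme.rvs(c=-0.1, loc=60, scale=15, size=30, random_state=rng)

def return_level(data, T=100):
    c, loc, scale = genextreme.fit(data)
    # value exceeded with probability 1/T in any given year
    return genextreme.isf(1.0 / T, c, loc, scale), c

rl_base, c_base = return_level(annmax)
# append one Irene-like outlier, far above the rest of the record, and refit
rl_out, c_out = return_level(np.append(annmax, annmax.max() * 2.5))
print(f"100-yr level: {rl_base:.0f} -> {rl_out:.0f} mm; scipy shape c: {c_base:.2f} -> {c_out:.2f}")
```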
The end of trend-estimation for extreme floods under climate change?
NASA Astrophysics Data System (ADS)
Schulz, Karsten; Bernhardt, Matthias
2016-04-01
An increased risk of flood events is one of the major threats under future climate change conditions. Therefore, many recent studies have investigated trends in the occurrence of extreme floods using historic long-term river discharge data as well as simulations from combined global/regional climate and hydrological models. Severe floods are relatively rare events, and a robust estimation of their probability of occurrence requires long time series of data. Following a method outlined by the IPCC research community, trends in extreme floods are calculated based on the difference in discharge values exceeding e.g. a 100-year level (Q100) between two 30-year windows, which represent prevailing conditions in a reference and a future time period, respectively. Following this approach, we analysed multiple synthetically derived 2,000-year trend-free annual-maximum runoff series generated using three different extreme value distributions (EVD). The parameters were estimated from long-term runoff data of four large European watersheds (Danube, Elbe, Rhine, Thames). Both the Q100 values estimated from 30-year moving windows and the subsequently derived trends showed enormous variation with time: for example, fitting the extreme value (Gumbel) distribution to the Danube data, trends of Q100 in the synthetic time series range from -4,480 to 4,028 m³/s per 100 years (Q100 = 10,071 m³/s, for reference). Similar results were found when applying the other extreme value distributions (Weibull and log-Normal) to all of the watersheds considered. This variability or "background noise" in estimated trends of flood extremes makes it almost impossible to distinguish any real trend, with statistical significance, in observed as well as modelled data when such an approach is applied. These uncertainties, even though known in principle, are hardly addressed and discussed by the climate change impact community.
Any decision making and flood risk management, including the dimensioning of flood protection measures, that is based on such studies might therefore be fundamentally flawed.
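The "background noise" argument can be reproduced in miniature: draw a long trend-free Gumbel series of annual maxima and watch how much the 100-year level estimated from successive 30-year windows wanders. The location and scale below are illustrative, not the fitted Danube parameters.

```python
# Sketch: window-to-window variability of Q100 estimates from a trend-free
# synthetic Gumbel series of annual maxima. Parameters are illustrative.
import numpy as np
from scipy.stats import gumbel_r

rng = np.random.default_rng(1)
series = gumbel_r.rvs(loc=5000, scale=1500, size=2000, random_state=rng)  # m³/s

q100 = []
for start in range(0, series.size - 30 + 1, 30):       # non-overlapping 30-yr windows
    loc, scale = gumbel_r.fit(series[start:start + 30])
    q100.append(gumbel_r.isf(0.01, loc, scale))         # estimated 100-year level
q100 = np.asarray(q100)
print(f"Q100 spread across 30-yr windows: {q100.min():.0f} to {q100.max():.0f} m³/s")
# Even with zero trend, differences between windows mimic spurious "trends".
```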
Baldi, Pierre
2010-01-01
As repositories of chemical molecules continue to expand and become more open, it becomes increasingly important to develop tools to search them efficiently and to assess the statistical significance of chemical similarity scores. Here we develop a general framework for understanding, modeling, predicting, and approximating the distribution of chemical similarity scores and its extreme values in large databases. The framework can be applied to different chemical representations and similarity measures but is demonstrated here using the most common binary fingerprints with the Tanimoto similarity measure. After introducing several probabilistic models of fingerprints, including the Conditional Gaussian Uniform model, we show that the distribution of Tanimoto scores can be approximated by the distribution of the ratio of two correlated Normal random variables associated with the corresponding unions and intersections. This remains true when the distribution of similarity scores is conditioned on the size of the query molecules, in order to derive more fine-grained results and improve chemical retrieval. The corresponding extreme value distributions for the maximum scores are approximated by Weibull distributions. From these various distributions and their analytical forms, Z-scores, E-values, and p-values are derived to assess the significance of similarity scores. In addition, the framework also allows one to predict the value of standard chemical retrieval metrics, such as Sensitivity and Specificity at fixed thresholds, or ROC (Receiver Operating Characteristic) curves at multiple thresholds, and to detect outliers in the form of atypical molecules. Numerous and diverse experiments, carried out in part with large sets of molecules from ChemDB, show remarkable agreement between theory and empirical results. PMID:20540577
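A rough empirical sketch of the framework's ingredients, with random binary fingerprints standing in for real molecules: compute Tanimoto scores against a small database and fit a Weibull distribution to the per-query maximum scores. Fingerprint length and bit density are arbitrary assumptions, not ChemDB settings.

```python
# Sketch: Tanimoto scores between random binary fingerprints, and a Weibull
# fit to per-query maximum scores (illustrative, not the paper's models).
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(7)
n_db, n_query, n_bits, p = 500, 50, 256, 0.1      # assumed sizes and bit density
db = rng.random((n_db, n_bits)) < p
queries = rng.random((n_query, n_bits)) < p

def tanimoto(a, B):
    """Tanimoto similarity of one fingerprint a against every row of B."""
    inter = (a & B).sum(axis=1)
    union = (a | B).sum(axis=1)
    return inter / np.maximum(union, 1)

max_scores = np.array([tanimoto(q, db).max() for q in queries])
c, loc, scale = weibull_min.fit(max_scores, floc=0)
print(f"per-query max Tanimoto in [{max_scores.min():.2f}, {max_scores.max():.2f}], Weibull shape {c:.1f}")
```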
Extreme geomagnetically induced currents
NASA Astrophysics Data System (ADS)
Kataoka, Ryuho; Ngwira, Chigomezyo
2016-12-01
We propose an emergency alert framework for geomagnetically induced currents (GICs), based on empirical extreme values and theoretical upper limits of the solar wind parameters and of dB/dt, the time derivative of magnetic field variations at the ground. We expect this framework to be useful for preparing against extreme events. Our analysis is based on a review of various papers, including those presented during the Extreme Space Weather Workshops held in Japan in 2011, 2012, 2013, and 2014. Large-amplitude dB/dt values are the major cause of hazards associated with three different types of GICs: (1) slow dB/dt with ring current evolution (RC-type), (2) fast dB/dt associated with auroral electrojet activity (AE-type), and (3) transient dB/dt of sudden commencements (SC-type). We set "caution," "warning," and "emergency" alert levels during the main phase of superstorms with a peak Dst index of less than -300 nT (once per 10 years), -600 nT (once per 60 years), or -900 nT (once per 100 years), respectively. The extreme dB/dt values of the AE-type GICs are 2000, 4000, and 6000 nT/min at the caution, warning, and emergency levels, respectively. For the SC-type GICs, a "transient alert" is also proposed for dB/dt values of 40 nT/s at low latitudes and 110 nT/s at high latitudes, especially when the solar energetic particle flux is unusually high.
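The alert thresholds quoted above can be collected into a small lookup; the branching logic (treating each Dst cut-off as inclusive) is our reading of the text, not code from the paper.

```python
# Sketch of the quoted alert thresholds as a lookup; the exact boundary
# handling is an assumption.
def storm_alert(peak_dst_nt: float) -> str:
    """Alert level from the peak Dst index (nT) of a superstorm's main phase."""
    if peak_dst_nt <= -900:
        return "emergency"   # ~once per 100 years
    if peak_dst_nt <= -600:
        return "warning"     # ~once per 60 years
    if peak_dst_nt <= -300:
        return "caution"     # ~once per 10 years
    return "no alert"

# AE-type dB/dt extremes (nT/min) associated with each level
AE_DBDT = {"caution": 2000, "warning": 4000, "emergency": 6000}

print(storm_alert(-750), AE_DBDT[storm_alert(-750)])  # warning 4000
```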
Improving power and robustness for detecting genetic association with extreme-value sampling design.
Chen, Hua Yun; Li, Mingyao
2011-12-01
Extreme-value sampling designs, which sample subjects with extremely large or small quantitative trait values, are commonly used in genetic association studies. Samples in such designs are often treated as "cases" and "controls" and analyzed using logistic regression. Such a case-control analysis ignores the potential dose-response relationship between the quantitative trait and the underlying trait locus and thus may lose power in detecting genetic association. An alternative approach to analyzing such data is to model the dose-response relationship by a linear regression model. However, parameter estimation from this model can be biased, which may lead to inflated type I errors. We propose a robust and efficient approach that takes into consideration both the biased sampling design and the potential dose-response relationship. Extensive simulations demonstrate that the proposed method is more powerful than the traditional logistic regression analysis and more robust than the linear regression analysis. We applied our method to a candidate gene association study of high-density lipoprotein cholesterol (HDL-C) that included study subjects with extremely high or low HDL-C levels. Using our method, we identified several SNPs showing stronger evidence of association with HDL-C than under the traditional case-control logistic regression analysis. Our results suggest that it is important to model the quantitative trait appropriately and to adjust for the biased sampling when a dose-response relationship exists in extreme-value sampling designs. © 2011 Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Rieder, H. E.; Staehelin, J.; Maeder, J. A.; Ribatet, M.; Davison, A. C.
2009-04-01
Various generations of satellites (e.g. TOMS, GOME, OMI) have made spatial datasets of column ozone available to the scientific community. This study has a special focus on column ozone over the northern mid-latitudes. Tools from geostatistics and extreme value theory are applied to analyze variability, long-term trends and frequency distributions of extreme events in total ozone. In a recent case study (Rieder et al., 2009), new tools from extreme value theory (Coles, 2001; Ribatet, 2007) were applied to the world's longest total ozone record, from Arosa, Switzerland (e.g. Staehelin 1998a,b), in order to describe extreme events in low and high total ozone. In the current study, this analysis is extended to satellite datasets for the northern mid-latitudes. Special emphasis is further given to patterns and spatial correlations and to the influence of changes in atmospheric dynamics (e.g. tropospheric and lower stratospheric pressure systems) on column ozone. References: Coles, S.: An Introduction to Statistical Modeling of Extreme Values, Springer Series in Statistics, ISBN:1852334592, Springer, Berlin, 2001. Ribatet, M.: POT: Modelling peaks over a threshold, R News, 7, 34-36, 2007. Rieder, H.E., Staehelin, J., Maeder, J.A., Ribatet, M., Stübi, R., Weihs, P., Holawe, F., Peter, T., and Davison, A.C.: From ozone mini holes and mini highs towards extreme value theory: New insights from extreme events and non stationarity, submitted to J. Geophys. Res., 2009. Staehelin, J., Kegel, R., and Harris, N. R.: Trend analysis of the homogenized total ozone series of Arosa (Switzerland), 1929-1996, J. Geophys. Res., 103(D7), 8389-8400, doi:10.1029/97JD03650, 1998a. Staehelin, J., Renaud, A., Bader, J., McPeters, R., Viatte, P., Hoegger, B., Bugnion, V., Giroud, M., and Schill, H.: Total ozone series at Arosa (Switzerland): Homogenization and data comparison, J. Geophys. Res., 103(D5), 5827-5842, doi:10.1029/97JD02402, 1998b.
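The peaks-over-threshold modelling behind the cited POT package can be sketched with scipy instead of R: fit a Generalized Pareto Distribution (GPD) to exceedances of a high threshold. The "ozone" series below is synthetic noise, and declustering is skipped.

```python
# Sketch: peaks-over-threshold with a GPD fit, scipy instead of the R POT
# package. Synthetic data; declustering and model checking are omitted.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(3)
ozone = rng.normal(330, 40, size=5000)        # synthetic daily column ozone (DU)
threshold = np.quantile(ozone, 0.95)          # high threshold (95th percentile)
excess = ozone[ozone > threshold] - threshold

shape, loc, scale = genpareto.fit(excess, floc=0)
# level exceeded on average once per 1000 days, given the observed exceedance rate
rate = excess.size / ozone.size
level = threshold + genpareto.isf(1.0 / (1000 * rate), shape, 0, scale)
print(f"threshold {threshold:.0f} DU, estimated 1000-day level {level:.0f} DU")
```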
NASA Astrophysics Data System (ADS)
Gao, Tao; Xie, Lian
2016-12-01
Precipitation extremes are the dominant cause of severe flood disasters at regional and local scales under global climate change. In the present study, five annual extreme precipitation indices, namely the 1-, 7- and 30-day annual maximum rainfall and the 95th and 97.5th percentile threshold levels, are analyzed for the reference period 1960-2011 using 140 meteorological stations over the Yangtze River basin (YRB). A generalized extreme value (GEV) distribution is fitted to the annual and percentile extreme precipitation events at each station, with return periods of up to 200 years. The entire record is divided into a preclimatic (preceding climatic) period, 1960-1980, and an aftclimatic (after climatic) period, 1981-2011, reflecting the distinctly abrupt shift of the precipitation regime in the late 1970s across the YRB. The Mann-Kendall trend test is applied to both subperiods in order to explore possible increasing or decreasing patterns in precipitation extremes. The results indicate that the increasing trends in return values during the aftclimatic period vary significantly in time and space with the magnitude of extreme precipitation. Stations with significantly positive trends are mainly distributed in the vicinity of the mainstream, the major tributaries and large lakes; this could result in more severe flood disasters in the mid-lower reaches of the YRB, especially in the southeast coastal regions. Linear trends based on annual maximum precipitation are also investigated in the pre- and aftclimatic periods; these changes do not closely mirror the variations in return values during either subperiod. Moreover, spatiotemporal patterns of precipitation extremes become more uneven and unstable in the second half of the record over the YRB.
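The Mann-Kendall trend test used above is simple enough to write out directly; this plain-numpy version uses the normal approximation and ignores the tie correction.

```python
# Sketch of the Mann-Kendall trend test: S statistic, normal approximation
# (no tie correction), two-sided p-value. Toy series, not station data.
import numpy as np
from math import erf, sqrt

def mann_kendall(x):
    x = np.asarray(x, dtype=float)
    n = x.size
    s = np.sum([np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n)])
    var = n * (n - 1) * (2 * n + 5) / 18.0            # tie correction omitted
    z = (s - np.sign(s)) / sqrt(var) if s != 0 else 0.0
    p = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))   # two-sided p-value
    return s, z, p

rng = np.random.default_rng(5)
trendy = np.arange(40) * 0.5 + rng.normal(0, 2, 40)   # clear upward trend
flat = rng.normal(0, 2, 40)                            # no trend

s_t, z_t, p_t = mann_kendall(trendy)
s_f, z_f, p_f = mann_kendall(flat)
print(f"trend series p = {p_t:.2e}; flat series p = {p_f:.2f}")
```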
Spatial extreme value analysis to project extremes of large-scale indicators for severe weather
Gilleland, Eric; Brown, Barbara G; Ammann, Caspar M
2013-01-01
Concurrently high values of the maximum potential wind speed of updrafts (Wmax) and 0–6 km wind shear (Shear) have been found to represent environments conducive to severe weather, which subsequently provides a way to study severe weather in future climates. Here, we employ a model for the product of these variables (WmSh) from the National Center for Atmospheric Research/United States National Centers for Environmental Prediction reanalysis over North America, conditioned on their having extreme energy in the spatial field, in order to project the predominant spatial patterns of WmSh. The approach is based on the Heffernan and Tawn conditional extreme value model. Results suggest that this technique estimates the spatial behavior of WmSh well, which allows for exploring possible changes in the patterns over time. While the model enables a method for inferring the uncertainty in the patterns, such analysis is difficult with the currently available inference approach. A variation of the method is also explored to investigate how this type of model might be used to qualitatively understand how the spatial patterns of WmSh correspond to extreme river flow events. A case study of river flows from three rivers in northwestern Tennessee is presented, and it is found that advection of WmSh from the Gulf of Mexico prevails during such extreme events, while elsewhere WmSh is generally very low. © 2013 The Authors. Environmetrics published by John Wiley & Sons, Ltd. PMID:24223482
Complex extreme learning machine applications in terahertz pulsed signals feature sets.
Yin, X-X; Hadjiloucas, S; Zhang, Y
2014-11-01
This paper presents a novel approach to the automatic classification of very large data sets of terahertz pulse transient signals, highlighting their potential use in biochemical, biomedical, pharmaceutical and security applications. Two different types of THz spectra are considered in the classification process. First, a binary classification study of poly-A and poly-C ribonucleic acid samples is performed. This is then contrasted with a difficult multi-class classification problem of spectra from six different powder samples that, although fairly indistinguishable in the optical spectrum, possess a few discernible spectral features in the terahertz part of the spectrum. Classification is performed using a complex-valued extreme learning machine algorithm that takes into account features in both the amplitude and the phase of the recorded spectra. Classification speed and accuracy are contrasted with those achieved using a support vector machine classifier. The study systematically compares the classifier performance achieved after adopting different Gaussian kernels when separating amplitude and phase signatures. The two signatures are presented as feature vectors for both training and testing purposes. The study confirms the utility of complex-valued extreme learning machine algorithms for the classification of the very large data sets generated with current terahertz imaging spectrometers. The classifier can take into consideration heterogeneous layers within an object, as would be required in a tomographic setting, and is sufficiently robust to detect patterns hidden inside noisy terahertz data sets. The proposed study opens up the opportunity to establish complex-valued extreme learning machine algorithms as new chemometric tools that will assist the wider proliferation of terahertz sensing technology for chemical sensing, quality control, security screening and clinical diagnosis.
Furthermore, the proposed algorithm should also be useful in other applications requiring the classification of very large datasets. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
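A minimal real-valued extreme learning machine shows the core idea behind the classifier discussed above: a random, untrained hidden layer followed by a one-shot least-squares fit of the output weights. This is a generic ELM sketch on toy data, not the complex-valued THz variant of the paper.

```python
# Sketch: real-valued extreme learning machine on a toy two-class problem.
# Random fixed hidden layer; output weights solved by least squares.
import numpy as np

rng = np.random.default_rng(0)

def elm_train(X, y, n_hidden=50):
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights, never trained
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)                        # hidden-layer activations
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # one-shot least-squares solve
    return W, b, beta

def elm_predict(model, X):
    W, b, beta = model
    return np.tanh(X @ W + b) @ beta

# two well-separated Gaussian blobs, labels -1 / +1
X = np.vstack([rng.normal(-2, 1, (100, 2)), rng.normal(2, 1, (100, 2))])
y = np.r_[-np.ones(100), np.ones(100)]
model = elm_train(X, y)
acc = np.mean(np.sign(elm_predict(model, X)) == y)
print(f"training accuracy: {acc:.2f}")
```

The "extreme" speed claim comes from this design: only `beta` is learned, and it has a closed-form solution.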
The large sample size fallacy.
Lantz, Björn
2013-06-01
Significance in the statistical sense has little to do with significance in the common practical sense. Statistical significance is a necessary but not a sufficient condition for practical significance. Hence, results that are extremely statistically significant may be highly nonsignificant in practice. The degree of practical significance is generally determined by the size of the observed effect, not the p-value. The results of studies based on large samples are often characterized by extreme statistical significance despite small or even trivial effect sizes. Interpreting such results as significant in practice without further analysis is referred to as the large sample size fallacy in this article. The aim of this article is to explore the relevance of the large sample size fallacy in contemporary nursing research. Relatively few nursing articles display explicit measures of observed effect sizes or include a qualitative discussion of observed effect sizes. Statistical significance is often treated as an end in itself. Effect sizes should generally be calculated and presented along with p-values for statistically significant results, and observed effect sizes should be discussed qualitatively through direct and explicit comparisons with the effects in related literature. © 2012 Nordic College of Caring Science.
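The fallacy is easy to demonstrate numerically: with a large enough sample, a trivially small mean difference yields an "extremely significant" p-value while the standardized effect size (Cohen's d) stays negligible. The simulation parameters here are arbitrary.

```python
# Sketch: large n makes a trivial effect "extremely significant".
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(11)
n = 1_000_000
a = rng.normal(0.00, 1.0, n)
b = rng.normal(0.02, 1.0, n)   # true effect: 2% of a standard deviation

t, p = ttest_ind(a, b)
# Cohen's d with the pooled standard deviation
d = (b.mean() - a.mean()) / np.sqrt((a.var(ddof=1) + b.var(ddof=1)) / 2)
print(f"p = {p:.1e}, Cohen's d = {d:.3f}")  # tiny p-value, trivial effect size
```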
Eum, Hyung-Il; Gachon, Philippe; Laprise, René
2016-01-01
This study examined the impact of model biases on climate change signals for daily precipitation and for minimum and maximum temperatures. Through the use of multiple climate scenarios from 12 regional climate model simulations, the ensemble mean, and three synthetic simulations generated by a weighting procedure, we investigated intermodel seasonal climate change signals between current and future periods, for both median and extreme precipitation/temperature values. A significant dependence of seasonal climate change signals on the model biases over southern Québec in Canada was detected for temperatures, but not for precipitation. This suggests that the regional temperature change signal is affected by local processes. Seasonally, model bias affects future mean and extreme values in winter and summer. In addition, potentially large increases in future extremes of temperature and precipitation values were projected. For three synthetic scenarios, systematically less bias and a narrow range of mean change for all variables were projected compared to those of climate model simulations. In addition, synthetic scenarios were found to better capture the spatial variability of extreme cold temperatures than the ensemble mean scenario. Finally, these results indicate that the synthetic scenarios have greater potential to reduce the uncertainty of future climate projections and capture the spatial variability of extreme climate events.
Extreme-value statistics of work done in stretching a polymer in a gradient flow.
Vucelja, M; Turitsyn, K S; Chertkov, M
2015-02-01
We analyze the statistics of work generated by a gradient flow to stretch a nonlinear polymer. We obtain the large deviation function (LDF) of the work in the full range of appropriate parameters by combining analytical and numerical tools. The LDF shows two distinct asymptotes: "near tails" are linear in work and dominated by coiled polymer configurations, while "far tails" are quadratic in work and correspond to preferentially fully stretched polymers. We find the extreme value statistics of work for several singular elastic potentials, as well as the mean and the dispersion of work near the coil-stretch transition. The dispersion shows a maximum at the transition.
Impact of possible climate changes on river runoff under different natural conditions
NASA Astrophysics Data System (ADS)
Gusev, Yeugeniy M.; Nasonova, Olga N.; Kovalev, Evgeny E.; Ayzel, Georgy V.
2018-06-01
The present study was carried out within the framework of the International Inter-Sectoral Impact Model Intercomparison Project (ISI-MIP) for 11 large river basins located on different continents of the globe, under a wide variety of natural conditions. The aim of the study was to investigate possible changes in various characteristics of annual river runoff (mean values, standard deviations, frequency of extreme annual runoff) up to 2100, on the basis of the land surface model SWAP and meteorological projections simulated by five General Circulation Models (GCMs) under four RCP scenarios. Analysis of the obtained results has shown that changes in climatic runoff differ (both in magnitude and sign) among river basins located in different regions of the planet, due to differences in natural (primarily climatic) conditions. The climatic elasticities of river runoff to changes in air temperature and precipitation were estimated, which makes it possible, as a first approximation, to project changes in climatic values of annual runoff using the projected changes in mean annual air temperature and annual precipitation for the river basins. It was found that for most rivers under study, the frequency of occurrence of extreme runoff values increases. This is true both for extremely high runoff (when the projected climatic runoff increases) and for extremely low values (when the projected climatic runoff decreases).
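The elasticity-based first approximation mentioned above can be written as a one-line formula, ΔQ/Q ≈ ε_P·ΔP/P + ε_T·ΔT. The elasticity values below are hypothetical placeholders, not the SWAP-derived estimates from the study.

```python
# Sketch: first-approximation runoff change from climatic elasticities.
# eps_P (dimensionless) and eps_T (per °C) are hypothetical values.
def runoff_change(dP_frac, dT_degC, eps_P=2.0, eps_T=-0.05):
    """Fractional change in climatic annual runoff: eps_P*dP/P + eps_T*dT."""
    return eps_P * dP_frac + eps_T * dT_degC

# e.g. +10% precipitation and +2 °C warming under the assumed elasticities
print(f"{runoff_change(0.10, 2.0):+.0%}")  # +10%
```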
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Jiali; Han, Yuefeng; Stein, Michael L.
2016-02-10
The Weather Research and Forecast (WRF) model downscaling skill in extreme maximum daily temperature is evaluated by using the generalized extreme value (GEV) distribution. While the GEV distribution has been used extensively in climatology and meteorology for estimating probabilities of extreme events, accurately estimating GEV parameters based on data from a single pixel can be difficult, even with fairly long data records. This work proposes a simple method assuming that the shape parameter, the most difficult of the three parameters to estimate, does not vary over a relatively large region. This approach is applied to evaluate 31-year WRF-downscaled extreme maximum temperature through comparison with North American Regional Reanalysis (NARR) data. Uncertainty in GEV parameter estimates and the statistical significance in the differences of estimates between WRF and NARR are accounted for by conducting bootstrap resampling. Despite certain biases over parts of the United States, overall, WRF shows good agreement with NARR in the spatial pattern and magnitudes of GEV parameter estimates. Both WRF and NARR show a significant increase in extreme maximum temperature over the southern Great Plains and southeastern United States in January and over the western United States in July. The GEV model shows clear benefits from the regionally constant shape parameter assumption, for example, leading to estimates of the location and scale parameters of the model that show coherent spatial patterns.
Understanding extreme sea levels for coastal impact and adaptation analysis
NASA Astrophysics Data System (ADS)
Wahl, T.; Haigh, I. D.; Nicholls, R. J.; Arns, A.; Hinkel, J.; Dangendorf, S.; Slangen, A.
2016-12-01
Coastal impact and adaptation assessments require detailed knowledge on extreme sea levels, because increasing damage due to extreme events, such as storm surges and tropical cyclones, is one of the major consequences of sea level rise and climate change. In fact, the IPCC highlighted in its AR4 report that "societal impacts of sea level change primarily occur via the extreme levels rather than as a direct consequence of mean sea level changes". Over the last few decades, substantial research efforts have been directed towards improved understanding of past and future mean sea level; different scenarios were developed with process-based or semi-empirical models and used for coastal impact assessments at various spatial scales to guide coastal management and adaptation efforts. The uncertainties in future sea level rise are typically accounted for by analyzing the impacts associated with a range of scenarios leading to a vertical displacement of the distribution of extreme sea levels. Indeed, most regional and global studies find little or no evidence for changes in storminess with climate change, although there is still low confidence in the results. However, and much more importantly, there is still a limited understanding of present-day extreme sea levels, which is largely ignored in most impact and adaptation analyses. The two key uncertainties stem from: (1) the numerical models used to generate long time series of extreme sea levels, whose bias varies spatially, can reach values much larger than the expected sea level rise, and can be accounted for in most regions by making use of in-situ measurements; and (2) the statistical models used to determine present-day extreme sea-level exceedance probabilities.
There is no universally accepted approach to obtain such values for flood risk assessments and while substantial research has explored inter-model uncertainties for mean sea level, we explore here, for the first time, inter-model uncertainties for extreme sea-levels at large spatial scales and compare them to the uncertainties in mean sea level projections.
NASA Astrophysics Data System (ADS)
Shih, Hong-Yan; Goldenfeld, Nigel
Experiments on transitional turbulence in pipe flow seem to show that turbulence is a transient metastable state since the measured mean lifetime of turbulence puffs does not diverge asymptotically at a critical Reynolds number. Yet measurements reveal that the lifetime scales with Reynolds number in a super-exponential way reminiscent of extreme value statistics, and simulations and experiments in Couette and channel flow exhibit directed percolation type scaling phenomena near a well-defined transition. This universality class arises from the interplay between small-scale turbulence and a large-scale collective zonal flow, which exhibit predator-prey behavior. Why is asymptotically divergent behavior not observed? Using directed percolation and a stochastic individual level model of predator-prey dynamics related to transitional turbulence, we investigate the relation between extreme value statistics and power law critical behavior, and show that the paradox is resolved by carefully defining what is measured in the experiments. We theoretically derive the super-exponential scaling law, and using finite-size scaling, show how the same data can give both super-exponential behavior and power-law critical scaling.
Assessment of extreme value distributions for maximum temperature in the Mediterranean area
NASA Astrophysics Data System (ADS)
Beck, Alexander; Hertig, Elke; Jacobeit, Jucundus
2015-04-01
Extreme maximum temperatures highly affect the natural as well as the societal environment. Heat stress has great effects on flora, fauna and humans and culminates in heat-related morbidity and mortality. Agriculture and different industries are also severely affected by extreme air temperatures. Under climate change conditions it is all the more necessary to detect potential hazards arising from changes in the distributional parameters of extreme values, and this is especially relevant for the Mediterranean region, which is characterized as a climate change hot spot. Therefore, statistical approaches are developed to estimate these parameters with a focus on non-stationarities emerging in the relationship between regional climate variables and their large-scale predictors, such as sea level pressure, geopotential heights, atmospheric temperatures and relative humidity. Gridded maximum temperature data from the daily E-OBS dataset (Haylock et al., 2008) with a spatial resolution of 0.25° x 0.25° from January 1950 until December 2012 are the predictands for the present analyses. An s-mode principal component analysis (PCA) has been performed in order to reduce the data dimension and to retain different regions of similar maximum temperature variability; the grid box with the highest PC loading represents the corresponding principal component. A central part of the analyses is the model development for temperature extremes using extreme value statistics. A combined model is derived, consisting of a Generalized Pareto Distribution (GPD) model and a quantile regression (QR) model which determines the GPD location parameters. The QR model as well as the scale parameters of the GPD model are conditioned on various large-scale predictor variables. In order to account for potential non-stationarities in the predictor-temperature relationships, a special calibration and validation scheme is applied. Haylock, M. R., N. Hofstra, A. M. G. Klein Tank, E. J. Klok, P. 
D. Jones, and M. New (2008), A European daily high-resolution gridded data set of surface temperature and precipitation for 1950 - 2006, J. Geophys. Res., 113, D20119, doi:10.1029/2008JD010201.
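A bare-bones sketch of the peaks-over-threshold ingredient of such a model follows (Python; a fixed empirical 95% quantile stands in for the QR-estimated, predictor-dependent location, and the method-of-moments fit and synthetic data are illustrative assumptions, not the paper's procedure):

```python
import math
import random

def gpd_fit_mom(excesses):
    """Method-of-moments fit of a Generalized Pareto Distribution
    to threshold excesses; returns (shape xi, scale sigma)."""
    n = len(excesses)
    m = sum(excesses) / n
    v = sum((x - m) ** 2 for x in excesses) / (n - 1)
    xi = 0.5 * (1 - m * m / v)
    sigma = m * (1 - xi)
    return xi, sigma

def gpd_exceed_prob(x, u, xi, sigma):
    """P(X > x | X > u) under the fitted GPD."""
    z = (x - u) / sigma
    if abs(xi) < 1e-9:
        return math.exp(-z)        # exponential limit of the GPD
    base = 1 + xi * z
    if base <= 0:                  # beyond the upper endpoint when xi < 0
        return 0.0
    return base ** (-1 / xi)

random.seed(0)
temps = [30 + random.expovariate(0.5) for _ in range(2000)]  # synthetic daily Tmax
u = sorted(temps)[int(0.95 * len(temps))]  # fixed 95% threshold stands in for QR
excesses = [t - u for t in temps if t > u]
xi, sigma = gpd_fit_mom(excesses)
print(gpd_exceed_prob(u + 2.0, u, xi, sigma) < 1.0)  # → True
```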
Multifractal Value at Risk model
NASA Astrophysics Data System (ADS)
Lee, Hojin; Song, Jae Wook; Chang, Woojin
2016-06-01
In this paper a new Value at Risk (VaR) model is proposed and investigated. We consider the multifractal property of financial time series and develop a multifractal Value at Risk (MFVaR) model. The MFVaR introduced in this paper is analytically tractable and not based on simulation. An empirical study showed that MFVaR provides more stable and accurate forecasting performance in volatile financial markets where large losses can be incurred. This implies that our multifractal VaR works well for the risk measurement of extreme credit events.
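For context, the conventional baseline that a model like MFVaR is compared against can be as simple as historical-simulation VaR. The sketch below (Python, with invented returns) is that standard baseline, not the multifractal model itself:

```python
import math

def historical_var(returns, alpha=0.99):
    """One-period historical-simulation VaR at confidence level alpha:
    the loss threshold exceeded with probability 1 - alpha."""
    losses = sorted(-r for r in returns)       # losses as positive numbers
    k = math.ceil(alpha * len(losses)) - 1     # empirical alpha-quantile index
    return losses[k]

# ten illustrative daily returns
returns = [0.01, -0.02, 0.005, -0.04, 0.015, -0.01, 0.02, -0.03, 0.0, 0.01]
print(historical_var(returns, alpha=0.9))  # → 0.03
```

Simulation-free models such as MFVaR aim to beat this baseline precisely in the volatile regimes where the empirical quantile of a short window is least reliable.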
Characterization and prediction of extreme events in turbulence
NASA Astrophysics Data System (ADS)
Fonda, Enrico; Iyer, Kartik P.; Sreenivasan, Katepalli R.
2017-11-01
Extreme events in Nature such as tornadoes, large floods and strong earthquakes are rare but can have devastating consequences. The predictability of these events is very limited at present. Extreme events in turbulence are the very large events in small scales that are intermittent in character. We examine events in energy dissipation rate and enstrophy which are several tens to hundreds to thousands of times the mean value. To this end we use our DNS database of homogeneous and isotropic turbulence with Taylor Reynolds numbers spanning a decade, computed with different small scale resolutions and different box sizes, and study the predictability of these events using machine learning. We start with an aggressive data augmentation to virtually increase the number of these rare events by two orders of magnitude and train a deep convolutional neural network to predict their occurrence in an independent data set. The goal of the work is to explore whether extreme events can be predicted with greater assurance than can be done by conventional methods (e.g., D.A. Donzis & K.R. Sreenivasan, J. Fluid Mech. 647, 13-26, 2010).
Amplification of Angular Rotations Using Weak Measurements
NASA Astrophysics Data System (ADS)
Magaña-Loaiza, Omar S.; Mirhosseini, Mohammad; Rodenburg, Brandon; Boyd, Robert W.
2014-05-01
We present a weak measurement protocol that permits a sensitive estimation of angular rotations based on the concept of weak-value amplification. The shift in the state of a pointer, in both angular position and the conjugate orbital angular momentum bases, is used to estimate angular rotations. This is done by an amplification of both the real and imaginary parts of the weak-value of a polarization operator that has been coupled to the pointer, which is a spatial mode, via a spin-orbit coupling. Our experiment demonstrates the first realization of weak-value amplification in the azimuthal degree of freedom. We have achieved effective amplification factors as large as 100, providing a sensitivity that is on par with more complicated methods that employ quantum states of light or extremely large values of orbital angular momentum.
Nonlinear Large-Deflection Boundary-Value Problems of Rectangular Plates
1948-03-01
[Garbled OCR of the report's notation section: it defines nondimensional forms of the extreme-fiber bending and shearing stresses, the membrane strains in the middle surface, and the median-fiber stresses and strains expressed through the stress function F.]
NASA Astrophysics Data System (ADS)
Mentaschi, Lorenzo; Vousdoukas, Michalis; Voukouvalas, Evangelos; Sartini, Ludovica; Feyen, Luc; Besio, Giovanni; Alfieri, Lorenzo
2016-09-01
Statistical approaches to study extreme events require, by definition, long time series of data. In many scientific disciplines, these series are often subject to variations at different temporal scales that affect the frequency and intensity of their extremes. Therefore, the assumption of stationarity is violated and alternative methods to conventional stationary extreme value analysis (EVA) must be adopted. Using the example of environmental variables subject to climate change, in this study we introduce the transformed-stationary (TS) methodology for non-stationary EVA. This approach consists of (i) transforming a non-stationary time series into a stationary one, to which the stationary EVA theory can be applied, and (ii) reverse transforming the result into a non-stationary extreme value distribution. As a transformation, we propose and discuss a simple time-varying normalization of the signal and show that it enables a comprehensive formulation of non-stationary generalized extreme value (GEV) and generalized Pareto distribution (GPD) models with a constant shape parameter. A validation of the methodology is carried out on time series of significant wave height, residual water level, and river discharge, which show varying degrees of long-term and seasonal variability. The results from the proposed approach are comparable with the results from (a) a stationary EVA on quasi-stationary slices of non-stationary series and (b) the established method for non-stationary EVA. However, the proposed technique comes with advantages in both cases. For example, in contrast to (a), the proposed technique uses the whole time horizon of the series for the estimation of the extremes, allowing for a more accurate estimation of large return levels. Furthermore, with respect to (b), it decouples the detection of non-stationary patterns from the fitting of the extreme value distribution. 
As a result, the steps of the analysis are simplified and intermediate diagnostics are possible. In particular, the transformation can be carried out by means of simple statistical techniques such as low-pass filters based on the running mean and the standard deviation, and the fitting procedure is a stationary one with a few degrees of freedom and is easy to implement and control. An open-source MATLAB toolbox has been developed to cover this methodology, which is available at https://github.com/menta78/tsEva/ (Mentaschi et al., 2016).
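The transformation step can be sketched with the running-mean/standard-deviation normalization the authors mention (Python; the window length, the synthetic trend, and the stationary return level y* are illustrative assumptions, and the authors' actual implementation is the MATLAB tsEva toolbox linked above):

```python
import math
import random

def running_stats(x, win):
    """Slow but transparent running mean and standard deviation."""
    means, stds = [], []
    for i in range(len(x)):
        lo, hi = max(0, i - win), min(len(x), i + win + 1)
        w = x[lo:hi]
        m = sum(w) / len(w)
        s = math.sqrt(sum((v - m) ** 2 for v in w) / max(1, len(w) - 1))
        means.append(m)
        stds.append(s if s > 0 else 1.0)
    return means, stds

random.seed(2)
# a non-stationary series: slow upward trend plus noise
series = [0.003 * t + random.gauss(0, 1) for t in range(2000)]
mu_t, sd_t = running_stats(series, win=200)
stationary = [(x - m) / s for x, m, s in zip(series, mu_t, sd_t)]
# stationary EVA is applied to `stationary`; a return level y* found there
# back-transforms to the time-varying level x*(t) = mu_t[t] + sd_t[t] * y*
y_star = 3.0  # illustrative stationary return level
x_star_end = mu_t[-1] + sd_t[-1] * y_star
print(x_star_end > y_star)  # → True (the trend has raised the local level)
```

The back-transformation is what turns a single stationary fit into a full non-stationary extreme value distribution, which is the decoupling advantage described above.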
NASA Technical Reports Server (NTRS)
Levy, Samuel; Krupen, Philip
1943-01-01
The von Karman equations for flat plates are solved beyond the buckling load up to edge strains equal to eight times the buckling strain, for the extreme case of rigid clamping along the edges parallel to the load. Deflections, bending stresses, and membrane stresses are given as functions of the end compressive load. The theoretical values of effective width are compared with the values derived for simple support along the edges parallel to the load. The increase in effective width due to rigid clamping drops from about 20 percent near the buckling strain to about 8 percent at an edge strain equal to eight times the buckling strain. Experimental values of effective width in the elastic range reported in NACA Technical Note No. 684 lie between the theoretical curves for the extremes of simple support and rigid clamping.
Extreme events and event size fluctuations in biased random walks on networks.
Kishore, Vimal; Santhanam, M S; Amritkar, R E
2012-05-01
Random walks on discrete lattices are important for understanding various types of transport processes. Extreme events, defined as exceedances of the flux of walkers above a prescribed threshold, have been studied recently in the context of complex networks. This was motivated by the occurrence of rare events such as traffic jams, floods, and power blackouts, which take place on networks. In this work, we study extreme events in a generalized random walk model in which the walk is preferentially biased by the network topology: the walkers preferentially choose to hop toward the hubs or toward small-degree nodes. In this setting, we show that extremely large fluctuations in event sizes are possible on small-degree nodes when the walkers are biased toward the hubs. In particular, we obtain the distribution of event sizes on the network. Further, the probability of occurrence of extreme events on any node in the network depends on its "generalized strength," a measure of the ability of a node to attract walkers, which is a function of the degree of the node and that of its nearest neighbors. We obtain analytical and simulation results for the probability of occurrence of extreme events on the nodes of a network using a generalized random walk model. The results reveal that nodes with a larger value of generalized strength, on average, display a lower probability of extreme events than nodes with lower values of generalized strength.
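A toy version of such a biased walk (Python; the tiny graph, walker count, and bias exponent are illustrative assumptions, not the paper's setup) shows the hub attracting far more average flux than a low-degree node, which is why a fixed threshold corresponds to a much larger relative fluctuation on the small-degree node:

```python
import random
from collections import Counter

random.seed(3)
# a small star-like graph: hub 0 joined to leaves 1..9, plus one leaf-leaf edge
edges = [(0, i) for i in range(1, 10)] + [(1, 2)]
adj = {}
for a, b in edges:
    adj.setdefault(a, []).append(b)
    adj.setdefault(b, []).append(a)

alpha = 1.0  # alpha > 0 biases hops toward high-degree neighbors

def step(node):
    """One hop, with probability proportional to neighbor degree ** alpha."""
    nbrs = adj[node]
    wts = [len(adj[n]) ** alpha for n in nbrs]
    return random.choices(nbrs, weights=wts)[0]

walkers = [random.choice(list(adj)) for _ in range(200)]
counts = Counter()
steps = 300
for _ in range(steps):
    walkers = [step(w) for w in walkers]
    counts.update(walkers)  # accumulate per-node flux

mean_flux = {n: counts[n] / steps for n in adj}
print(mean_flux[0] > mean_flux[5])  # → True (hub carries far more flux than a leaf)
```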
Application of Radar-Rainfall Estimates to Probable Maximum Precipitation in the Carolinas
NASA Astrophysics Data System (ADS)
England, J. F.; Caldwell, R. J.; Sankovich, V.
2011-12-01
Extreme storm rainfall data are essential in the assessment of potential impacts on design precipitation amounts, which are used in flood design criteria for dams and nuclear power plants. Probable Maximum Precipitation (PMP) from National Weather Service Hydrometeorological Report 51 (HMR51) is currently used for design rainfall estimates in the eastern U.S. The extreme storm database associated with the report has not been updated since the early 1970s. In the past several decades, several extreme precipitation events have occurred that have the potential to alter the PMP values, particularly across the Southeast United States (e.g., Hurricane Floyd 1999). Unfortunately, these and other large precipitation-producing storms have not been analyzed with the detail required for application in design studies. This study focuses on warm-season tropical cyclones (TCs) in the Carolinas, as these systems are the critical maximum rainfall mechanisms in the region. The goal is to discern if recent tropical events may have reached or exceeded current PMP values. We have analyzed 10 storms using modern datasets and methodologies that provide enhanced spatial and temporal resolution relative to point measurements used in past studies. Specifically, hourly multisensor precipitation reanalysis (MPR) data are used to estimate storm total precipitation accumulations at various durations throughout each storm event. The accumulated grids serve as input to depth-area-duration calculations. Individual storms are then maximized using back-trajectories to determine source regions for moisture. The development of open source software has made this process time and resource efficient. Based on the current methodology, two of the ten storms analyzed have the potential to challenge HMR51 PMP values. 
Maximized depth-area curves for Hurricane Floyd indicate exceedance at 24- and 72-hour durations for large area sizes, while Hurricane Fran (1996) appears to exceed PMP at large area sizes for short-duration, 6-hour storms. Utilizing new methods and data, however, requires careful consideration of the potential limitations and caveats associated with the analysis and further evaluation of the newer storms within the context of historical storms from HMR51. Here, we provide a brief background on extreme rainfall in the Carolinas, along with an overview of the methods employed for converting MPR to depth-area relationships. Discussion of the issues and limitations, evaluation of the various techniques, and comparison to HMR51 storms and PMP values are also presented.
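The depth-duration building block of those depth-area-duration calculations is a running-window maximum over accumulated precipitation, which can be sketched as follows (Python; the hourly totals are invented for illustration, not MPR data):

```python
def max_duration_depth(hourly, hours):
    """Maximum running accumulation over a fixed duration (depth-duration value)."""
    cur = sum(hourly[:hours])
    best = cur
    for i in range(hours, len(hourly)):
        cur += hourly[i] - hourly[i - hours]  # slide the window by one hour
        best = max(best, cur)
    return best

storm = [0, 2, 5, 12, 30, 22, 8, 3, 1, 0, 0, 4]  # hourly rain (mm), illustrative
print(max_duration_depth(storm, 6))  # → 80
```

Repeating this over many durations and nested areas, then maximizing in-place moisture, yields the depth-area-duration curves compared against HMR51 PMP.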
Replica and extreme-value analysis of the Jarzynski free-energy estimator
NASA Astrophysics Data System (ADS)
Palassini, Matteo; Ritort, Felix
2008-03-01
We analyze the Jarzynski estimator of free-energy differences from nonequilibrium work measurements. By a simple mapping onto Derrida's Random Energy Model, we obtain a scaling limit for the expectation of the bias of the estimator. We then derive analytical approximations in three different regimes of the scaling parameter x = log(N)/W, where N is the number of measurements and W the mean dissipated work. Our approach is valid for a generic distribution of the dissipated work, and is based on a replica symmetry breaking scheme for x >> 1, the asymptotic theory of extreme value statistics for x << 1, and a direct approach for x near one. The combination of the three analytic approximations describes well Monte Carlo data for the expectation value of the estimator, for a wide range of values of N, from N=1 to large N, and for different work distributions. Based on these results, we introduce improved free-energy estimators and discuss the application to the analysis of experimental data.
Geoelectric Hazard Maps for the Continental United States
NASA Technical Reports Server (NTRS)
Love, Jeffrey J.; Pulkkinen, Antti; Bedrosian, Paul A.; Jonas, Seth; Kelbert, Anna; Rigler, Joshua E.; Finn, Carol A.; Balch, Christopher C.; Rutledge, Robert; Waggle, Richard M.
2016-01-01
In support of a multiagency project for assessing induction hazards, we present maps of extreme-value geoelectric amplitudes over about half of the continental United States. These maps are constructed using a parameterization of induction: estimates of Earth surface impedance, obtained at discrete geographic sites from magnetotelluric survey data, are convolved with latitude-dependent statistical maps of extreme-value geomagnetic activity, obtained from decades of magnetic observatory data. Geoelectric amplitudes are estimated for geomagnetic waveforms having a 240 s sinusoidal period and amplitudes over 10 min that exceed a once-per-century threshold. As a result of the combination of geographic differences in geomagnetic activity and Earth surface impedance, once-per-century geoelectric amplitudes span more than 2 orders of magnitude and are an intricate function of location. For north-south induction, once-per-century geoelectric amplitudes across large parts of the United States have a median value of 0.26 V/km; for east-west geomagnetic variation the median value is 0.23 V/km. At some locations, once-per-century geoelectric amplitudes exceed 3 V/km.
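For a single site and period, the plane-wave parameterization behind such maps reduces to multiplying a surface impedance by the geomagnetic driving amplitude. The sketch below (Python) uses invented values of roughly the right order of magnitude, not numbers from the study:

```python
import math

MU0 = 4e-7 * math.pi  # vacuum permeability (H/m)

def geoelectric_amplitude(z_ohm, b_tesla):
    """Plane-wave estimate E = |Z| * |H| with H = B / mu0.
    Takes a scalar impedance in ohms and a magnetic amplitude in tesla,
    and returns the geoelectric amplitude in V/km."""
    h = b_tesla / MU0        # magnetic field intensity, A/m
    e_v_per_m = z_ohm * h    # electric field, V/m
    return e_v_per_m * 1000  # V/km

# illustrative: a 1e-3 ohm scalar surface impedance driven by a 500 nT
# sinusoidal disturbance at 240 s period
print(round(geoelectric_amplitude(1e-3, 500e-9), 2))  # → 0.4
```

The geographic intricacy reported above comes from the real impedance being a complex, frequency-dependent tensor that varies strongly from site to site.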
Precipitation extremes and their relation to climatic indices in the Pacific Northwest USA
NASA Astrophysics Data System (ADS)
Zarekarizi, Mahkameh; Rana, Arun; Moradkhani, Hamid
2018-06-01
There has been considerable focus in the literature on the influence of climate indices on precipitation extremes. The current study evaluates precipitation-based extremes in the Columbia River Basin (CRB) in the Pacific Northwest USA. We first analyzed the precipitation-based extremes using statistically (ten GCMs) and dynamically downscaled (three GCMs) past and future climate projections. Seven precipitation-based indices that help inform about flood duration and intensity are used. These indices help in attaining first-hand information on spatial and temporal scales for different service sectors including energy, agriculture, and forestry. Evaluation of these indices is first performed for the historical period (1971-2000), followed by analysis of their relation to large-scale tele-connections. Further, we mapped these indices over the area to evaluate the spatial variation of past and future extremes in downscaled and observational data. The analysis shows that high values of the extreme indices are clustered in either the western or northern parts of the basin for the historical period, whereas the northern part experiences a higher degree of change in the indices under the future scenario. The focus is also on evaluating the relation of these extreme indices to climate tele-connections in the historical period to understand their relationship with extremes over the CRB. Various climate indices are evaluated for their relationship using Principal Component Analysis (PCA) and Singular Value Decomposition (SVD). Results indicated that, out of the 13 climate tele-connections used in the study, the CRB is most strongly (and inversely) affected by the East Pacific (EP), Western Pacific (WP), East Atlantic (EA) and North Atlantic Oscillation (NAO) patterns.
Energy Storage Systems Evaluation | Transportation Research | NREL
Extreme climates can have a dramatic impact on batteries and energy storage systems. NREL's system evaluation project analyzes drive data to assess the value of vehicle batteries in second-use applications, which could potentially absorb a large quantity of repurposed EV batteries.
NASA Astrophysics Data System (ADS)
Tsai, Kuang-Jung; Chiang, Jie-Lun; Lee, Ming-Hsi; Chen, Yie-Ruey
2017-04-01
Analysis of the Critical Rainfall Value for Predicting Large-Scale Landslides Caused by Heavy Rainfall in Taiwan. Kuang-Jung Tsai 1, Jie-Lun Chiang 2, Ming-Hsi Lee 2, Yie-Ruey Chen 1. 1Department of Land Management and Development, Chang Jung Christian University, Tainan, Taiwan. 2Department of Soil and Water Conservation, National Pingtung University of Science and Technology, Pingtung, Taiwan. ABSTRACT: An accumulated rainfall amount of more than 2,900 mm was recorded over three consecutive days during Typhoon Morakot in August 2009. Very serious landslides and sediment-related disasters were induced by this heavy rainfall event. The satellite image analysis project conducted by the Soil and Water Conservation Bureau after the Morakot event identified more than 10,904 landslide sites with a total sliding area of 18,113 ha. At the same time, all severe sediment-related disaster areas were characterized by disaster type, scale, topography, major bedrock formations and geologic structures during the period of extremely heavy rainfall events in southern Taiwan. The characteristics and mechanisms of large-scale landslides are collected on the basis of field investigation integrated with GPS/GIS/RS techniques. In order to decrease the risk of large-scale landslides on slope land, a slope-land conservation strategy and a critical rainfall database should be established and implemented as soon as possible. Meanwhile, establishing a critical rainfall value for predicting large-scale landslides induced by heavy rainfall has become an important issue of serious concern to the government and the people of Taiwan. 
The mechanisms of large-scale landslides, rainfall frequency analysis, sediment budget estimation and river hydraulic analysis under the extreme climate change conditions of the past 10 years are addressed as required issues by this research. It is hoped that the results of this research can be used in a warning system for predicting large-scale landslides in southern Taiwan. Keywords: heavy rainfall, large scale, landslides, critical rainfall value
A single pH fluorescent probe for biosensing and imaging of extreme acidity and extreme alkalinity.
Chao, Jian-Bin; Wang, Hui-Juan; Zhang, Yong-Bin; Li, Zhi-Qing; Liu, Yu-Hong; Huo, Fang-Jun; Yin, Cai-Xia; Shi, Ya-Wei; Wang, Juan-Juan
2017-07-04
A simple tailor-made pH fluorescent probe, 2-benzothiazole (N-ethylcarbazole-3-yl) hydrazone (Probe), is facilely synthesized by the condensation reaction of 2-hydrazinobenzothiazole with N-ethylcarbazole-3-formaldehyde, and serves as a useful fluorescent probe for quantitatively monitoring extremely acidic and alkaline pH. pH titrations indicate that Probe displays a remarkable emission enhancement with a pKa of 2.73 and responds linearly to minor pH fluctuations within the extremely acidic range of 2.21-3.30. Interestingly, Probe also exhibits strong pH-dependent characteristics with a pKa of 11.28 and a linear response in the extremely alkaline range of 10.41-12.43. In addition, Probe shows a large Stokes shift of 84 nm under extremely acidic and alkaline conditions, high selectivity, excellent sensitivity, good water solubility and fine stability, all of which are favorable for intracellular pH imaging. The probe is further successfully applied to image extremely acidic and alkaline pH fluctuations in E. coli cells.
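The two linear ranges around the reported pKa values are what one expects from a sigmoidal titration curve, which is steepest (and approximately linear) near each pKa. A generic Henderson-Hasselbalch-type sketch (Python; the functional form and the normalized intensities are assumptions for illustration, not a fit to the paper's data):

```python
def titration_response(ph, pka, i_min=0.0, i_max=1.0):
    """Sigmoidal fluorescence-vs-pH response of Henderson-Hasselbalch form:
    fraction of the deprotonated (emissive) species at a given pH."""
    return i_min + (i_max - i_min) / (1 + 10 ** (pka - ph))

# at pH = pKa the species are half converted and the slope is maximal,
# which is why the linear sensing window brackets each pKa (2.73 and 11.28)
print(titration_response(2.73, 2.73))  # → 0.5
```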
Large uncertainties in observed daily precipitation extremes over land
NASA Astrophysics Data System (ADS)
Herold, Nicholas; Behrangi, Ali; Alexander, Lisa V.
2017-01-01
We explore uncertainties in observed daily precipitation extremes over the terrestrial tropics and subtropics (50°S-50°N) based on five commonly used products: the Climate Hazards Group InfraRed Precipitation with Stations (CHIRPS) dataset, the Global Precipitation Climatology Centre-Full Data Daily (GPCC-FDD) dataset, the Tropical Rainfall Measuring Mission (TRMM) multi-satellite research product (T3B42 v7), the Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks-Climate Data Record (PERSIANN-CDR), and the Global Precipitation Climatology Project's One-Degree Daily (GPCP-1DD) dataset. We use the precipitation indices R10mm and Rx1day, developed by the Expert Team on Climate Change Detection and Indices, to explore the behavior of "moderate" and "extreme" extremes, respectively. In order to assess the sensitivity of extreme precipitation to different grid sizes we perform our calculations on four common spatial resolutions (0.25° × 0.25°, 1° × 1°, 2.5° × 2.5°, and 3.75° × 2.5°). The impact of the chosen "order of operation" in calculating these indices is also determined. Our results show that moderate extremes are relatively insensitive to product and resolution choice, while extreme extremes can be very sensitive. For example, at 0.25° × 0.25° quasi-global mean Rx1day values vary from 37 mm in PERSIANN-CDR to 62 mm in T3B42. We find that the interproduct spread becomes prominent at resolutions of 1° × 1° and finer, thus establishing a minimum effective resolution at which observational products agree. Without improvements in interproduct spread, these exceedingly large observational uncertainties at high spatial resolution may limit the usefulness of model evaluations. As has been found previously, resolution sensitivity can be largely eliminated by applying an order of operation where indices are calculated prior to regridding. 
However, this approach is not appropriate when true area averages are desired (e.g., for model evaluations).
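The order-of-operation effect has a simple mathematical core: averaging daily fields across grid cells before taking a maximum can only reduce Rx1day relative to averaging the per-cell maxima, because the maximum of an average never exceeds the average of the maxima. A two-cell sketch (Python, synthetic daily precipitation):

```python
import random

random.seed(5)
days = 365
# two fine-resolution cells that will be aggregated into one coarse cell
cell_a = [random.expovariate(0.2) for _ in range(days)]
cell_b = [random.expovariate(0.2) for _ in range(days)]

# order 1: regrid first (area-average the daily fields), then take the index
regrid_first = max((a + b) / 2 for a, b in zip(cell_a, cell_b))

# order 2: compute Rx1day per fine cell, then average the index values
index_first = (max(cell_a) + max(cell_b)) / 2

# averaging daily fields before the max smooths out non-coincident extremes
print(index_first >= regrid_first)  # → True (holds for any data)
```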
Inter-model variability in hydrological extremes projections for Amazonian sub-basins
NASA Astrophysics Data System (ADS)
Andres Rodriguez, Daniel; Garofolo, Lucas; Lázaro de Siqueira Júnior, José; Samprogna Mohor, Guilherme; Tomasella, Javier
2014-05-01
Irreducible uncertainties due to the limitations of knowledge, the chaotic nature of the climate system and the human decision-making process drive uncertainties in climate change projections. Such uncertainties affect impact studies, especially those concerned with extreme events, and complicate the decision-making process aimed at mitigation and adaptation. At the same time, these uncertainties open the possibility of exploratory analyses of a system's vulnerability to different scenarios. Using projections from several climate models allows uncertainty issues to be addressed through multiple runs that explore a wide range of potential impacts and their implications for potential vulnerabilities. Statistical approaches for the analysis of extreme values are usually based on stationarity assumptions. However, nonstationarity is relevant at the time scales considered for extreme value analysis and can have great implications in dynamic complex systems, especially under climate change. It is therefore necessary to allow for nonstationarity in the statistical distribution parameters. We carried out a study of the dispersion in hydrological extremes projections, using climate change projections from several climate models to feed the Distributed Hydrological Model of the National Institute for Space Research, MHD-INPE, applied to Amazonian sub-basins. This is a large-scale hydrological model that uses a TopModel approach to solve runoff generation processes at the grid-cell scale. The MHD-INPE model was calibrated for 1970-1990 using observed meteorological data, comparing observed and simulated discharges with several performance coefficients. Hydrological model integrations were performed for the historical period (1970-1990) and for the future period (2010-2100). 
Because climate models simulate the variability of the climate system in statistical terms rather than reproducing the historical behavior of climate variables, the performance of the model runs during the historical period, when fed with climate model data, was tested using descriptors of the flow duration curves. The analysis of projected extreme values was carried out considering the nonstationarity of the GEV distribution parameters and compared with extreme events in the present climate. Results show inter-model variability in a broad dispersion of projected extreme values. Such dispersion implies different degrees of socio-economic impact associated with extreme hydrological events. Although no single optimal result exists, this variability allows the analysis of adaptation strategies and their potential vulnerabilities.
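Nonstationarity in the distribution parameters can be sketched with the simplest case, a time-varying location parameter in a Gumbel model (the GEV with shape ξ = 0). All parameter values below are illustrative assumptions (Python), not the study's fits:

```python
import math

def gumbel_cdf(x, mu, beta):
    """CDF of the Gumbel distribution (GEV with shape 0)."""
    return math.exp(-math.exp(-(x - mu) / beta))

# nonstationary location mu(t) = mu0 + mu1 * t, illustrative values
mu0, mu1, beta = 10.0, 0.02, 2.0

def exceed_prob(x, t):
    """Annual probability of exceeding level x in year t."""
    return 1 - gumbel_cdf(x, mu0 + mu1 * t, beta)

level = 20.0
p_start = exceed_prob(level, 0)
p_end = exceed_prob(level, 120)
# under a rising location parameter the same flood level is exceeded more
# often late in the projection period, so "return period" loses its
# stationary meaning and must be evaluated per year
print(p_end > p_start)  # → True
```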
NASA Astrophysics Data System (ADS)
Ebrahimi, R.; Zohren, S.
2018-03-01
In this paper we extend the orthogonal polynomials approach for extreme value calculations of Hermitian random matrices, developed by Nadal and Majumdar (J. Stat. Mech. P04001 arXiv:1102.0738), to normal random matrices and 2D Coulomb gases in general. Firstly, we show that this approach provides an alternative derivation of results in the literature. More precisely, we show convergence of the rescaled eigenvalue with largest modulus of a normal Gaussian ensemble to a Gumbel distribution, as well as universality for an arbitrary radially symmetric potential. Secondly, it is shown that this approach can be generalized to obtain convergence of the eigenvalue with smallest modulus and its universality for ring distributions. Most interestingly, the techniques presented here are used to compute all slowly varying finite-N corrections of the above distributions, which is important for practical applications given the slow convergence. Another interesting aspect of this work is that we can use standard techniques from Hermitian random matrices to obtain the extreme value statistics of non-Hermitian random matrices, resembling the large-N expansion used in the context of the double scaling limit of Hermitian matrix models in string theory.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Picard, Richard Roy; Bhat, Kabekode Ghanasham
2017-07-18
We examine sensitivity analysis and uncertainty quantification for molecular dynamics simulation. Extreme (large or small) output values for the LAMMPS code often occur at the boundaries of input regions, and uncertainties in those boundary values are overlooked by common SA methods. Similarly, input values for which code outputs are consistent with calibration data can also occur near boundaries. Upon applying approaches in the literature for imprecise probabilities (IPs), much more realistic results are obtained than for the complacent application of standard SA and code calibration.
Coupled extremely light Ca and Fe isotopes in peridotites
NASA Astrophysics Data System (ADS)
Zhao, Xinmiao; Zhang, Zhaofeng; Huang, Shichun; Liu, Yufei; Li, Xin; Zhang, Hongfu
2017-07-01
Large metal stable isotopic variations have been observed in both extraterrestrial and terrestrial samples. For example, Ca exhibits large mass-dependent isotopic variation in terrestrial igneous rocks and mantle minerals (on the order of ∼2‰ variation in 44Ca/40Ca). A thorough assessment and understanding of such isotopic variations in peridotites provides important constraints on the evolution and composition of the Earth's mantle. In order to better understand the Ca and Fe isotopic variations in terrestrial silicate rocks, we report Ca isotopic compositions in a set of peridotitic xenoliths from the North China Craton (NCC), which have been studied for Fe isotopes. These NCC peridotites have large Ca and Fe isotopic variations, with δ44/40Ca ranging from -0.08 to 0.92 (delta value relative to SRM915a) and δ57/54Fe (delta value relative to IRMM-014) ranging from -0.61 to 0.16, and these isotopic variations are correlated with large Mg# (100 × Mg/(Mg + Fe) molar ratio) variation, ranging from 80 to 90. Importantly, NCC Fe-rich peridotites have the lowest 44Ca/40Ca and 57Fe/54Fe ratios in all terrestrial silicate rocks. In contrast, although ureilites, mantle rocks from a now broken differentiated asteroid(s), have large Mg# variation, from 70 to 92, they have very limited δ57/54Fe variation (0.03-0.21, delta value relative to IRMM-014). Our model calculations show that the coupled extremely light Ca-Fe isotopic signatures in NCC Fe-rich peridotites most likely reflect kinetic isotopic fractionation during melt-peridotite reaction on a timescale of several to 10^4 years. In addition, our new data and compiled literature data show a possible compositional effect on the inter-mineral Ca isotopic fractionation between co-existing clinopyroxene and orthopyroxene pairs.
NASA Astrophysics Data System (ADS)
Xu, Ying; Gao, Xuejie; Giorgi, Filippo; Zhou, Botao; Shi, Ying; Wu, Jie; Zhang, Yongxiang
2018-04-01
Future changes in the 50-yr return level for temperature and precipitation extremes over mainland China are investigated based on a CMIP5 multi-model ensemble for the RCP2.6, RCP4.5 and RCP8.5 scenarios. The following indices are analyzed: TXx and TNn (the annual maximum and minimum of daily maximum and minimum surface temperature), RX5day (the annual maximum consecutive 5-day precipitation) and CDD (maximum annual number of consecutive dry days). After first validating the model performance, future changes in the 50-yr return values and return periods for these indices are investigated along with the inter-model spread. Multi-model median changes show an increase in the 50-yr return values of TXx and a decrease for TNn. More specifically, by the end of the 21st century under RCP8.5, the present-day 50-yr return period of warm events is reduced to 1.2 yr, while extreme cold events over the country are projected to essentially disappear. A general increase in RX5day 50-yr return values is found in the future. By the end of the 21st century under RCP8.5, events of the present RX5day 50-yr return period are projected to reduce to < 10 yr over most of China. Changes in CDD-50 show a dipole pattern over China, with a decrease in the values and longer return periods in the north, and vice versa in the south. Our study also highlights the need for further improvements in the representation of extreme events in climate models to assess the future risks and engineering design related to large-scale infrastructure in China.
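The translation between return values and return periods used in such studies can be sketched for the Gumbel (shape ξ = 0) case (Python; the TXx fit parameters and the uniform 3 °C shift are illustrative assumptions, not CMIP5 results):

```python
import math

def gumbel_return_level(mu, beta, T):
    """Level with return period T years under a Gumbel fit."""
    return mu - beta * math.log(-math.log(1 - 1 / T))

def gumbel_return_period(x, mu, beta):
    """Return period (years) of level x under a Gumbel fit."""
    p = 1 - math.exp(-math.exp(-(x - mu) / beta))  # annual exceedance probability
    return 1 / p

mu, beta, shift = 36.0, 1.5, 3.0  # illustrative present-day TXx fit and warming shift
x50 = gumbel_return_level(mu, beta, 50)
T_future = gumbel_return_period(x50, mu + shift, beta)
# under the shifted distribution, today's 50-yr warm event recurs far more often
print(round(T_future, 1))  # → 7.2
```

The same arithmetic run in the opposite direction (a rising threshold for dry spells, say) produces the lengthening return periods reported for CDD in the north.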
How Historical Information Can Improve Extreme Value Analysis of Coastal Water Levels
NASA Astrophysics Data System (ADS)
Le Cozannet, G.; Bulteau, T.; Idier, D.; Lambert, J.; Garcin, M.
2016-12-01
The knowledge of extreme coastal water levels is useful for coastal flooding studies and the design of coastal defences. When deriving such extremes with standard analyses of tide gauge measurements, one often has to deal with a limited effective duration of observation, which can result in large statistical uncertainties. This is even truer when one faces outliers, those particularly extreme values distant from all the others. In a recent work (Bulteau et al., 2015), we investigated how historical information on past events reported in archives can reduce statistical uncertainties and put such outlying observations into perspective. We adapted a Bayesian Markov chain Monte Carlo method, initially developed in the hydrology field (Reis and Stedinger, 2005), to the specific case of coastal water levels. We applied this method to the site of La Rochelle (France), where the storm Xynthia in 2010 generated a water level considered so far as an outlier. Based on 30 years of tide gauge measurements and 8 historical events since 1890, the results showed a significant decrease in the statistical uncertainties on return levels when historical information is used. Also, Xynthia's water level no longer appeared as an outlier, and we could have reasonably predicted the annual exceedance probability of that level beforehand (the predictive probability for 2010, based on data up to the end of 2009, is of the same order of magnitude as the standard estimate using data up to the end of 2010). Such results illustrate the usefulness of historical information in extreme value analyses of coastal water levels, as well as the relevance of the proposed method for integrating heterogeneous data into such analyses.
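A minimal sketch of the kind of Bayesian updating described above, assuming the historical record only tells us how many of h pre-gauge years exceeded a perception threshold (a common simplification in the Reis and Stedinger framework). All data, thresholds and priors here are synthetic; this is not the authors' implementation.

```python
import numpy as np
from scipy.stats import genextreme, norm

rng = np.random.default_rng(1)

# Synthetic systematic record: 30 years of annual maximum water level (m).
systematic = genextreme.rvs(c=-0.1, loc=4.0, scale=0.3, size=30, random_state=rng)

# Historical information: over h = 120 pre-gauge years, k = 8 events are
# known to have exceeded a perception threshold x0 (all values invented).
h, k, x0 = 120, 8, 4.8

def log_post(theta):
    mu, log_sig, xi = theta
    sig = np.exp(log_sig)
    # Systematic likelihood: GEV density (scipy shape c = -xi).
    ll = genextreme.logpdf(systematic, -xi, loc=mu, scale=sig).sum()
    # Historical likelihood: binomial count of threshold exceedances.
    p_exc = genextreme.sf(x0, -xi, loc=mu, scale=sig)
    if not (0.0 < p_exc < 1.0):
        return -np.inf
    ll += k * np.log(p_exc) + (h - k) * np.log(1.0 - p_exc)
    ll += norm.logpdf(xi, 0.0, 0.3)  # weak prior on the shape
    return ll if np.isfinite(ll) else -np.inf

# Random-walk Metropolis over (mu, log sigma, xi).
theta = np.array([systematic.mean(), np.log(systematic.std()), 0.0])
cur = log_post(theta)
chain = []
for _ in range(5000):
    prop = theta + rng.normal(0.0, [0.05, 0.1, 0.05])
    lp = log_post(prop)
    if np.log(rng.uniform()) < lp - cur:
        theta, cur = prop, lp
    chain.append(theta)
chain = np.array(chain[1000:])  # drop burn-in
print(chain.mean(axis=0))
```

The historical binomial term is what tightens the posterior on return levels relative to using the 30 systematic years alone.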
Extreme risk assessment based on normalized historic loss data
NASA Astrophysics Data System (ADS)
Eichner, Jan
2017-04-01
Natural hazard risk assessment and risk management focus on the expected loss magnitudes of rare and extreme events. Such large-scale loss events typically comprise all aspects of compound events and accumulate losses from multiple sectors (including knock-on effects). Utilizing Munich Re's NatCatSERVICE direct economic loss data, we briefly recap a novel methodology of peril-specific loss data normalization, which improves the stationarity properties of highly non-stationary historic loss data (non-stationary due to the socio-economic growth of assets exposed to destructive forces). We then perform extreme value analysis (peaks-over-threshold method) to derive return level estimates, e.g. for 100-yr loss event scenarios, for various types of perils, globally or per continent, and discuss the uncertainty in the results.
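A peaks-over-threshold analysis of the kind recapped above can be sketched as follows; the loss sample, threshold choice and exceedance rate are all synthetic and purely illustrative.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(2)

# Synthetic normalized event losses over a 40-year record (heavy-tailed).
years = 40
losses = rng.pareto(2.5, size=300) * 10.0

# Peaks-over-threshold: keep only the excesses above a high threshold u.
u = np.quantile(losses, 0.90)
excess = losses[losses > u] - u
rate = excess.size / years  # threshold exceedances per year

# Fit a generalized Pareto distribution to the excesses (location fixed at 0).
xi, _, sigma = genpareto.fit(excess, floc=0.0)

# T-year return level: the loss exceeded on average once every T years.
T = 100
return_level = u + genpareto.ppf(1.0 - 1.0 / (rate * T), xi, loc=0.0, scale=sigma)
print(u, rate, return_level)
```

In practice the threshold u is a diagnostic choice (mean-excess and parameter-stability plots), not a fixed quantile as here.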
NASA Astrophysics Data System (ADS)
Shrestha, K.; Chou, M.; Graf, D.; Yang, H. D.; Lorenz, B.; Chu, C. W.
2017-05-01
Weak antilocalization (WAL) effects in Bi2Te3 single crystals have been investigated at high and low bulk charge-carrier concentrations. At low charge-carrier density the WAL curves scale with the normal component of the magnetic field, demonstrating the dominance of topological surface states in magnetoconductivity. At high charge-carrier density the WAL curves scale with neither the applied field nor its normal component, implying a mixture of bulk and surface conduction. WAL due to topological surface states shows no dependence on the nature (electrons or holes) of the bulk charge carriers. The observations of an extremely large nonsaturating magnetoresistance and ultrahigh mobility in the samples with lower carrier density further support the presence of surface states. The physical parameters characterizing the WAL effects are calculated using the Hikami-Larkin-Nagaoka formula. At high charge-carrier concentrations, there is a greater number of conduction channels and a decrease in the phase coherence length compared to low charge-carrier concentrations. The extremely large magnetoresistance and high mobility of topological insulators have great technological value and can be exploited in magnetoelectric sensors and memory devices.
Constraint elimination in dynamical systems
NASA Technical Reports Server (NTRS)
Singh, R. P.; Likins, P. W.
1989-01-01
Large space structures (LSSs) and other dynamical systems of current interest are often extremely complex assemblies of rigid and flexible bodies subjected to kinematical constraints. A formulation is presented for the governing equations of constrained multibody systems via the application of singular value decomposition (SVD). The resulting equations of motion are shown to be of minimum dimension.
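The SVD-based constraint elimination can be illustrated on a toy linear system: the nullspace of the constraint matrix supplies a minimal set of independent speeds, and projecting the dynamics onto it yields equations of minimum dimension. The matrices below are random placeholders, not an actual multibody model.

```python
import numpy as np

rng = np.random.default_rng(4)

n, m = 6, 2                    # 6 generalized speeds, 2 kinematical constraints
A = rng.normal(size=(m, n))    # constraint matrix: A @ qdot = 0
M = 2.0 * np.eye(n)            # mass matrix (placeholder)
f = rng.normal(size=n)         # generalized forces (placeholder)

# SVD of A: the rows of Vt beyond rank(A) span the nullspace of A.
U, s, Vt = np.linalg.svd(A)
r = int(np.sum(s > 1e-12 * s[0]))  # numerical rank
N = Vt[r:].T                       # n x (n - r) orthonormal nullspace basis

# Admissible motions have qdot = N @ u, so projecting M qddot = f onto the
# basis N gives an unconstrained system of minimum dimension in u.
M_red = N.T @ M @ N
f_red = N.T @ f
u_ddot = np.linalg.solve(M_red, f_red)
q_ddot = N @ u_ddot

# For this time-invariant constraint, the accelerations satisfy A qddot = 0.
print(np.abs(A @ q_ddot).max())
```

For time-varying constraints the projection acquires extra terms from the derivative of the nullspace basis, which this static sketch omits.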
Spatial variability of extreme rainfall at radar subpixel scale
NASA Astrophysics Data System (ADS)
Peleg, Nadav; Marra, Francesco; Fatichi, Simone; Paschalis, Athanasios; Molnar, Peter; Burlando, Paolo
2018-01-01
Extreme rainfall is quantified in engineering practice using Intensity-Duration-Frequency (IDF) curves that are traditionally derived from rain-gauges and, more recently, also from remote sensing instruments such as weather radars. These instruments measure rainfall at different spatial scales: a rain-gauge samples rainfall at the point scale, while a weather radar averages precipitation over a relatively large area, generally around 1 km². As such, a radar-derived IDF curve is representative of the mean areal rainfall over a given radar pixel and neglects the within-pixel rainfall variability. In this study, we quantify subpixel variability of extreme rainfall by using a novel space-time rainfall generator (STREAP model) that downscales in space the rainfall within a given radar pixel. The study was conducted using a unique radar data record (23 years) and a very dense rain-gauge network in the Eastern Mediterranean area (northern Israel). Radar-IDF curves, together with an ensemble of point-based IDF curves representing the radar subpixel extreme rainfall variability, were developed by fitting Generalized Extreme Value (GEV) distributions to annual rainfall maxima. It was found that the mean areal extreme rainfall derived from the radar underestimates most of the extreme values computed for point locations within the radar pixel (on average, ∼70%). The subpixel variability of rainfall extremes was found to increase with longer return periods and shorter durations (e.g. from a maximum variability of 10% for a return period of 2 years and a duration of 4 h to 30% for a 50-year return period and 20 min duration). For the longer return periods, a considerable enhancement of extreme rainfall variability was found when stochastic (natural) climate variability was taken into account. Bounding the range of the subpixel extreme rainfall derived from radar-IDF can be of major importance for different applications that require very local estimates of rainfall extremes.
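The underestimation mechanism has a simple deterministic core: the maximum of a spatial average can never exceed the average of the point maxima. A toy simulation (correlated lognormal "point rainfall" inside one pixel; all parameters invented) makes this concrete.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy setup: 25 "rain-gauges" inside one radar pixel, 23 "years" of
# 2000 time steps each; correlated lognormal point rainfall.
n_points, n_years, n_steps = 25, 23, 2000
shared = rng.normal(size=(n_years, n_steps, 1))        # pixel-wide signal
local = rng.normal(size=(n_years, n_steps, n_points))  # subpixel variability
rain = np.exp(0.8 * shared + 0.6 * local)

areal = rain.mean(axis=2)          # the "radar" sees the pixel-mean rainfall

areal_annmax = areal.max(axis=1)   # annual maxima of the areal series
point_annmax = rain.max(axis=1)    # annual maxima at each point, (years, points)

# Since max is convex, max-of-average <= average-of-max in every year:
# radar-derived annual maxima systematically sit below the point maxima.
print(areal_annmax.mean(), point_annmax.mean())
```

The gap between the two sets of annual maxima is exactly what a subpixel downscaling generator like STREAP is meant to quantify.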
Capturing spatial and temporal patterns of widespread, extreme flooding across Europe
NASA Astrophysics Data System (ADS)
Busby, Kathryn; Raven, Emma; Liu, Ye
2013-04-01
Statistical characterisation of physical hazards is an integral part of the probabilistic catastrophe models used by the reinsurance industry to estimate losses from large-scale events. Extreme flood events are not restricted by country boundaries, which poses an issue for reinsurance companies as their exposures often extend beyond them. We discuss challenges and solutions that allow us to appropriately capture the spatial and temporal dependence of extreme hydrological events on a continental scale, which in turn enables us to generate an industry-standard stochastic event set for estimating financial losses from widespread flooding. In presenting our event set methodology, we focus on explaining how extreme value theory (EVT) and dependence modelling are used to account for short, inconsistent hydrological data from different countries, and how to make appropriate statistical decisions that best characterise the nature of flooding across Europe. The consistency of input data is of vital importance when identifying historical flood patterns. Collating data from numerous sources inherently causes inconsistencies, and we demonstrate our robust approach to assessing the data and refining it to compile a single consistent dataset. This dataset is then extrapolated using a parameterised EVT distribution to estimate extremes. Our method then captures the dependence of flood events across countries using an advanced multivariate extreme value model. Throughout, important statistical decisions are explored, including: (1) distribution choice; (2) the threshold to apply for extracting extreme data points; (3) a regional analysis; (4) the definition of a flood event, which is often linked with the reinsurance industry's hours clause; and (5) the handling of missing values. Finally, having modelled the historical patterns of flooding across Europe, we sample from this model to generate our stochastic event set comprising thousands of events over thousands of years.
We then briefly illustrate how this is applied within a probabilistic model to estimate catastrophic loss curves used by the reinsurance industry.
Extreme events in optics: Challenges of the MANUREVA project
NASA Astrophysics Data System (ADS)
Dudley, J. M.; Finot, C.; Millot, G.; Garnier, J.; Genty, G.; Agafontsev, D.; Dias, F.
2010-07-01
In this contribution we describe and discuss a series of challenges and questions relating to understanding extreme wave phenomena in optics. Many aspects of these questions are being studied in the framework of the MANUREVA project: a multidisciplinary consortium aiming to carry out mathematical, numerical and experimental studies in this field. The central motivation of this work is the 2007 results from optical physics [D. Solli et al., Nature 450, 1054 (2007)] that showed how a fibre-optical system can generate large amplitude extreme wave events with similar statistical properties to the infamous hydrodynamic rogue waves on the surface of the ocean. We review our recent work in this area, and discuss how this observation may open the possibility for an optical system to be used to directly study both the dynamics and statistics of extreme-value processes, a potential advance comparable to the introduction of optical systems to study chaos in the 1970s.
The Generation of a Stochastic Flood Event Catalogue for Continental USA
NASA Astrophysics Data System (ADS)
Quinn, N.; Wing, O.; Smith, A.; Sampson, C. C.; Neal, J. C.; Bates, P. D.
2017-12-01
Recent advances in the acquisition of spatiotemporal environmental data and improvements in computational capabilities have enabled the generation of large-scale, even global, flood hazard layers which serve as a critical decision-making tool for a range of end users. However, these datasets are designed to indicate only the probability and depth of inundation at a given location and are unable to describe the likelihood of concurrent flooding across multiple sites. Recent research has highlighted that although the estimation of large, widespread flood events is of great value to the flood mitigation and insurance industries, to date it has been difficult to deal with this spatial dependence structure in flood risk over relatively large scales. Many existing approaches have been restricted to empirical estimates of risk based on historic events, limiting their capability of assessing risk over the full range of plausible scenarios. Therefore, this research utilises a recently developed model-based approach to describe the multisite joint distribution of extreme river flows across continental USA river gauges. Given an extreme event at a site, the model characterises the likelihood that neighbouring sites are also impacted. This information is used to simulate an ensemble of plausible synthetic extreme event footprints, from which flood depths are extracted from an existing global flood hazard catalogue. Expected economic losses are then estimated by overlaying flood depths with national datasets defining asset locations, characteristics and depth-damage functions.
The ability of this approach to quantify probabilistic economic risk and rare threshold-exceeding events is expected to be of value to those interested in the flood mitigation and insurance sectors. This work describes the methodological steps taken to create the flood loss catalogue over a national scale, highlights the uncertainty in the expected annual economic vulnerability within the USA from extreme river flows, and presents future developments to the modelling approach.
Min and Max Exponential Extreme Interval Values and Statistics
ERIC Educational Resources Information Center
Jance, Marsha; Thomopoulos, Nick
2009-01-01
The extreme interval values and statistics (expected value, median, mode, standard deviation, and coefficient of variation) for the smallest (min) and largest (max) values of exponentially distributed variables with parameter λ = 1 are examined for different observation (sample) sizes. An extreme interval value g[subscript a] is defined as a…
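For Exp(1) variables the extreme-value statistics referred to above have closed forms: the minimum of n observations is itself exponential with rate n, so E[min] = 1/n, and the expected maximum is the n-th harmonic number. A quick Monte Carlo check (sample size and replication count are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(6)

n = 10          # observation (sample) size
reps = 200_000  # Monte Carlo replications

x = rng.exponential(1.0, size=(reps, n))  # Exp(lambda = 1) samples

# Exact results for Exp(1): the minimum of n values is Exp(n), so
# E[min] = 1/n; the expected maximum is the n-th harmonic number H_n.
e_min_exact = 1.0 / n
e_max_exact = sum(1.0 / k for k in range(1, n + 1))

e_min_mc = x.min(axis=1).mean()
e_max_mc = x.max(axis=1).mean()
print(e_min_mc, e_min_exact, e_max_mc, e_max_exact)
```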
NASA Astrophysics Data System (ADS)
Zheng, Qin; Yang, Zubin; Sha, Jianxin; Yan, Jun
2017-02-01
In predictability research, the conditional nonlinear optimal perturbation (CNOP) describes the initial perturbation that satisfies a certain constraint condition and causes the largest prediction error at the prediction time. The CNOP has been successfully applied in estimation of the lower bound of maximum predictable time (LBMPT). Generally, CNOPs are calculated by a gradient descent algorithm based on the adjoint model, which is called ADJ-CNOP. This study uses the two-dimensional Ikeda model to investigate the impacts of nonlinearity on ADJ-CNOP and the corresponding precision problems when using ADJ-CNOP to estimate the LBMPT. Our conclusions are that (1) when the initial perturbation is large or the prediction time is long, the strong nonlinearity of the dynamical model in the prediction variable will lead to failure of the ADJ-CNOP method, and (2) when the objective function has multiple extreme values, ADJ-CNOP has a large probability of producing local CNOPs, hence yielding a false estimate of the LBMPT. Furthermore, the particle swarm optimization (PSO) algorithm, one kind of intelligent algorithm, is introduced to solve this problem. The method using PSO to compute CNOP is called PSO-CNOP. The results of numerical experiments show that even with a large initial perturbation and long prediction time, or when the objective function has multiple extreme values, PSO-CNOP can always obtain the global CNOP. Since PSO is a population-based heuristic search algorithm, it can overcome both the impact of nonlinearity and the disturbance caused by multiple extremes of the objective function. In addition, to check the estimation accuracy of the LBMPT presented by PSO-CNOP and ADJ-CNOP, we partition the constraint domain of initial perturbations into sufficiently fine grid meshes and take the LBMPT obtained by the filtering method as a benchmark.
The result shows that the estimate presented by PSO-CNOP is closer to the true value than that of ADJ-CNOP as the forecast time increases.
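A bare-bones PSO of the kind invoked above, run on a deliberately multi-modal test function (Rastrigin, a stand-in for an objective with multiple extreme values, not the Ikeda-model CNOP objective); all swarm parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)

def objective(x):
    """Rastrigin function: many local minima, global minimum 0 at the origin.
    A stand-in for an objective with multiple extreme values."""
    return 10 * x.shape[-1] + np.sum(x**2 - 10 * np.cos(2 * np.pi * x), axis=-1)

n_particles, n_iter, dim = 40, 300, 2
w, c1, c2 = 0.7, 1.5, 1.5      # inertia, cognitive and social weights

pos = rng.uniform(-5.12, 5.12, size=(n_particles, dim))
vel = np.zeros_like(pos)
pbest = pos.copy()             # each particle's best position so far
pbest_val = objective(pos)
gbest = pbest[np.argmin(pbest_val)].copy()  # swarm-wide best position

for _ in range(n_iter):
    r1, r2 = rng.uniform(size=(2, n_particles, dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    val = objective(pos)
    improved = val < pbest_val
    pbest[improved] = pos[improved]
    pbest_val[improved] = val[improved]
    gbest = pbest[np.argmin(pbest_val)].copy()

print(gbest, objective(gbest))
```

Because many particles explore simultaneously, a basin that would trap a single gradient-descent trajectory rarely traps the whole swarm, which is the property the abstract exploits.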
The value of flexibility in conservation financing.
Lennox, Gareth D; Fargione, Joseph; Spector, Sacha; Williams, Gwyn; Armsworth, Paul R
2017-06-01
Land-acquisition strategies employed by conservation organizations vary in their flexibility. Conservation-planning theory largely fails to reflect this by presenting models that are either extremely inflexible (parcel acquisitions are irreversible and budgets are fixed) or extremely flexible (previously acquired parcels can readily be sold). This latter approach, the selling of protected areas, is infeasible or problematic in many situations. We considered the value to conservation organizations of increasing the flexibility of their land-acquisition strategies through their approach to financing deals. Specifically, we modeled 2 acquisition-financing methods commonly used by conservation organizations: borrowing and budget carry-over. Using simulated data, we compared results from these models with those from an inflexible fixed-budget model and an extremely flexible selling model in which previous acquisitions could be sold to fund new acquisitions. We then examined 3 case studies of how conservation organizations use borrowing and budget carry-over in practice. Model comparisons showed that borrowing and budget carry-over always returned considerably higher rewards than the fixed-budget model. How they performed relative to the selling model depended on the relative conservation value of past acquisitions. Both the models and case studies showed that incorporating flexibility through borrowing or budget carry-over gives conservation organizations the ability to purchase parcels of higher conservation value than when budgets are fixed, without the problems associated with the selling of protected areas. © 2016 Society for Conservation Biology.
Synoptic and meteorological drivers of extreme ozone concentrations over Europe
NASA Astrophysics Data System (ADS)
Otero, Noelia Felipe; Sillmann, Jana; Schnell, Jordan L.; Rust, Henning W.; Butler, Tim
2016-04-01
The present work assesses the relationship between local and synoptic meteorological conditions and surface ozone concentration over Europe in spring and summer months during the period 1998-2012, using a new interpolated data set of observed surface ozone concentrations over the European domain. Along with local meteorological conditions, the influence of large-scale atmospheric circulation on surface ozone is addressed through a set of airflow indices computed with a novel implementation of a grid-by-grid weather type classification across Europe. Drivers of surface ozone over the full distribution of maximum daily 8-hour average values are investigated, along with drivers of the extreme high percentiles and exceedances of air quality guideline thresholds. Three different regression techniques are applied: multiple linear regression to assess the drivers of maximum daily ozone, logistic regression to assess the probability of threshold exceedances, and quantile regression to estimate the meteorological influence on extreme values, as represented by the 95th percentile. The relative importance of the input parameters (predictors) is assessed by a backward stepwise regression procedure that allows the identification of the most important predictors in each model. Spatial patterns of model performance exhibit distinct variations between regions. The inclusion of ozone persistence is particularly relevant over Southern Europe. In general, the best model performance is found over Central Europe, where the maximum temperature plays an important role as a driver of maximum daily ozone as well as of its extreme values, especially during warmer months.
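Quantile regression of the kind applied above can be sketched by minimizing the pinball (check) loss directly; the "ozone" and "temperature" data below are synthetic single-predictor stand-ins, not the interpolated European data set.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(8)

# Synthetic stand-in data: "maximum daily ozone" rising with temperature,
# with heteroscedastic noise so the 95th percentile has its own slope.
n = 2000
tmax = rng.uniform(10.0, 35.0, n)
ozone = 20.0 + 2.0 * tmax + (1.0 + 0.3 * tmax) * rng.normal(size=n)

def pinball(beta, tau=0.95):
    """Check (pinball) loss: minimizing it fits the tau-th conditional
    quantile instead of the conditional mean."""
    resid = ozone - (beta[0] + beta[1] * tmax)
    return np.mean(np.maximum(tau * resid, (tau - 1.0) * resid))

res = minimize(pinball, x0=np.array([0.0, 1.0]), method="Nelder-Mead",
               options={"maxiter": 5000, "xatol": 1e-6, "fatol": 1e-9})
b0, b1 = res.x

# Roughly 95% of the observations should fall below the fitted line.
frac_below = np.mean(ozone <= b0 + b1 * tmax)
print(b0, b1, frac_below)
```

The asymmetric loss is why the fitted slope exceeds the mean-regression slope whenever the noise variance grows with the predictor, which is exactly the situation for temperature-driven ozone extremes.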
Geoelectric hazard maps for the continental United States
NASA Astrophysics Data System (ADS)
Love, Jeffrey J.; Pulkkinen, Antti; Bedrosian, Paul A.; Jonas, Seth; Kelbert, Anna; Rigler, E. Joshua; Finn, Carol A.; Balch, Christopher C.; Rutledge, Robert; Waggel, Richard M.; Sabata, Andrew T.; Kozyra, Janet U.; Black, Carrie E.
2016-09-01
In support of a multiagency project for assessing induction hazards, we present maps of extreme-value geoelectric amplitudes over about half of the continental United States. These maps are constructed using a parameterization of induction: estimates of Earth surface impedance, obtained at discrete geographic sites from magnetotelluric survey data, are convolved with latitude-dependent statistical maps of extreme-value geomagnetic activity, obtained from decades of magnetic observatory data. Geoelectric amplitudes are estimated for geomagnetic waveforms having 240 s sinusoidal period and amplitudes over 10 min that exceed a once-per-century threshold. As a result of the combination of geographic differences in geomagnetic activity and Earth surface impedance, once-per-century geoelectric amplitudes span more than 2 orders of magnitude and are an intricate function of location. For north-south induction, once-per-century geoelectric amplitudes across large parts of the United States have a median value of 0.26 V/km; for east-west geomagnetic variation the median value is 0.23 V/km. At some locations, once-per-century geoelectric amplitudes exceed 3 V/km.
Use of NARCCAP results for extremes: British Columbia case studies
NASA Astrophysics Data System (ADS)
Murdock, T. Q.; Eckstrand, H.; Buerger, G.; Hiebert, J.
2011-12-01
Demand for projections of extremes has arisen out of local infrastructure vulnerability assessments and adaptation planning. Four preliminary analyses of extremes have been undertaken in British Columbia in the past two years in collaboration with users: the BC Ministry of Transportation and Infrastructure, Engineers Canada, the City of Castlegar, and the Columbia Basin Trust. Projects have included analysis of extremes for stormwater management, highways, and community adaptation in different areas of the province. This need for projections of extremes has been met using an ensemble of Regional Climate Model (RCM) results from NARCCAP, in some cases supplemented by and compared to statistical downscaling. Before computing indices of extremes, each RCM simulation in the NARCCAP ensemble driven by reanalysis (NCEP) was compared to historical observations to assess RCM skill. Next, the anomalies according to each RCM future projection were compared to those of their driving GCM to determine the "value added" by the RCMs. Selected results will be shown for several indices of extremes, including the Climdex set of indices that has been widely used elsewhere (e.g., Stardex) and specific parameters of interest defined by users. Finally, the need for threshold scaling of some indices and the use of as large an ensemble as possible will be illustrated.
Spatial distribution of precipitation extremes in Norway
NASA Astrophysics Data System (ADS)
Verpe Dyrrdal, Anita; Skaugen, Thomas; Lenkoski, Alex; Thorarinsdottir, Thordis; Stordal, Frode; Førland, Eirik J.
2015-04-01
Estimates of extreme precipitation, in terms of return levels, are crucial in the planning and design of important infrastructure. Through two separate studies, we have examined the levels and spatial distribution of daily extreme precipitation over catchments in Norway, and of hourly extreme precipitation at a point. The analyses were carried out through the development of two new methods for estimating extreme precipitation in Norway. For daily precipitation we fit the Generalized Extreme Value (GEV) distribution to areal time series from a gridded dataset of daily precipitation from 1957 to the present, with a resolution of 1x1 km². This grid-based method is more objective and less manual and time-consuming than the existing method at MET Norway. In addition, estimates in ungauged catchments are easier to obtain, and the GEV approach includes a measure of uncertainty, which is a requirement in climate studies today. Further, we go into depth on the debated GEV shape parameter, which plays an important role for longer return periods. We show that it varies according to the dominating precipitation types, having positive values in the southeast and negative values in the southwest. We also find indications that the degree of orographic enhancement might affect the shape parameter. For hourly precipitation, we estimate return levels on a 1x1 km² grid by linking GEV distributions with latent Gaussian fields in a Bayesian hierarchical model (BHM). Generalized linear models on the GEV parameters, estimated from observations, are able to incorporate location-specific geographic and meteorological information and thereby accommodate these effects on extreme precipitation. Gaussian fields capture additional unexplained spatial heterogeneity and overcome the sparse grid on which observations are collected, while a Bayesian model averaging component directly assesses model uncertainty.
We find that mean summer precipitation, mean summer temperature, latitude, longitude, mean annual precipitation and elevation are good covariate candidates for hourly precipitation in our model. Summer indices succeed because hourly precipitation extremes often occur during the convective season. The spatial distributions of hourly and daily precipitation extremes differ in Norway. Daily precipitation extremes are larger along the southwestern coast, where large-scale frontal systems dominate during the fall season and the mountain ridge generates strong orographic enhancement. The largest hourly precipitation extremes are mostly produced by intense convective showers during summer, and are thus found along the entire southern coast, including the Oslo region.
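The role of the GEV shape parameter discussed above follows directly from the return-level formula. A small sketch with invented location and scale values contrasts a positive shape (heavy upper tail) with a negative one (bounded tail):

```python
import numpy as np

def gev_return_level(T, mu, sigma, xi):
    """T-year return level of a GEV(mu, sigma, xi) annual-maximum
    distribution (xi != 0): its (1 - 1/T) quantile."""
    y = -np.log(1.0 - 1.0 / T)
    return mu + sigma / xi * (y**(-xi) - 1.0)

T = np.array([10.0, 50.0, 100.0, 500.0])
mu, sigma = 50.0, 10.0  # invented location/scale, e.g. daily precip in mm

heavy = gev_return_level(T, mu, sigma, +0.2)    # positive shape: heavy tail
bounded = gev_return_level(T, mu, sigma, -0.2)  # negative shape: bounded tail

# With xi = -0.2 the distribution has a finite upper bound at
# mu - sigma/xi = 100 mm, so long return levels flatten out; with
# xi = +0.2 they keep growing rapidly with the return period.
print(np.round(heavy, 1), np.round(bounded, 1))
```

This is why the sign of the shape parameter in the southeast versus the southwest matters so much for the longer return periods.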
Characteristics and present trends of wave extremes in the Mediterranean Sea
NASA Astrophysics Data System (ADS)
Pino, Cosimo; Lionello, Piero; Galati, Maria Barbara
2010-05-01
Wind generated surface waves are an important factor characterizing marine storminess and the marine environment. This contribution considers the characteristics and trends of SWH (Significant Wave Height) extremes; both high and low extremes, such as dead-calm duration, are analyzed. The data analysis is based on a 44-year long simulation (1958-2001) of the wave field in the Mediterranean Sea. The quality of the model simulation is checked against satellite data. The results show the different characteristics of the different parts of the basin, with the variability being higher in the western areas (where the highest values are produced) than in the eastern areas of the basin (where the complete absence of waves is a rare condition). In fact, the durations of both storms and dead-calm episodes are longer in the eastern than in the western part of the Mediterranean. The African coast and the southern Ionian Sea are the areas where exceptional values of SWH are expected to occur in correspondence with exceptional meteorological events. Significant trends of storm characteristics are present only in sparse areas and suggest a decrease of both storm intensity and duration (a marginal increase of storm intensity is present in the center of the Mediterranean). The statistics of extremes and high SWH values are substantially steady during the second half of the 20th century. The influence of the large-scale teleconnection patterns (TlcP) that are known to be relevant for the Mediterranean climate on the intensity and spatial distribution of extreme SWH has also been investigated. The analysis focused on the monthly scale, examining the variability of these links along the annual cycle. The considered TlcP are the North Atlantic Oscillation, the East-Atlantic/West-Russian pattern and the Scandinavian pattern; we assess their effect on the intensity and the frequency of high/low SWH conditions.
The results show that it is difficult to establish a dominant TlcP for SWH extremes, because each of the considered patterns is important for at least a few months of the year and none of them is important for the whole year. High extremes in winter and fall are more influenced by the TlcPs than extremes in other seasons, and more than low extremes are.
Extreme-event geoelectric hazard maps
NASA Astrophysics Data System (ADS)
Love, J. J.; Bedrosian, P.
2017-12-01
Maps covering about half of the continental United States are presented of the geoelectric field amplitude that will be exceeded, on average, once per century in response to an extreme-intensity geomagnetic disturbance. These maps are constructed using an empirical parameterization of induction: convolving latitude-dependent statistical maps of extreme-value geomagnetic disturbance, obtained from decades of 1-minute magnetic observatory data, with local estimates of Earth-surface impedance, obtained at discrete geographic sites from magnetotelluric surveys. Geoelectric amplitudes are estimated for geomagnetic waveforms having 240-s (and 1200-s) sinusoidal period and amplitudes over 10 minutes (1 hr) that exceed a once-per-century threshold. As a result of the combination of geographic differences in geomagnetic variation and Earth-surface impedance, once-per-century geoelectric amplitudes span more than two orders of magnitude and are a highly granular function of location. Specifically: for north-south 240-s induction, once-per-century geoelectric amplitudes across large parts of the United States have a median value of 0.34 V/km; for east-west variation, they have a median value of 0.23 V/km. In Northern Minnesota, amplitudes exceed 14.00 V/km for north-south geomagnetic variation (23.34 V/km for east-west variation), while just over 100 km away, amplitudes are only 0.08 V/km (0.02 V/km). At some sites in the Northern Central United States, once-per-century geoelectric amplitudes exceed the 2 V/km realized in Quebec during the March 1989 storm. These hazard maps are incomplete over large parts of the United States, including major population centers in the southern United States, due to a lack of publicly available impedance data.
Xiang, Yang; Lu, Kewei; James, Stephen L.; Borlawsky, Tara B.; Huang, Kun; Payne, Philip R.O.
2011-01-01
The Unified Medical Language System (UMLS) is the largest thesaurus in the biomedical informatics domain. Previous works have shown that knowledge constructs composed of transitively-associated UMLS concepts are effective for discovering potentially novel biomedical hypotheses. However, the extremely large size of the UMLS becomes a major challenge for these applications. To address this problem, we designed a k-neighborhood Decentralization Labeling Scheme (kDLS) for the UMLS, and a corresponding method to effectively evaluate the kDLS indexing results. kDLS provides a comprehensive solution for indexing the UMLS for very efficient large-scale knowledge discovery. We demonstrated that it is highly effective to use kDLS paths to prioritize disease-gene relations across the whole genome, with extremely high fold-enrichment values. To our knowledge, this is the first indexing scheme capable of supporting efficient large-scale knowledge discovery on the UMLS as a whole. Our expectation is that kDLS will become a vital engine for retrieving information and generating hypotheses from the UMLS for future medical informatics applications.
Assessing the features of extreme smog in China and the differentiated treatment strategy
NASA Astrophysics Data System (ADS)
Deng, Lu; Zhang, Zhengjun
2018-01-01
Extreme smog can have potentially harmful effects on human health, the economy and daily life. Average (mean) values, however, provide little strategically useful information for the hazard analysis and control of extreme smog. This article investigates China's smog extremes by applying extreme value analysis to hourly PM2.5 data from 2014 to 2016 obtained from monitoring stations across China. By fitting a generalized extreme value (GEV) distribution to exceedances over a station-specific extreme smog level at each monitoring location, all study stations are grouped into eight categories based on the estimated mean and shape parameter values of the fitted GEV distributions. The extreme features characterized by the mean of the fitted extreme value distribution, the maximum frequency and the tail index of extreme smog at each location are analysed. These features can provide useful information for central and local governments to apply differentiated treatments to cities in different categories and to set similar prevention goals and control strategies for cities in the same category across a range of areas. Furthermore, hazardous hours, breaking probability and the 1-year return level of each station are presented by category, based on which future control and reduction targets for extreme smog are proposed for the cities of Beijing, Tianjin and Hebei as an example.
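The statistical machinery described above can be sketched minimally: fit a GEV distribution to block maxima of a synthetic pollutant series and derive a 1-year return level. Synthetic gamma-distributed data stands in for the hourly PM2.5 records, and note that scipy's shape parameter `c` is the negative of the conventional GEV shape:

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)
# synthetic hourly pollutant series standing in for station PM2.5 data
hourly = rng.gamma(shape=2.0, scale=40.0, size=3 * 365 * 24)
daily_max = hourly.reshape(-1, 24).max(axis=1)  # block maxima (daily blocks)

c, loc, scale = genextreme.fit(daily_max)  # maximum-likelihood GEV fit
xi = -c  # conventional shape (tail index); scipy uses the opposite sign

# 1-year return level: the level exceeded on average once per 365 daily blocks
rl_1yr = genextreme.ppf(1.0 - 1.0 / 365.0, c, loc=loc, scale=scale)
```

The fitted shape parameter is what separates heavy-tailed stations from bounded ones, which is the basis of the paper's categorization.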
Investigation of the relationship between hurricane waves and extreme runup
NASA Astrophysics Data System (ADS)
Thompson, D. M.; Stockdon, H. F.
2006-12-01
In addition to storm surge, the elevation of wave-induced runup plays a significant role in forcing geomorphic change during extreme storms. Empirical formulations for extreme runup, defined as the 2% exceedance level, depend on some measure of significant offshore wave height. Accurate prediction of extreme runup, particularly during hurricanes when wave heights are large, depends on selecting the most appropriate measure of wave height that provides energy to the nearshore system. Using measurements from deep-water wave buoys results in an overprediction of runup elevation: under storm forcing these large waves dissipate across the shelf through friction, whitecapping and depth-limited breaking before reaching the beach and forcing swash processes. The use of a local, shallow-water wave height has been shown to provide a more accurate estimate of extreme runup elevation (Stockdon et al., 2006); however, this local wave height has yet to be precisely defined. Using observations of nearshore waves from the U.S. Army Corps of Engineers' Field Research Facility (FRF) in Duck, NC during Hurricane Isabel, the most relevant measure of wave height for use in empirical runup parameterizations was examined. Spatial and temporal variability of the wave field of the hurricane, which made landfall on September 18, 2003, were modeled using SWAN. Comparisons with wave data from FRF gages and deep-water buoys operated by NOAA's National Data Buoy Center were used for model calibration. Various measures of local wave height (breaking, dissipation-based, etc.) were extracted from the model domain and used as input to the runup parameterizations. Video-based observations of runup collected at the FRF during the storm were used to ground-truth modeled values. Assessment of the most appropriate measure of wave height can be extended over a large area through comparisons to observations of storm-induced geomorphic change.
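The runup parameterization referred to (Stockdon et al., 2006) combines foreshore slope with deep-water wave height and wavelength. A sketch follows; the coefficients are quoted from memory and should be checked against the original paper, and the input values are purely illustrative:

```python
import math

def runup_2pct(H0, L0, beta_f):
    """2% exceedance runup (m) from deep-water wave height H0 (m),
    deep-water wavelength L0 (m) and foreshore slope beta_f.
    Coefficients per Stockdon et al. (2006), quoted from memory."""
    setup = 0.35 * beta_f * math.sqrt(H0 * L0)                    # wave setup term
    swash = math.sqrt(H0 * L0 * (0.563 * beta_f**2 + 0.004)) / 2.0  # swash term
    return 1.1 * (setup + swash)

# illustrative storm conditions: 4 m waves, 156 m deep-water wavelength
# (roughly a 10 s period), foreshore slope 0.08
r2 = runup_2pct(4.0, 156.0, 0.08)
```

Substituting a local, shallow-water wave height for the deep-water H0 is in the spirit of the modelling choice the study investigates.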
NASA Astrophysics Data System (ADS)
Peterson, D. A.; Hyer, E. J.; Campbell, J. R.; Fromm, M. D.; Hair, J. W.; Butler, C. F.; Fenn, M. A.
2014-12-01
A variety of regional smoke forecasting applications are currently available to identify air quality, visibility, and societal impacts during large fire events. However, these systems typically assume persistent fire activity, and therefore can have large errors before, during, and after short-term periods of extreme fire behavior. This study employs a wide variety of ground, airborne, and satellite observations, including data collected during a major NASA airborne and field campaign, to examine the conditions required for both extreme spread and pyrocumulonimbus (pyroCb) development. Results highlight the importance of upper-level and nocturnal meteorology, as well as the limitations of traditional fire weather indices. Increasing values of fire radiative power (FRP) at the pixel and sub-pixel level are shown to systematically correspond to higher altitude smoke plumes, and an increased probability of injection above the boundary layer. Lidar data collected during the 2013 Rim Fire, one of the most severe fire events in California's history, show that high FRP observed during extreme spread can facilitate long-distance smoke transport, but fails to loft smoke to the altitude of a large pyroCb. The most extreme fire spread was also observed on days without pyroCb activity or significant regional convection. By incorporating additional fire events across North America, conflicting hypotheses surrounding the primary source of moisture during pyroCb development are examined. The majority of large pyroCbs, and therefore the highest direct injection altitude of smoke particles, is shown to occur with conditions very similar to those that produce dry thunderstorms. The current suite of automated forecasting applications predict only general trends in fire behavior, and specifically do not predict (1) extreme fire spread events and (2) injection of smoke to high altitudes. 
While (1) and (2) are related, results show that they are not predicted by the same set of conditions and variables. The combination of meteorology from numerical forecast models and satellite observations exhibits great potential for improving regional forecasts of fire behavior and smoke production in automated systems, especially in remote areas where detailed observations are unavailable.
Extreme values in the Chinese and American stock markets based on detrended fluctuation analysis
NASA Astrophysics Data System (ADS)
Cao, Guangxi; Zhang, Minjia
2015-10-01
This paper presents a comparative analysis of extreme values in the Chinese and American stock markets based on the detrended fluctuation analysis (DFA) algorithm, using daily data of the Shanghai Composite Index and the Dow Jones Industrial Average. The empirical results indicate that the multifractal detrended fluctuation analysis (MF-DFA) method is more objective than the traditional percentile method. The range of extreme values of the Dow Jones Industrial Average is smaller than that of the Shanghai Composite Index, and the extreme values of the Dow Jones Industrial Average are more clustered in time. The extreme values of both the Chinese and American stock markets are concentrated in 2008, consistent with the financial crisis of that year. Moreover, we investigate whether extreme events affect the cross-correlation between the Chinese and American stock markets using the multifractal detrended cross-correlation analysis algorithm. The results show that extreme events have no discernible effect on the cross-correlation between the two markets.
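A minimal numpy sketch of first-order DFA, the monofractal special case of the MF-DFA used in the paper, applied here to white noise, for which the scaling exponent should be close to 0.5:

```python
import numpy as np

def dfa(x, scales, order=1):
    """Detrended fluctuation analysis: fluctuation function F(s) per scale s."""
    y = np.cumsum(x - np.mean(x))          # integrated profile of the series
    F = []
    for s in scales:
        n_seg = len(y) // s
        segs = y[: n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        # detrend each segment with a polynomial fit, collect squared residuals
        resid = [np.mean((seg - np.polyval(np.polyfit(t, seg, order), t)) ** 2)
                 for seg in segs]
        F.append(np.sqrt(np.mean(resid)))
    return np.array(F)

rng = np.random.default_rng(1)
x = rng.standard_normal(4096)              # white noise: alpha should be ~0.5
scales = np.array([16, 32, 64, 128, 256])
F = dfa(x, scales)
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]  # scaling exponent
```

MF-DFA generalizes this by raising the segment variances to a range of moments q before averaging, so that different q probe small and large fluctuations separately.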
Isosurface Extraction in Time-Varying Fields Using a Temporal Hierarchical Index Tree
NASA Technical Reports Server (NTRS)
Shen, Han-Wei; Gerald-Yamasaki, Michael (Technical Monitor)
1998-01-01
Many high-performance isosurface extraction algorithms have been proposed in the past several years as a result of intensive research efforts. When applying these algorithms to large-scale time-varying fields, the storage overhead incurred from storing the search index often becomes overwhelming. This paper proposes an algorithm for locating isosurface cells in time-varying fields. We devise a new data structure, called the Temporal Hierarchical Index Tree, which exploits the temporal coherence that exists in a time-varying field and adaptively coalesces the cells' extreme values over time; the resulting extreme values are then used to create the isosurface cell search index. For a typical time-varying scalar data set, not only does this temporal hierarchical index tree require much less storage space, but the amount of I/O required to access the indices from disk at different time steps is also substantially reduced. We illustrate the utility and speed of our algorithm with data from several large-scale time-varying CFD simulations. Our algorithm achieves more than 80% disk-space savings compared with existing techniques, while the isosurface extraction time remains nearly optimal.
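The coalescing idea can be illustrated with a simplified sketch (not the paper's actual data structure): consecutive time steps whose (min, max) ranges barely differ are merged, so one index node with a slightly widened range serves many time steps.

```python
def coalesce_ranges(ranges, tol):
    """Merge consecutive (lo, hi) extreme-value ranges of one cell whose
    union grows by at most `tol`; returns [(start_step, end_step, lo, hi)]."""
    merged = []
    lo, hi, start = ranges[0][0], ranges[0][1], 0
    for i, (l, h) in enumerate(ranges[1:], start=1):
        new_lo, new_hi = min(lo, l), max(hi, h)
        if (new_hi - new_lo) - (hi - lo) <= tol:   # range barely widens: merge
            lo, hi = new_lo, new_hi
        else:                                       # range jumps: close the node
            merged.append((start, i - 1, lo, hi))
            lo, hi, start = l, h, i
    merged.append((start, len(ranges) - 1, lo, hi))
    return merged

# four time steps; the first three have nearly identical ranges
nodes = coalesce_ranges([(0.0, 1.0), (0.1, 1.0), (0.0, 1.1), (5.0, 9.0)], tol=0.5)
```

The trade-off is the one the abstract describes: slightly conservative (wider) ranges in exchange for far fewer stored index entries and less I/O per time step.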
Data informatics for the Detection, Characterization, and Attribution of Climate Extremes
NASA Astrophysics Data System (ADS)
Collins, W.; Wehner, M. F.; O'Brien, T. A.; Paciorek, C. J.; Krishnan, H.; Johnson, J. N.; Prabhat, M.
2015-12-01
The potential for increasing frequency and intensity of extreme phenomena including downpours, heat waves, and tropical cyclones constitutes one of the primary risks of climate change for society and the environment. The challenge of characterizing these risks is that extremes represent the "tails" of distributions of atmospheric phenomena and are, by definition, highly localized and typically relatively transient. Therefore very large volumes of observational data and projections of future climate are required to quantify their properties in a robust manner. Massive data analytics are required in order to detect individual extremes, accumulate statistics on their properties, quantify how these statistics are changing with time, and attribute the effects of anthropogenic global warming on these statistics. We describe examples of the suite of techniques the climate community is developing to address these analytical challenges. The techniques include massively parallel methods for detecting and tracking atmospheric rivers and cyclones; data-intensive extensions to generalized extreme value theory to summarize the properties of extremes; and multi-model ensembles of hindcasts to quantify the attributable risk of anthropogenic influence on individual extremes. We conclude by highlighting examples of these methods developed by our CASCADE (Calibrated and Systematic Characterization, Attribution, and Detection of Extremes) project.
NASA Astrophysics Data System (ADS)
Li, Donghuan; Zhou, Tianjun; Zou, Liwei; Zhang, Wenxia; Zhang, Lixia
2018-02-01
Extreme high-temperature events have large socioeconomic and human health impacts. East Asia (EA) is a populous region, and it is crucial to assess the changes in extreme high-temperature events in this region under different climate change scenarios. The Community Earth System Model low-warming experiment data were applied to investigate the changes in the mean and extreme high temperatures in EA under 1.5°C and 2°C warming conditions above preindustrial levels. The results show that the magnitude of warming in EA is approximately 0.2°C higher than the global mean. Most populous subregions, including eastern China, the Korean Peninsula, and Japan, will see more intense, more frequent, and longer-lasting extreme temperature events under 1.5°C and 2°C warming. The 0.5°C lower warming will help avoid 35%-46% of the increases in extreme high-temperature events in terms of intensity, frequency, and duration in EA with maximal avoidance values (37%-49%) occurring in Mongolia. Thus, it is beneficial for EA to limit the warming target to 1.5°C rather than 2°C.
NASA Astrophysics Data System (ADS)
Lazar, Boaz; Erez, Jonathan
1990-12-01
Extreme depletions in the 13C content of the total dissolved inorganic carbon (CT) were found in brines overlying microbial mat communities. Total alkalinity (AT) and CT in the brines suggest that intense photosynthetic activity of the microbial mat communities depletes the CT from the brine. We suggest that this depletion drives a large, kinetic, negative fractionation of carbon isotopes similar to that observed in highly alkaline solutions. In brines of extreme salinity where microbial mat communities no longer exist, the 13C content of the CT increases, probably because photosynthesis no longer dominates the gas-exchange processes. This mechanism explains light carbon-isotope compositions of carbonate rocks from evaporitic sections and bears on the interpretation of δ13C values in bedded stromatolitic limestones that are ca. 3.5 b.y. old.
Sensitivity Analysis of Expected Wind Extremes over the Northwestern Sahara and High Atlas Region.
NASA Astrophysics Data System (ADS)
Garcia-Bustamante, E.; González-Rouco, F. J.; Navarro, J.
2017-12-01
A robust statistical framework in the scientific literature allows for the estimation of probabilities of occurrence of severe wind speeds and wind gusts, but it does not prevent large uncertainties in the particular numerical estimates. An analysis of such uncertainties is thus required. A large portion of this uncertainty arises from the fact that historical observations are inherently shorter than the timescales of interest for the analysis of return periods. Additional uncertainties stem from the different choices of probability distributions and other aspects related to methodological issues or the physical processes involved. The present study focuses on historical observations over the Ouarzazate Valley (Morocco) and on a high-resolution regional simulation of the wind in the area of interest. The aim is to provide extreme wind speed and wind gust return values and confidence ranges based on a systematic sampling of the uncertainty space for return periods of up to 120 years.
Valuing happiness is associated with bipolar disorder.
Ford, Brett Q; Mauss, Iris B; Gruber, June
2015-04-01
Although people who experience happiness tend to have better psychological health, people who value happiness to an extreme tend to have worse psychological health, including more depression. We propose that the extreme valuing of happiness may be a general risk factor for mood disturbances, both depressive and manic. To test this hypothesis, we examined the relationship between the extreme valuing of happiness and risk for, diagnosis of, and illness course for bipolar disorder (BD). Supporting our hypothesis, the extreme valuing of happiness was associated with a measure of increased risk for developing BD (Studies 1 and 2), increased likelihood of past diagnosis of BD (Studies 2 and 3), and worse prospective illness course in BD (Study 3), even when controlling for current mood symptoms (Studies 1-3). These findings indicate that the extreme valuing of happiness is associated with and even predicts BD. Taken together with previous evidence, these findings suggest that the extreme valuing of happiness is a general risk factor for mood disturbances. More broadly, what emotions people strive to feel may play a critical role in psychological health. (c) 2015 APA, all rights reserved.
Matrix Perturbation Techniques in Structural Dynamics
NASA Technical Reports Server (NTRS)
Caughey, T. K.
1973-01-01
Matrix perturbation techniques are developed which can be used in the dynamical analysis of structures where the range of numerical values in the matrices is extreme, or where the nature of the damping matrix requires that complex-valued eigenvalues and eigenvectors be used. The techniques can be used to advantage in a variety of fields, such as earthquake engineering, ocean engineering, and aerospace engineering, and in other fields concerned with the dynamical analysis of large complex structures or systems of second-order differential equations. A number of simple examples are included to illustrate the techniques.
NASA Astrophysics Data System (ADS)
Panziera, Luca; Gabella, Marco; Zanini, Stefano; Hering, Alessandro; Germann, Urs; Berne, Alexis
2016-06-01
This paper presents a regional extreme rainfall analysis based on 10 years of radar data for the 159 regions adopted for official natural hazard warnings in Switzerland. Moreover, a nowcasting tool aimed at issuing heavy precipitation regional alerts is introduced. The two topics are closely related, since the extreme rainfall analysis provides the thresholds used by the nowcasting system for the alerts. Warm and cold seasons' monthly maxima of several statistical quantities describing regional rainfall are fitted to a generalized extreme value distribution in order to derive the precipitation amounts corresponding to sub-annual return periods for durations of 1, 3, 6, 12, 24 and 48 h. It is shown that regional return levels exhibit a large spatial variability in Switzerland, and that their spatial distribution strongly depends on the duration of the aggregation period: for accumulations of 3 h and shorter, the largest return levels are found over the northerly alpine slopes, whereas for longer durations the southern Alps exhibit the largest values. The inner alpine chain shows the lowest values, in agreement with previous rainfall climatologies. The nowcasting system presented here is designed to issue heavy-rainfall alerts for a large variety of end users who are interested in different precipitation characteristics and regions, such as small urban areas, remote alpine catchments or administrative districts. The alerts are issued not only if the rainfall measured in the immediate past or forecast in the near future exceeds some predefined thresholds, but also as soon as the sum of past and forecast precipitation exceeds threshold values. This precipitation total, in fact, has primary importance in applications for which antecedent rainfall is as important as the predicted rainfall, such as urban-flood early-warning systems.
The rainfall fields, the statistical quantity representing regional rainfall and the frequency of alerts issued in case of continuous threshold exceedance are some of the configurable parameters of the tool. The analysis of the urban flood which occurred in the city of Schaffhausen in May 2013 suggests that this alert tool might have complementary skill with respect to radar-based thunderstorm nowcasting systems for storms which do not show a clear convective signature.
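The alert logic described above, triggering on the total of past and forecast accumulation rather than on either alone, can be sketched as follows (the threshold and accumulation values are illustrative, not from the paper):

```python
def issue_alert(past_mm, forecast_mm, threshold_mm):
    """Trigger a heavy-rainfall alert as soon as past plus forecast
    accumulation reaches the regional threshold; for non-negative inputs
    this also covers either term alone exceeding the threshold."""
    return past_mm + forecast_mm >= threshold_mm

# 30 mm already fallen plus 15 mm forecast exceeds a 40 mm regional threshold
alert = issue_alert(30.0, 15.0, 40.0)
```

In the operational tool the threshold would be the sub-annual return level for the region and duration in question, derived from the GEV analysis.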
Exchangeability, extreme returns and Value-at-Risk forecasts
NASA Astrophysics Data System (ADS)
Huang, Chun-Kai; North, Delia; Zewotir, Temesgen
2017-07-01
In this paper, we propose a new approach to extreme value modelling for the forecasting of Value-at-Risk (VaR). In particular, the block maxima and the peaks-over-threshold methods are generalised to exchangeable random sequences. This caters for the dependencies, such as serial autocorrelation, observed empirically in financial returns. In addition, the approach allows for parameter variations within each VaR estimation window. Empirical prior distributions of the extreme value parameters are obtained by resampling procedures. We compare our VaR forecasts with those of the unconditional extreme value theory (EVT) approach and the conditional GARCH-EVT model for robust conclusions.
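As a baseline for comparison, the unconditional EVT approach mentioned at the end can be sketched with a peaks-over-threshold fit on synthetic heavy-tailed losses (the paper's generalisation to exchangeable sequences and its resampled parameter priors are not reproduced here):

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(42)
losses = -rng.standard_t(df=4, size=5000)   # heavy-tailed synthetic daily losses

u = np.quantile(losses, 0.95)               # peaks-over-threshold threshold
exceed = losses[losses > u] - u
xi, _, beta = genpareto.fit(exceed, floc=0.0)  # GPD fit to the exceedances

# unconditional EVT estimate of the 99% Value-at-Risk
p, n, n_u = 0.99, losses.size, exceed.size
var_99 = u + (beta / xi) * (((n / n_u) * (1.0 - p)) ** (-xi) - 1.0)
```

The paper's contribution can be read against this baseline: the GPD parameters above are treated as fixed within the window, whereas the exchangeable-sequence approach lets them vary and equips them with empirical priors.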
DOE Office of Scientific and Technical Information (OSTI.GOV)
R. G. Little
1999-03-01
The Idaho National Engineering and Environmental Laboratory (INEEL), through the US Department of Energy (DOE), has proposed that a large-scale wind test facility (LSWTF) be constructed to study, at full scale, the behavior of low-rise structures under simulated extreme wind conditions. To determine the need for, and potential benefits of, such a facility, the Idaho Operations Office of the DOE requested that the National Research Council (NRC) perform an independent assessment of the role and potential value of an LSWTF in the overall context of wind engineering research. The NRC established the Committee to Review the Need for a Large-scale Test Facility for Research on the Effects of Extreme Winds on Structures, under the auspices of the Board on Infrastructure and the Constructed Environment, to perform this assessment. This report conveys the results of the committee's deliberations as well as its findings and recommendations. Data developed at large scale would enhance the understanding of how structures, particularly light-frame structures, are affected by extreme winds (e.g., hurricanes, tornadoes, severe thunderstorms, and other events). With a large-scale wind test facility, full-sized structures, such as site-built or manufactured housing and small commercial or industrial buildings, could be tested under a range of wind conditions in a controlled, repeatable environment. At this time, the US has no facility specifically constructed for this purpose. During the course of this study, the committee was confronted by three difficult questions: (1) does the lack of a facility equate to a need for the facility? (2) is need alone sufficient justification for the construction of a facility? and (3) would the benefits derived from information produced in an LSWTF justify the costs of producing that information? The committee's evaluation of the need and justification for an LSWTF was shaped by these realities.
NASA Astrophysics Data System (ADS)
Nguyen, Minh D.; Houwman, Evert; Dekkers, Matthijn; Schlom, Darrell; Rijnders, Guus
2017-07-01
All-oxide free-standing cantilevers were fabricated with epitaxial (001)-oriented Pb(Zr0.52Ti0.48)O3 (PZT) and Pb(Zr0.52Ti0.48)0.99Nb0.01O3 (PNZT) as piezoelectric layers and SrRuO3 electrodes. The ferroelectric and piezoelectric hysteresis loops were measured, and from the zero-bias values the figures of merit (FOMs) for piezoelectric energy-harvesting systems were calculated. For the PNZT cantilever, an extremely large value, FOM = 55 GPa, was obtained. This very high value is due to the large shifts of the hysteresis loops, such that the zero-bias piezoelectric coefficient e31,f is maximal while the zero-bias dielectric constant is strongly reduced compared with the value in the undoped PZT device. The results show that by engineering the self-bias field, the energy-harvesting properties of piezoelectric systems can be increased significantly.
Alves, Gelio; Yu, Yi-Kuo
2016-09-01
There is a growing trend for biomedical researchers to extract evidence and draw conclusions from mass spectrometry-based proteomics experiments, the cornerstone of which is peptide identification. Inaccurate assignments of peptide identification confidence thus may have far-reaching and adverse consequences. Although some peptide identification methods report accurate statistics, they have been limited to certain types of scoring function. The extreme value statistics based method, while more general in the scoring functions it allows, demands accurate parameter estimates and requires, at least in its original design, excessive computational resources. Improving the parameter estimate accuracy and reducing the computational cost for this method has two advantages: it provides another feasible route to accurate significance assessment, and it could provide reliable statistics for scoring functions yet to be developed. We have formulated and implemented an efficient algorithm for calculating the extreme value statistics for peptide identification applicable to various scoring functions, bypassing the need for searching large random databases. The source code, implemented in C++ under Linux, is available for download at ftp://ftp.ncbi.nlm.nih.gov/pub/qmbp/qmbp_ms/RAId/RAId_Linux_64Bit. Contact: yyu@ncbi.nlm.nih.gov. Supplementary data are available at Bioinformatics online. Published by Oxford University Press 2016. This work is written by US Government employees and is in the public domain in the US.
Hannah, J.L.; Stein, H.J.
1986-01-01
Quartz phenocrysts from 31 granitoid stocks in the Colorado Mineral Belt yield δ18O values less than 10.4‰, with most values between 9.3 and 10.4‰. An average magmatic value of about 8.5‰ is suggested. The stocks resemble A-type granites; these data support magma genesis by partial melting of previously depleted, fluorine-enriched, lower crustal granulites, followed by extreme differentiation and volatile evolution in the upper crust. Subsolidus interaction of isotopically light water with the stocks has reduced most feldspar and whole-rock δ18O values. Unaltered samples from Climax-type molybdenum-bearing granites, however, show no greater isotopic disturbance than samples from unmineralized stocks. Although meteoric water certainly played a role in post-mineralization alteration, particularly in feldspars, it is not required during high-temperature mineralization processes. We suggest that slightly low δ18O values in some vein and replacement minerals associated with molybdenum mineralization may have resulted from equilibration with isotopically light magmatic water and/or heavy-isotope depletion of the ore fluid by precipitation of earlier phases. Accumulation of sufficient quantities of isotopically light magmatic water to produce the measured depletions of 18O requires extreme chemical stratification in a large magma reservoir. Upward migration of a highly fractionated, volatile-rich magma into a small apical Climax-type diapir, including large-scale transport of silica, alkalis, molybdenum, and other vapor-soluble elements, may occur with depression of the solidus temperature and reduction of magma viscosity by fluorine. Climax-type granites may provide examples of 18O depletion in magmatic systems without meteoric water influx. © 1986 Springer-Verlag.
Statistic analysis of annual total ozone extremes for the period 1964-1988
NASA Technical Reports Server (NTRS)
Krzyscin, Janusz W.
1994-01-01
Annual extremes of total column ozone (for the period 1964-1988) from a network of 29 Dobson stations have been examined using extreme value analysis. The extremes have been calculated as the highest deviation of daily mean total ozone from its long-term monthly mean, normalized by the monthly standard deviation. The extremes have been selected from direct-Sun total ozone observations only; extremes resulting from abrupt changes in ozone (day-to-day changes greater than 20 percent) have not been considered. The ordered extremes (maxima in ascending order, minima in descending order) have been fitted to one of the three forms of the Fisher-Tippett extreme value distribution by the nonlinear least-squares (Levenberg-Marquardt) method. We have found that the ordered extremes from a majority of Dobson stations lie close to Fisher-Tippett type III. Extreme value analysis of the composite annual extremes (combined from averages of the annual extremes selected at individual stations) has shown that the composite maxima are fitted by Fisher-Tippett type III and the composite minima by Fisher-Tippett type I. The difference between the Fisher-Tippett types of the composite extremes seems to be related to the downward ozone trend. Extreme value prognoses for the period 1964-2014 (derived from data taken at all analyzed stations, the North American stations, and the European stations) reveal that the prognostic extremes are close to the largest annual extremes in the period 1964-1988, and there are only small regional differences in the prognoses.
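The fitting procedure described, ordered extremes fitted to a Fisher-Tippett form by nonlinear least squares with the Levenberg-Marquardt method, can be sketched for the type I (Gumbel) case on synthetic annual extremes; the plotting-position formula and sample values are illustrative choices, not taken from the study:

```python
import numpy as np
from scipy.optimize import curve_fit

def gumbel_cdf(x, mu, beta):
    """Fisher-Tippett type I (Gumbel) distribution function."""
    return np.exp(-np.exp(-(x - mu) / beta))

rng = np.random.default_rng(7)
annual_max = np.sort(rng.gumbel(loc=2.0, scale=0.8, size=25))  # ordered maxima
prob = (np.arange(1, 26) - 0.44) / (25 + 0.12)   # Gringorten plotting positions

# Levenberg-Marquardt is curve_fit's default method for unconstrained problems
(mu, beta), _ = curve_fit(gumbel_cdf, annual_max, prob,
                          p0=[annual_max.mean(), annual_max.std()])
```

Repeating the fit with the type II and type III forms and comparing residuals is one simple way to decide which Fisher-Tippett family the ordered extremes lie closest to.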
NASA Astrophysics Data System (ADS)
Guimarães Nobre, Gabriela; Arnbjerg-Nielsen, Karsten; Rosbjerg, Dan; Madsen, Henrik
2016-04-01
Traditionally, flood risk assessment studies have been carried out from a univariate frequency analysis perspective. However, statistical dependence between hydrological variables, such as extreme rainfall and extreme sea surge, is plausible, since both variables are to some extent driven by common meteorological conditions. Aiming to overcome this limitation, multivariate statistical techniques have the potential to combine different sources of flooding in the investigation. The aim of this study was to apply a range of statistical methodologies for analyzing combined extreme hydrological variables that can lead to coastal and urban flooding. The study area is the Elwood Catchment, a highly urbanized catchment located in the city of Port Phillip, Melbourne, Australia. The first part of the investigation dealt with the marginal extreme value distributions. Two approaches to extracting extreme value series were applied (annual maximum and partial duration series), and different probability distribution functions were fitted to the observed samples. Results obtained using the generalized Pareto distribution demonstrate the ability of the Pareto family to model the extreme events. Advancing into multivariate extreme value analysis, an investigation of the asymptotic properties of extremal dependence was first carried out. As a weak positive asymptotic dependence between the bivariate extreme pairs was found, the conditional method proposed by Heffernan and Tawn (2004) was chosen. This approach is suitable for modelling bivariate extreme values which are relatively unlikely to occur together. The results show that the probability of an extreme sea surge occurring during a one-hour extreme precipitation event (or vice versa) can be twice as great as would be estimated under the assumption of independence. Presuming independence between these two variables would therefore result in severe underestimation of the flooding risk in the study area.
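The closing point, that assuming independence can understate joint exceedance probabilities severalfold, can be illustrated with a toy pair of variables sharing a common driver (a simple Gaussian construction, not the Heffernan and Tawn conditional model used in the study):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200_000
z = rng.standard_normal(n)                 # shared meteorological driver
rain = 0.6 * z + rng.standard_normal(n)    # extreme-rainfall proxy
surge = 0.6 * z + rng.standard_normal(n)   # sea-surge proxy

u_r, u_s = np.quantile(rain, 0.99), np.quantile(surge, 0.99)
p_joint = np.mean((rain > u_r) & (surge > u_s))  # observed joint exceedance
p_indep = 0.01 * 0.01                            # what independence predicts
ratio = p_joint / p_indep                        # underestimation factor
```

Even modest common-driver dependence inflates the joint exceedance probability well above the product of the marginals, which is exactly the kind of compound-flooding risk a univariate analysis misses.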
NASA Astrophysics Data System (ADS)
Gavrishchaka, V. V.; Ganguli, S. B.
2001-12-01
Reliable forecasting of rare events in a complex dynamical system is a challenging problem that is important for many practical applications. Due to the nature of rare events, the data set available for constructing a statistical and/or machine learning model is often very limited and incomplete. Therefore, many widely used approaches, including robust algorithms such as neural networks, can easily become inadequate for rare-event prediction. Moreover, in many practical cases, models with high-dimensional inputs are required. This limits applications of existing rare-event modeling techniques (e.g., extreme value theory) that focus on univariate cases and are not easily extended to multivariate ones. The support vector machine (SVM) is a machine learning system that can provide optimal generalization from very limited and incomplete training data sets and can efficiently handle high-dimensional data. These features may make the SVM suitable for modeling rare events in some applications. We have applied an SVM-based system to the problem of large-amplitude substorm prediction and extreme event forecasting in stock and currency exchange markets. Encouraging preliminary results will be presented, and other possible applications of the system will be discussed.
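As a toy illustration of the point about scarce training data, the sketch below trains an SVM on a synthetic, heavily imbalanced classification task; the features, the event rule and the class weighting are all invented here and bear no relation to the authors' substorm or market inputs:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)
# 20-dimensional hypothetical "precursor" features; positives (~1-2%) are rare events
n, d = 2000, 20
X = rng.normal(size=(n, d))
y = (X[:, 0] + 0.5 * X[:, 1] > 2.5).astype(int)

# class_weight='balanced' reweights the scarce positive class so the SVM
# does not collapse to the trivial "never an event" predictor
clf = SVC(kernel="rbf", class_weight="balanced").fit(X, y)
recall = (clf.predict(X[y == 1]) == 1).mean()
print(y.sum(), recall)
```

This reports in-sample recall only; a real rare-event study would hold out data and use skill measures suited to imbalance.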
NASA Technical Reports Server (NTRS)
Finley, D.; Malina, R. F.; Bowyer, S.
1985-01-01
The four flight Wolter-Schwarzschild mirrors currently under fabrication for the Extreme Ultraviolet Explorer (EUVE) satellite are described. The principal figuring operation of these grazing incidence metal mirrors (gold over nickel on an aluminum substrate) is carried out by diamond turning at the Lawrence Livermore National Laboratories. Turning has been accomplished and optical testing results analyzed for three of the mirrors. As-turned values of 1.7 arc sec full width at half maximum (FWHM) and half energy width (HEW) of 5 arc seconds in the visible have been achieved. These results illustrate the great potential of precision fabrication technology for the production of large grazing incidence optics.
Seasonal extreme value statistics for precipitation in Germany
NASA Astrophysics Data System (ADS)
Fischer, Madlen; Rust, Henning W.; Ulbrich, Uwe
2013-04-01
Extreme precipitation has a strong influence on environment, society and economy. It leads to large damage due to floods, mudslides, increased erosion or hail. While standard annual return levels are important for hydrological structures, seasonally resolved return levels provide additional information for risk management, e.g., for the agricultural sector. For 1208 stations in Germany, we calculate monthly resolved return levels. Instead of estimating parameters separately for every month of the year, we use a non-stationary approach and benefit from smoothly varying return levels throughout the year. This natural approach is more suitable for characterising the seasonal variability of extreme precipitation and leads to more accurate return level estimates. Harmonic functions of different orders are used to describe the seasonal variation of the GEV parameters, and cross-validation is used to determine a suitable model for all stations. Finally, particularly vulnerable regions and the associated months are investigated in more detail.
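The harmonic, non-stationary GEV idea can be sketched by letting the location parameter vary as a first-order harmonic of the month and maximizing the likelihood directly. The data and starting values below are synthetic placeholders, and only the location parameter is made seasonal (the study also considers harmonics in the other GEV parameters and higher orders):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import genextreme

rng = np.random.default_rng(2)
# Synthetic monthly maxima (mm) with a smooth seasonal cycle in the GEV location
months = np.tile(np.arange(12), 50)
mu_true = 30 + 10 * np.cos(2 * np.pi * (months - 6) / 12)
x = genextreme.rvs(c=-0.1, loc=mu_true, scale=5.0, random_state=rng)

def nll(p):
    # First-order harmonic in the location; scale and shape held constant
    mu0, a, b, log_sig, c = p
    mu = mu0 + a * np.cos(2 * np.pi * months / 12) + b * np.sin(2 * np.pi * months / 12)
    return -genextreme.logpdf(x, c=c, loc=mu, scale=np.exp(log_sig)).sum()

res = minimize(nll, x0=[30.0, 0.0, 0.0, np.log(5.0), 0.0], method="Nelder-Mead",
               options={"maxiter": 5000})
mu0, a, b, log_sig, c = res.x
amplitude = np.hypot(a, b)    # seasonal swing of the fitted location parameter
print(amplitude)
```

Fitting one smooth model to all months borrows strength across the year, which is the source of the more accurate return level estimates mentioned above.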
NASA Astrophysics Data System (ADS)
Rieder, H. E.; Staehelin, J.; Maeder, J. A.; Peter, T.; Ribatet, M.; Davison, A. C.; Stübi, R.; Weihs, P.; Holawe, F.
2010-10-01
In this study the frequency of days with extreme low (termed ELOs) and extreme high (termed EHOs) total ozone values and their influence on mean values and trends are analyzed for the world's longest total ozone record (Arosa, Switzerland). The results show (i) an increase in ELOs and (ii) a decrease in EHOs during the last decades and (iii) that the overall trend during the 1970s and 1980s in total ozone is strongly dominated by changes in these extreme events. After removing the extremes, the time series shows a strongly reduced trend (reduction by a factor of 2.5 for trend in annual mean). Excursions in the frequency of extreme events reveal "fingerprints" of dynamical factors such as ENSO or NAO, and chemical factors, such as cold Arctic vortex ozone losses, as well as major volcanic eruptions of the 20th century (Gunung Agung, El Chichón, Mt. Pinatubo). Furthermore, atmospheric loading of ozone depleting substances leads to a continuous modification of column ozone in the Northern Hemisphere also with respect to extreme values (partly again in connection with polar vortex contributions). Application of extreme value theory allows the identification of many more such "fingerprints" than conventional time series analysis of annual and seasonal mean values. The analysis shows in particular the strong influence of dynamics, revealing that even moderate ENSO and NAO events have a discernible effect on total ozone. Overall the approach to extremal modelling provides new information on time series properties, variability, trends and the influence of dynamics and chemistry, complementing earlier analyses focusing only on monthly (or annual) mean values.
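The core diagnostic of the study, counting extreme-low and extreme-high days and re-estimating the trend with the extremes removed, can be imitated on synthetic data. The series, thresholds and trend mechanism below are invented solely to show the mechanics:

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical daily total-ozone series (Dobson units): 10% of days carry a
# growing negative excursion, so the trend lives mostly in the low tail
years = np.repeat(np.arange(1970, 2010), 365)
t = (years - years.min()) / 40.0
ozone = rng.normal(330, 25, years.size) - 40 * t * (rng.random(years.size) < 0.1)

lo, hi = np.percentile(ozone, [5, 95])
elo = ozone < lo          # extreme-low days (ELOs)
eho = ozone > hi          # extreme-high days (EHOs)

trend_all = np.polyfit(t, ozone, 1)[0]                 # trend of the full series
core = ~elo & ~eho
trend_core = np.polyfit(t[core], ozone[core], 1)[0]    # trend with extremes removed
print(trend_all, trend_core)
```

In this construction the downward trend is carried by the low tail, so trimming both tails flattens the fitted slope, which is the qualitative effect (a factor-2.5 trend reduction) reported for the Arosa record.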
NASA Astrophysics Data System (ADS)
Rieder, H. E.; Staehelin, J.; Maeder, J. A.; Peter, T.; Ribatet, M.; Davison, A. C.; Stübi, R.; Weihs, P.; Holawe, F.
2010-05-01
In this study the frequency of days with extreme low (termed ELOs) and extreme high (termed EHOs) total ozone values and their influence on mean values and trends are analyzed for the world's longest total ozone record (Arosa, Switzerland). The results show (a) an increase in ELOs and (b) a decrease in EHOs during the last decades and (c) that the overall trend during the 1970s and 1980s in total ozone is strongly dominated by changes in these extreme events. After removing the extremes, the time series shows a strongly reduced trend (reduction by a factor of 2.5 for trend in annual mean). Excursions in the frequency of extreme events reveal "fingerprints" of dynamical factors such as ENSO or NAO, and chemical factors, such as cold Arctic vortex ozone losses, as well as major volcanic eruptions of the 20th century (Gunung Agung, El Chichón, Mt. Pinatubo). Furthermore, atmospheric loading of ozone depleting substances leads to a continuous modification of column ozone in the Northern Hemisphere also with respect to extreme values (partly again in connection with polar vortex contributions). Application of extreme value theory allows the identification of many more such "fingerprints" than conventional time series analysis of annual and seasonal mean values. The analysis shows in particular the strong influence of dynamics, revealing that even moderate ENSO and NAO events have a discernible effect on total ozone. Overall the approach to extremal modelling provides new information on time series properties, variability, trends and the influence of dynamics and chemistry, complementing earlier analyses focusing only on monthly (or annual) mean values.
NASA Technical Reports Server (NTRS)
Hopkins, Randall C.; Benzing, Daniel A.
1998-01-01
Improvements in uncertainties in the values of radiant intensity (I) can be accomplished mainly by improvements in the calibration process and by minimizing the difference between the background and engine plume radiance. For engine tests in which the plume is extremely bright, the difference in luminance between the calibration lamp and the engine plume radiance can be so large as to cause relatively large uncertainties in the values of R. This is due to the small aperture necessary on the receiving optics to avoid saturating the instrument. However, this is not a problem with the SSME engine, since liquid oxygen/hydrogen combustion is not as bright as that of some other fuels. Applying the instrumentation to other types of engine tests may require a much brighter calibration lamp.
Lee, Miryoung; Pascoe, John M; McNicholas, Caroline I
2017-01-01
Objectives The prevalence of extreme prematurity at birth has increased, but little research has examined its impact on developmental outcomes in large representative samples within the United States. This study examined the association of extreme prematurity with kindergarteners' reading skills, mathematics skills and fine motor skills. Methods The Early Childhood Longitudinal Study-Birth Cohort, a representative sample of US children born in 2001, was analyzed for this study. Early reading, mathematics and fine motor skills were compared among 200 extremely premature children (EPC) (gestational age <28 wks or birthweight <1000 g), 500 premature children (PC), and 4300 term children (TC) (≥37 wks or ≥2500 g). Generalized linear regression analyses included sampling weights, children's age, race, sex, and general health status, and parental marital status and education among singleton children. Results At age 5 years, EPC were 2.6 (95% CI 1.7-3.8) times more likely to fail to build a gate and 3.1 (95% CI 1.6-5.8) times more likely to fail all four drawing tasks compared to TC (p values <0.001). Fine motor performance of PC (failed to build a gate, 1.3 [95% CI 1.0-1.7]; failed to draw all four shapes, 1.1 [95% CI 0.8-1.6]) was not significantly different from TC. The mean early reading scale score (36.8 [SE 1.3]) of EPC was 4.0 points lower than that of TC (p value < 0.0001), while the mean reading score (39.9 [SE 1.4]) of PC was not significantly different from TC (40.8 [SE 1.1]). Mean mathematics scale scores were significantly lower for both EPC (35.5 [SE 1.0], p value < 0.001) and PC (39.8 [SE 0.8], p value = 0.023) compared to TC (41.0 [SE 0.6]). Conclusions for Practice Extreme prematurity at birth was associated with cognitive and fine motor delays at age 5 years. This suggests that, based on a nationally representative sample of infants, the biological risk of extreme prematurity persists after adjusting for other factors related to development.
Extreme summer temperatures in Iberia: health impacts and associated synoptic conditions
NASA Astrophysics Data System (ADS)
García-Herrera, R.; Díaz, J.; Trigo, R. M.; Hernández, E.
2005-02-01
This paper examines the effect of extreme summer temperatures on daily mortality in two large cities of Iberia: Lisbon (Portugal) and Madrid (Spain). Daily mortality and meteorological variables are analysed using the same methodology based on Box-Jenkins models. Results reveal that in both cases there is a triggering effect on mortality when the maximum daily temperature exceeds a given threshold (34°C in Lisbon and 36°C in Madrid). The impact of the most intense heat events is very similar for both cities, with significant mortality values occurring up to 3 days after the temperature threshold has been surpassed. This impact is measured as the percentage increase of mortality associated with a 1°C increase above the threshold temperature. In this respect, Lisbon shows a higher impact, 31%, as compared with Madrid at 21%. The difference can be attributed to demographic and socio-economic factors. Furthermore, the longer life span of Iberian women is critical to explaining why, in both cities, females are more susceptible than males to heat effects, with almost double the mortality impact value. The analysis of Sea Level Pressure (SLP), 500 hPa geopotential height and temperature fields reveals that, despite being relatively close to each other, Lisbon and Madrid have relatively different synoptic circulation anomalies associated with their respective extreme summer temperature days. The SLP field reveals higher anomalies for Lisbon, but extending over a smaller area. Extreme values in Madrid seem to require a more westerly location of the Azores High, embracing a greater area over Europe, even if it is not as deep as for Lisbon. The origin of the hot and dry air masses that usually lead to extreme heat days in both cities is located in Northern Africa. However, while Madrid maxima require wind blowing directly from the south, transporting heat from Southern Spain and Northern Africa, Lisbon maxima occur under more easterly conditions, when Northern African air flows over the central Iberian plateau, which has previously been heated.
Nonstationary Extreme Value Analysis in a Changing Climate: A Software Package
NASA Astrophysics Data System (ADS)
Cheng, L.; AghaKouchak, A.; Gilleland, E.
2013-12-01
Numerous studies show that climatic extremes increased substantially in the second half of the 20th century. For this reason, analysis of extremes under a nonstationary assumption has received a great deal of attention. This paper presents a software package developed for the estimation of return levels, return periods, and risks of climatic extremes in a changing climate. This MATLAB software package offers tools for the analysis of climate extremes under both stationary and non-stationary assumptions. The Nonstationary Extreme Value Analysis package (hereafter, NEVA) provides an efficient and generalized framework for analyzing extremes using Bayesian inference. NEVA estimates the extreme value parameters using a Differential Evolution Markov Chain (DE-MC) approach, which combines the genetic algorithm Differential Evolution (DE) for global optimization over the real parameter space with Markov Chain Monte Carlo (MCMC), and which offers simplicity, speed of calculation and better convergence than conventional MCMC. NEVA also provides confidence intervals and uncertainty bounds for the estimated return levels based on the sampled parameters. NEVA integrates extreme value design concepts, data analysis tools, optimization and visualization, explicitly designed to facilitate the analysis of extremes in the geosciences. The generalized input and output files of this software package make it attractive for users across different fields. Both the stationary and nonstationary components of the package are validated for a number of case studies using empirical return levels. The results show that NEVA reliably describes extremes and their return levels.
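The Bayesian return-level idea behind NEVA can be sketched with a plain random-walk Metropolis sampler standing in for NEVA's DE-MC (the DE-MC proposal, which perturbs each chain with differences of other chains, is more involved); the data, priors and tuning below are all invented for illustration:

```python
import numpy as np
from scipy.stats import genextreme, norm

rng = np.random.default_rng(4)
# 80 synthetic annual maxima standing in for an observed record
x = genextreme.rvs(c=-0.1, loc=50.0, scale=10.0, size=80, random_state=rng)

def log_post(p):
    mu, log_sig, c = p
    # Flat priors on location and log-scale; weak normal prior on the shape
    return norm.logpdf(c, 0.0, 0.5) + \
        genextreme.logpdf(x, c=c, loc=mu, scale=np.exp(log_sig)).sum()

# Plain random-walk Metropolis; NEVA itself uses the DE-MC variant of MCMC
p = np.array([x.mean(), np.log(x.std()), 0.0])
lp = log_post(p)
samples = []
for _ in range(4000):
    prop = p + rng.normal(0.0, [1.0, 0.05, 0.05])
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:   # Metropolis accept/reject
        p, lp = prop, lp_prop
    samples.append(p.copy())
post = np.array(samples[2000:])               # discard burn-in

# Posterior uncertainty band for the 100-year return level
rl100 = genextreme.ppf(1 - 1 / 100, c=post[:, 2],
                       loc=post[:, 0], scale=np.exp(post[:, 1]))
lo, hi = np.percentile(rl100, [5, 95])
print(lo, hi)
```

The posterior sample of parameters translates directly into an uncertainty band for the 100-year return level, which is the kind of output NEVA reports.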
Applied extreme-value statistics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kinnison, R.R.
1983-05-01
The statistical theory of extreme values is a well established part of theoretical statistics. Unfortunately, it is seldom part of applied statistics and is infrequently a part of statistical curricula except in advanced studies programs. This has resulted in the impression that it is difficult to understand and not of practical value. In recent environmental and pollution literature, several short articles have appeared with the purpose of documenting all that is necessary for the practical application of extreme value theory to field problems (for example, Roberts, 1979). These articles are so concise that only a statistician can recognize all the subtleties and assumptions necessary for the correct use of the material presented. The intent of this text is to expand upon several recent articles, and to provide the necessary statistical background so that the non-statistician scientist can recognize an extreme value problem when it occurs in his work, be confident in handling simple extreme value problems himself, and know when the problem is statistically beyond his capabilities and requires consultation.
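The kind of simple extreme value problem the text wants a non-statistician to handle is, for example, fitting a Gumbel distribution to annual maxima and reading off a return level; the numbers here are synthetic:

```python
import numpy as np
from scipy.stats import gumbel_r

rng = np.random.default_rng(5)
# Hypothetical annual maximum pollutant concentrations (units arbitrary)
annual_max = gumbel_r.rvs(loc=100.0, scale=15.0, size=40, random_state=rng)

loc, scale = gumbel_r.fit(annual_max)
# T-year return level: the value exceeded with probability 1/T in any year
T = 50
rl = gumbel_r.ppf(1 - 1 / T, loc=loc, scale=scale)
print(rl)
```

The Gumbel model is the light-tailed special case of the GEV; checking whether the shape parameter really is near zero is one of the subtleties the text warns about.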
Compound Extremes and Bunched Black (or Grouped Grey) Swans.
NASA Astrophysics Data System (ADS)
Watkins, Nicholas
2013-04-01
Observed "wild" natural fluctuations may differ substantially in their character. Some events may be genuinely unforeseen (and unforeseeable), as with Taleb's "black swans". These may occur singly, or may have their impact further magnified by being "bunched" in time. Some of the others may, however, be the rare extreme events from a light-tailed underlying distribution. Studying their occurrence may then be tractable with the methods of extreme value theory [e.g. Coles, 2001], suitably adapted to allow correlation if that is observed to be present. Yet others may belong to a third broad class, described in today's presentation [reviewed in Watkins, GRL Frontiers, 2013, doi: 10.1002/grl.50103]. Such "bursty" time series may show comparatively frequent high-amplitude events, and/or long range correlations between successive values. The frequent large values due to the first of these effects, modelled in economics by Mandelbrot in 1963 using heavy-tailed probability distributions, can give rise to an "IPCC type I" burst composed of successive wild events. Conversely, long range dependence, even in a light-tailed Gaussian model like Mandelbrot and van Ness' fractional Brownian motion, can integrate "mild" events into an extreme "IPCC type III" burst. I will show how a standard statistical time series model, linear fractional stable motion (LFSM), which descends from the two special cases advocated by Mandelbrot, allows these two effects to be varied independently, and will present results from a preliminary study of such bursts in LFSM. The consequences for burst scaling when low frequency effects due to dissipation (FARIMA models) and multiplicative cascades (such as multifractals) are included will also be discussed, together with the physical assumptions and constraints associated with making a given choice of model.
Bivariate extreme value distributions
NASA Technical Reports Server (NTRS)
Elshamy, M.
1992-01-01
In certain engineering applications, such as those occurring in the analyses of ascent structural loads for the Space Transportation System (STS), some of the load variables have a lower bound of zero. Thus, the need for practical models of bivariate extreme value probability distribution functions with lower limits was identified. We discuss the Gumbel models and present practical forms of bivariate extreme probability distributions of Weibull and Frechet types with two parameters. Bivariate extreme value probability distribution functions can be expressed in terms of the marginal extremal distributions and a 'dependence' function subject to certain analytical conditions. Properties of such bivariate extreme distributions, sums and differences of paired extremals, as well as the corresponding forms of conditional distributions, are discussed. Practical estimation techniques are also given.
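The "dependence function" representation can be illustrated with the Gumbel logistic family, a standard one-dependence-parameter model with unit Fréchet margins; this is a textbook example chosen for illustration, not necessarily the parametrization used in the report:

```python
import numpy as np

def bivariate_logistic_cdf(x, y, alpha):
    """Gumbel logistic bivariate extreme value distribution, unit Frechet margins.

    G(x, y) = exp(-((1/x)^(1/alpha) + (1/y)^(1/alpha))^alpha), alpha in (0, 1];
    alpha = 1 gives independence, alpha -> 0 gives complete dependence.
    """
    return np.exp(-((1.0 / x) ** (1.0 / alpha) + (1.0 / y) ** (1.0 / alpha)) ** alpha)

# Independence check: at alpha = 1 the joint CDF factorizes into the margins
frechet = lambda z: np.exp(-1.0 / z)
x, y = 2.0, 3.0
print(bivariate_logistic_cdf(x, y, 1.0), frechet(x) * frechet(y))
```

At alpha = 1 the joint CDF factorizes into the product of the margins (independence); smaller alpha raises the joint CDF at any point, reflecting positive dependence between the paired extremals.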
NASA Astrophysics Data System (ADS)
Wehner, Michael; Stone, Dáithí; Mitchell, Dann; Shiogama, Hideo; Fischer, Erich; Graff, Lise S.; Kharin, Viatcheslav V.; Lierhammer, Ludwig; Sanderson, Benjamin; Krishnan, Harinarayan
2018-03-01
The Half a degree Additional warming, Prognosis and Projected Impacts (HAPPI) experimental protocol provides a multi-model database to compare the effects of stabilizing anthropogenic global warming at 1.5 °C over preindustrial levels versus 2.0 °C over these levels. The HAPPI experiment is based upon large ensembles of global atmospheric models forced by sea surface temperatures and sea ice concentrations plausible for these stabilization levels. This paper examines changes in extremes of high temperatures averaged over three consecutive days. Changes in this measure of extreme temperature are also compared to changes in hot season temperatures. We find that over land this measure of extreme high temperature increases from about 0.5 to 1.5 °C over present-day values in the 1.5 °C stabilization scenario, depending on location and model. We further find an additional 0.25 to 1.0 °C increase in extreme high temperatures over land in the 2.0 °C stabilization scenario. Results from the HAPPI models are consistent with similar results from the one available fully coupled climate model. However, a complicating factor in interpreting extreme temperature changes across the HAPPI models is their diversity of aerosol forcing changes.
Future Projection of Summer Extreme Precipitation from High Resolution Multi-RCMs over East Asia
NASA Astrophysics Data System (ADS)
Kim, Gayoung; Park, Changyong; Cha, Dong-Hyun; Lee, Dong-Kyou; Suh, Myoung-Seok; Ahn, Joong-Bae; Min, Seung-Ki; Hong, Song-You; Kang, Hyun-Suk
2017-04-01
Recently, the frequency and intensity of natural hazards have been increasing due to human-induced climate change. Because most damage from natural hazards over East Asia has been related to extreme precipitation events, it is important to estimate future changes in extreme precipitation characteristics caused by climate change. We investigate future changes in extreme values of summer precipitation simulated by five regional climate models participating in the CORDEX-East Asia project (i.e., HadGEM3-RA, RegCM4, MM5, WRF, and GRIMs) over East Asia. The 100-year return value calculated from the generalized extreme value (GEV) parameters is analysed as an indicator of extreme intensity. In the future climate, the mean values as well as the extreme values of daily precipitation tend to increase over land regions. The increase in the 100-year return value can be significantly associated with changes in the location (intensity) and scale (variability) parameters of the GEV distribution for extreme precipitation. The results of this study can serve as useful references for disaster-management policy. Acknowledgements: The research was supported by the Ministry of Public Safety and Security of the Korean government under grant MPSS-NH-2013-63 and by the National Research Foundation of Korea grant funded by the Ministry of Science, ICT and Future Planning of Korea (NRF-2016M3C4A7952637).
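The 100-year return value used as the intensity indicator follows from the GEV parameters in closed form, and the sketch below also shows how shifts in the location and scale parameters raise it. All parameter values are hypothetical:

```python
import numpy as np

def gev_return_level(T, mu, sigma, xi):
    """T-year return level of a GEV(mu, sigma, xi) annual-maximum distribution."""
    y = -np.log(1.0 - 1.0 / T)          # -log of the annual non-exceedance prob.
    if abs(xi) < 1e-9:
        return mu - sigma * np.log(y)   # Gumbel limit as xi -> 0
    return mu + sigma / xi * (y ** (-xi) - 1.0)

# Hypothetical present vs. future GEV parameters for daily summer rainfall (mm):
# a shift in location (intensity) and scale (variability) raises the 100-yr value
present = gev_return_level(100, mu=50.0, sigma=10.0, xi=0.1)
future = gev_return_level(100, mu=55.0, sigma=12.0, xi=0.1)
print(present, future)
```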
NASA Astrophysics Data System (ADS)
Hutterer, Victoria; Ramlau, Ronny
2018-03-01
The new generation of extremely large telescopes includes adaptive optics systems to correct for atmospheric blurring. In this paper, we present a new method of wavefront reconstruction from non-modulated pyramid wavefront sensor data. The approach is based on a simplified sensor model represented as the finite Hilbert transform of the incoming phase. Due to the non-compactness of the finite Hilbert transform operator, the classical theory for singular systems is not applicable. Nevertheless, we can express the Moore-Penrose inverse as a singular-value-type expansion with weighted Chebyshev polynomials.
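For reference, the finite Hilbert transform on which the simplified sensor model is built has the standard form below (a textbook definition on the normalized interval; the paper's aperture and scaling conventions may differ):

```latex
(T\phi)(x) \;=\; \frac{1}{\pi}\,\mathrm{p.v.}\!\int_{-1}^{1} \frac{\phi(t)}{t - x}\,\mathrm{d}t,
\qquad x \in (-1, 1).
```

Unlike the Hilbert transform on the whole real line, this operator on L^2(-1, 1) is bounded but not compact, which is why classical singular-system theory does not apply directly.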
Extremely cold events and sudden air temperature drops during winter season in the Czech Republic
NASA Astrophysics Data System (ADS)
Crhová, Lenka; Valeriánová, Anna; Holtanová, Eva; Müller, Miloslav; Kašpar, Marek; Stříž, Martin
2014-05-01
Today, great attention is paid to the analysis of extreme weather events and the frequency of their occurrence under a changing climate. In most cases, these studies focus on extremely warm events in the summer season. However, extremely low air temperatures during winter can have serious impacts on many sectors as well (e.g. power engineering, transportation, industry, agriculture, human health). Therefore, in the present contribution we focus on extremely and abnormally cold air temperature events in the winter season in the Czech Republic. Besides the seasonal extremes of minimum air temperature determined from station data, standardized data with the annual cycle removed are used as well. The distribution of extremely cold events over the season and the temporal evolution of their frequency of occurrence during the period 1961-2010 are analyzed. Furthermore, the connection of cold events with extreme sudden temperature drops is studied. The extreme air temperature events and events of extreme sudden temperature drop are assessed using the Weather Extremity Index, which evaluates the extremity (based on return periods) and spatial extent of the meteorological extreme event of interest. The generalized extreme value distribution parameters are used to estimate return periods of daily temperature values. The work has been supported by the grant P209/11/1990 funded by the Czech Science Foundation.
Spatiotemporal variability of extreme temperature frequency and amplitude in China
NASA Astrophysics Data System (ADS)
Zhang, Yuanjie; Gao, Zhiqiu; Pan, Zaitao; Li, Dan; Huang, Xinhui
2017-03-01
Temperature extremes in China are examined based on daily maximum and minimum temperatures from station observations and multiple global climate models. The magnitude and frequency of extremes are expressed in terms of return values and periods, respectively, estimated from the fitted Generalized Extreme Value (GEV) distribution of annual extreme temperatures. The observations suggest that changes in temperature extremes considerably exceed changes in the respective climatological means during the past five decades, with greater amplitude of increase in cold extremes than in warm extremes. The frequency of warm (cold) extremes increases (decreases) over most areas, at an increasingly fast rate as the extremity level rises. Changes in warm extremes are more dependent on the varying shape of the GEV distribution than on the location shift, whereas changes in cold extremes are more closely associated with the location shift. The models simulate the overall pattern of temperature extremes during 1961-1981 reasonably well in China, but they show a smaller asymmetry between changes in warm and cold extremes, primarily due to their underestimation of increases in cold extremes, especially over southern China. Projections from a high-emission scenario show that the multi-model median change in warm and cold extremes by 2040 relative to 1971 will be 2.6 °C and 2.8 °C, respectively, with the strongest changes in cold extremes shifting southward. By 2040, warm extremes at the 1971 20-year return values would occur about every three years, while the 1971 cold extremes would occur once in >500 years.
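The headline numbers (a 20-year warm extreme recurring about every three years after ~2.6 °C of warming) can be reproduced in kind from a GEV with a shifted location parameter. The parameter values below are hypothetical stand-ins, and scipy's sign convention c = -xi applies:

```python
from scipy.stats import genextreme

# Hypothetical annual-maximum temperature GEV (scipy's c = -xi convention)
c, loc, scale = 0.2, 38.0, 1.5          # c > 0: bounded warm tail

# 20-year return value under the baseline climate
z20 = genextreme.ppf(1 - 1 / 20, c=c, loc=loc, scale=scale)

# Under a +2.6 C location shift, the old 20-year event recurs far more often
T_new = 1.0 / (1.0 - genextreme.cdf(z20, c=c, loc=loc + 2.6, scale=scale))
print(z20, T_new)
```

With these placeholder parameters the old 20-year event recurs roughly every two to three years after the shift, the same order of change the abstract reports.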
Changes in US extreme sea levels and the role of large scale climate variations
NASA Astrophysics Data System (ADS)
Wahl, T.; Chambers, D. P.
2015-12-01
We analyze a set of 20 tide gauge records covering the contiguous United States (US) coastline and the period from 1929 to 2013 to identify long-term trends and multi-decadal variations in extreme sea levels (ESLs) relative to changes in mean sea level (MSL). Significant but small long-term trends in ESLs above/below MSL are found at individual sites along most coastline stretches, but are mostly confined to the southeast coast and the winter season, when storm surges are primarily driven by extra-tropical cyclones. We identify six regions with broadly coherent and considerable multi-decadal ESL variations unrelated to MSL changes. Using a quasi-non-stationary extreme value analysis approach, we show that the latter would have caused variations in design-relevant return water levels (RWLs; 50- to 200-year return periods) ranging from ~10 cm to as much as 110 cm across the six regions. To explore the origin of these temporal changes and the role of large-scale climate variability, we develop different sets of simple and multiple linear regression models with RWLs as dependent variables and with climate indices (or versions of them tailored toward predicting multi-decadal RWL changes) and wind stress curl as independent predictors. The models, after being tested for spatial and temporal stability, explain up to 97% of the observed variability at individual sites and almost 80% on average. Using the model predictions as covariates in the quasi-non-stationary extreme value analysis also significantly reduces the range of change in the 100-year RWLs over time, turning a non-stationary process into a stationary one. This highlights that the models - when used with regional and global climate model output of the predictors - should also be capable of projecting future RWL changes, to be used by decision makers for improved flood preparedness and long-term resiliency.
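The regression step, RWL variations explained by climate indices and wind stress curl, is ordinary multiple linear regression; the sketch uses fabricated predictor and RWL series purely to show the mechanics and the R² computation:

```python
import numpy as np

rng = np.random.default_rng(6)
# Hypothetical multi-decadal series: a climate index and wind stress curl
# driving variations in a 100-year return water level (RWL, in cm)
n = 60
index = rng.normal(size=n)
curl = rng.normal(size=n)
rwl = 120 + 25 * index + 10 * curl + rng.normal(0, 5, n)

# Multiple linear regression RWL ~ index + curl via least squares
X = np.column_stack([np.ones(n), index, curl])
beta, *_ = np.linalg.lstsq(X, rwl, rcond=None)
resid = rwl - X @ beta
r2 = 1 - resid.var() / rwl.var()     # fraction of RWL variability explained
print(r2)
```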
Identifying and Clarifying Organizational Values.
ERIC Educational Resources Information Center
Seevers, Brenda S.
2000-01-01
Of the 14 organizational values ranked by a majority of 146 New Mexico Cooperative Extension educators as extremely valued, 9 were extremely evident in organizational policies and procedures. A values audit such as this forms an important initial step in strategic planning. (SK)
Applications of Extreme Value Theory in Public Health.
Thomas, Maud; Lemaitre, Magali; Wilson, Mark L; Viboud, Cécile; Yordanov, Youri; Wackernagel, Hans; Carrat, Fabrice
2016-01-01
We present how Extreme Value Theory (EVT) can be used in public health to predict future extreme events. We applied EVT to weekly rates of Pneumonia and Influenza (P&I) deaths over 1979-2011. We further explored the daily number of emergency department visits in a network of 37 hospitals over 2004-2014. Maxima of grouped consecutive observations were fitted to a generalized extreme value distribution. The distribution was used to estimate the probability of extreme values in specified time periods. An annual P&I death rate of 12 per 100,000 (the highest maximum observed) should be exceeded once over the next 30 years, and each year there should be a 3% risk that the P&I death rate will exceed this value. Over the past 10 years, the observed maximum increase in the daily number of visits from the same weekday between two consecutive weeks was 1133. We estimated the probability of exceeding a daily increase of 1000 in any given month at 0.37%. The EVT method can be applied to various topics in epidemiology, thus contributing to public health planning for extreme events.
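The link the abstract draws between a 3% annual exceedance risk and one expected exceedance over 30 years is elementary probability, sketched here with the abstract's own figures:

```python
# A 3% risk each year that the P&I death rate exceeds the historical maximum
p_year = 0.03
n_years = 30

# Expected number of exceedances over 30 years (~ "exceeded once")
expected_exceedances = p_year * n_years

# Probability of at least one exceedance in 30 years, assuming independent years
p_at_least_one = 1 - (1 - p_year) ** n_years
print(expected_exceedances, p_at_least_one)
```

The independence assumption across years is an idealization; clustering of severe epidemic years would lower the at-least-one probability for the same annual risk.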
Modelling probabilities of heavy precipitation by regional approaches
NASA Astrophysics Data System (ADS)
Gaal, L.; Kysely, J.
2009-09-01
Extreme precipitation events are associated with large negative consequences for human society, mainly because they may trigger floods and landslides. The recent series of flash floods in central Europe (affecting several isolated areas) on June 24-28, 2009, the worst in the Czech Republic for several decades in terms of the number of people killed and the extent of damage to buildings and infrastructure, is an example. Estimates of growth curves and design values (corresponding e.g. to 50-yr and 100-yr return periods) of precipitation amounts, together with their uncertainty, are important in hydrological modelling and other applications. The interest in high quantiles of precipitation distributions is also related to possible climate change effects, as climate model simulations tend to project increased severity of precipitation extremes in a warmer climate. The present study compares - in terms of Monte Carlo simulation experiments - several methods for modelling probabilities of precipitation extremes that make use of ‘regional approaches’: the estimation of distributions of extremes takes into account data in a ‘region’ (‘pooling group’) in which one may assume that the distributions at individual sites are identical apart from a site-specific scaling factor (a condition referred to as ‘regional homogeneity’). In other words, all data in a region - often weighted in some way - are taken into account when estimating the probability distribution of extremes at a given site. The advantage is that sampling variations in the estimates of model parameters and high quantiles are reduced to a large extent compared to single-site analysis. We focus on the ‘region-of-influence’ (ROI) method, which is based on the identification of unique pooling groups (forming the database for the estimation) for each site under study. The similarity of sites is evaluated in terms of a set of site attributes related to the distributions of extremes. 
The issue of the size of the region is linked with a built-in test on regional homogeneity of data. Once a pooling group is delineated, weights based on a dissimilarity measure are assigned to individual sites involved in a pooling group, and all (weighted) data are employed in the estimation of model parameters and high quantiles at a given location. The ROI method is compared with the Hosking-Wallis (HW) regional frequency analysis, which is based on delineating fixed regions (instead of flexible pooling groups) and assigning unit weights to all sites in a region. The comparison of the performance of the individual regional models makes use of data on annual maxima of 1-day precipitation amounts at 209 stations covering the Czech Republic, with altitudes ranging from 150 to 1490 m a.s.l. We conclude that the ROI methodology is superior to the HW analysis, particularly for very high quantiles (100-yr return values). Another advantage of the ROI approach is that subjective decisions - unavoidable when fixed regions in the HW analysis are formed - may efficiently be suppressed, and almost all settings of the ROI method may be justified by results of the simulation experiments. The differences between (any) regional method and single-site analysis are very pronounced and suggest that the at-site estimation is highly unreliable. The ROI method is then applied to estimate high quantiles of precipitation amounts at individual sites. The estimates and their uncertainty are compared with those from a single-site analysis. We focus on the eastern part of the Czech Republic, i.e. an area with complex orography and a particularly pronounced role of Mediterranean cyclones in producing precipitation extremes. The design values are compared with precipitation amounts recorded during the recent heavy precipitation events, including the one associated with the flash flood on June 24, 2009. 
We also show that the ROI methodology may easily be transferred to the analysis of precipitation extremes in climate model outputs. It efficiently reduces (random) variations in the estimates of parameters of the extreme value distributions in individual gridboxes that result from large spatial variability of heavy precipitation, and represents a straightforward tool for ‘weighting’ data from neighbouring gridboxes within the estimation procedure. The study is supported by the Grant Agency of AS CR under project B300420801.
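The index-flood idea behind such regional pooling can be sketched on synthetic data: site maxima are assumed to share one growth curve up to a site-specific scale factor, so normalised maxima from the whole pooling group can be fitted jointly. A minimal Python sketch, where the site count, index values and GEV parameters are invented for illustration:

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(42)

# Synthetic annual maxima at 5 sites sharing one growth curve,
# differing only by a site-specific scale (the "index flood").
n_years, shape = 60, -0.1          # scipy's shape c (negative = heavy tail)
site_index = np.array([20.0, 35.0, 50.0, 65.0, 80.0])
data = {i: idx * genextreme.rvs(shape, loc=1.0, scale=0.3,
                                size=n_years, random_state=rng)
        for i, idx in enumerate(site_index)}

# Regional (pooled) estimation: rescale each site by its sample mean,
# pool all normalised maxima, fit one GEV growth curve.
pooled = np.concatenate([x / x.mean() for x in data.values()])
c, loc, scale = genextreme.fit(pooled)

# 100-yr return level at site 0 = site index * regional growth factor.
growth_100 = genextreme.ppf(1 - 1 / 100, c, loc, scale)
rl_site0 = data[0].mean() * growth_100
print(round(rl_site0, 1))
```

With equal weights this is closer in spirit to the HW scheme; the ROI method would additionally form a site-specific pooling group and weight sites by dissimilarity.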
NASA Astrophysics Data System (ADS)
Smith, N.; Sandal, G. M.; Leon, G. R.; Kjærgaard, A.
2017-08-01
Land-based extreme environments (e.g. polar expeditions, Antarctic research stations, confinement chambers) have often been used as analog settings for spaceflight. These settings share similarities with the conditions experienced during space missions, including confinement, isolation, and limited possibilities for evacuation. To determine the utility of analog settings for understanding human spaceflight, researchers have examined the extent to which the individual characteristics (e.g., personality) of people operating in extreme environments can be generalized across contexts (Sandal, 2000) [1]. Building on previous work, and utilising new and pre-existing data, the present study examined the extent to which personal value motives could be generalized across extreme environments. Four populations were assessed: mountaineers (N = 59), military personnel (N = 25), Antarctic over-winterers (N = 21), and Mars simulation participants (N = 12). All participants completed the Portrait Values Questionnaire (PVQ; Schwartz [2]), capturing information on 10 personal values. Rank scores suggest that all groups identified Self-direction, Stimulation, Universalism and Benevolence as important values and acknowledged Power and Tradition as low priorities. Results from difference testing suggest the extreme environment groups were most comparable on Self-direction, Stimulation, Benevolence, Tradition and Security. There were significant between-group differences on five of the ten values. Overall, the findings pinpointed specific values that may be important for functioning in challenging environments. However, the differences that emerged on certain values highlight the importance of considering the specific population when comparing results across extreme settings. We recommend that further research examine the impact of personal value motives on indicators of adjustment, group working, and performance. 
Information from such studies could then be used to aid selection and training processes for personnel operating in extreme settings, and in space.
NASA Astrophysics Data System (ADS)
Amor, T. A.; Russo, R.; Diez, I.; Bharath, P.; Zirovich, M.; Stramaglia, S.; Cortes, J. M.; de Arcangelis, L.; Chialvo, D. R.
2015-09-01
The brain exhibits a wide variety of spatiotemporal patterns of neuronal activity, recorded using functional magnetic resonance imaging as the so-called blood-oxygen-level-dependent (BOLD) signal. An active area of work concerns how best to describe the plethora of such patterns evolving continuously in the brain. Here we explore the third-moment statistics of brain BOLD signals in the resting state as a proxy to capture extreme BOLD events. We find that the brain signal typically exhibits nonzero skewness, with positive values for cortical regions and negative values for subcortical regions. Furthermore, the combined analysis of structural and functional connectivity demonstrates that relatively more connected regions exhibit activity with high negative skewness. Overall, these results highlight the relevance of recent results emphasizing that the spatiotemporal location of the relatively large-amplitude events in the BOLD time series contains relevant information to reproduce a number of features of the brain dynamics during resting state in health and disease.
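Skewness as a proxy for rare large-amplitude events can be illustrated on synthetic series (the signal model below is invented, not the authors' data): rare positive bursts push the third moment positive, rare negative bursts push it negative.

```python
import numpy as np
from scipy.stats import skew

rng = np.random.default_rng(0)
n = 10_000

# Synthetic "BOLD-like" series: baseline noise plus rare large events.
baseline = rng.normal(0.0, 1.0, n)
events = np.zeros(n)
events[rng.choice(n, 50, replace=False)] = rng.uniform(4, 8, 50)

positive_bursts = baseline + events   # rare large positive excursions
negative_bursts = baseline - events   # rare large negative excursions

print(skew(positive_bursts) > 0)  # → True
print(skew(negative_bursts) < 0)  # → True
```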
Spatial dependence of extreme rainfall
NASA Astrophysics Data System (ADS)
Radi, Noor Fadhilah Ahmad; Zakaria, Roslinazairimah; Satari, Siti Zanariah; Azman, Muhammad Az-zuhri
2017-05-01
This study aims to model the spatial extreme daily rainfall process using max-stable models, which capture the dependence structure of the spatial properties of extreme rainfall. Three max-stable models are considered, namely the Smith, Schlather and Brown-Resnick models. The methods are applied to 12 selected rainfall stations in Kelantan, Malaysia. Most of the extreme rainfall data occur during the wet season, from October to December of 1971 to 2012. This period is chosen to ensure that enough data are available to satisfy the assumption of stationarity. The dependence parameters, including the range and smoothness, are estimated using a composite likelihood approach. A bootstrap approach is then applied to generate synthetic extreme rainfall data for all models using the estimated dependence parameters. The goodness of fit between the observed extreme rainfall and the synthetic data is assessed using the composite likelihood information criterion (CLIC). Results show that the Schlather model is the best, followed by the Brown-Resnick and Smith models, based on the smallest CLIC value. Thus, max-stable models are suitable for modelling extreme rainfall in Kelantan. The study of spatial dependence in extreme rainfall modelling is important for reducing the uncertainty of point estimates of the tail index; if the spatial dependency is estimated individually at each site, the uncertainties will be large. Furthermore, when the joint return level is of interest, taking the spatial dependence properties into account improves the estimation process.
Extreme event statistics in a drifting Markov chain
NASA Astrophysics Data System (ADS)
Kindermann, Farina; Hohmann, Michael; Lausch, Tobias; Mayer, Daniel; Schmidt, Felix; Widera, Artur
2017-07-01
We analyze the extreme event statistics of experimentally realized Markov chains with various drifts. Our Markov chains are individual trajectories of a single atom diffusing in a one-dimensional periodic potential. Based on more than 500 individual atomic traces, we verify the applicability of the Sparre Andersen theorem to our system despite the presence of a drift. We present a detailed analysis of four different rare-event statistics for our system: the distributions of extreme values, of record values, of extreme value occurrence in the chain, and of the number of records in the chain. We observe that, for our data, the shape of the extreme event distributions is dominated by the underlying exponential distance distribution extracted from the atomic traces. Furthermore, we find that even small drifts influence the statistics of extreme events and record values, which is supported by numerical simulations, and we identify cases in which the drift can be determined without information about the underlying random variable distributions. Our results facilitate the use of extreme event statistics as a signal for small drifts in correlated trajectories.
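The Sparre Andersen theorem referenced above fixes the survival (no zero-crossing) probability of a driftless random walk independently of the jump distribution. A quick Monte Carlo check on synthetic walks (not the atomic traces) for two different continuous symmetric jump laws:

```python
import numpy as np
from math import comb

rng = np.random.default_rng(1)
n_steps, n_walks = 4, 200_000

# Any continuous symmetric jump distribution gives the same survival
# probability -- here we try two.
for jumps in (rng.normal(size=(n_walks, n_steps)),
              rng.uniform(-1, 1, size=(n_walks, n_steps))):
    paths = jumps.cumsum(axis=1)
    survival = (paths > 0).all(axis=1).mean()
    print(round(survival, 3))

# Sparre Andersen prediction, independent of the jump law:
print(round(comb(2 * n_steps, n_steps) / 4**n_steps, 3))  # → 0.273
```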
Risk assessment of precipitation extremes in northern Xinjiang, China
NASA Astrophysics Data System (ADS)
Yang, Jun; Pei, Ying; Zhang, Yanwei; Ge, Quansheng
2018-05-01
This study was conducted using daily precipitation records gathered at 37 meteorological stations in northern Xinjiang, China, from 1961 to 2010. We used extreme value theory, fitting both the generalized extreme value (GEV) and generalized Pareto (GPD) statistical distributions to precipitation extremes with different return periods, to estimate the risks of precipitation extremes and to diagnose aridity-humidity environmental variation and the corresponding spatial patterns in northern Xinjiang. Spatiotemporal patterns of daily maximum precipitation showed that the aridity-humidity conditions of northern Xinjiang are well represented by the return periods of the precipitation data. Indices of daily maximum precipitation were effective in predicting floods in the study area. By analyzing projections of daily maximum precipitation for return periods of 2, 5, 10, 30, 50, and 100 years, we conclude that flood risk will gradually increase in northern Xinjiang. GEV modeling yielded the best results and proved particularly valuable: in an example analysis of extreme precipitation models, the GEV statistical model was superior in reproducing extreme precipitation. The GPD model results reflect annual precipitation. For most sites, the estimated 2- and 5-year return levels of precipitation from the GPD were slightly greater than those from the GEV. The study found that extreme precipitation exceeding a certain limit value will cause a flood disaster; predicting future extreme precipitation may therefore aid flood warnings. A suitable policy for effective water resource management is thus urgently required.
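A return-level calculation of the kind described can be sketched with scipy on invented annual maxima: a GEV is fitted, and the T-year level is the quantile with annual exceedance probability 1/T.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(7)

# Hypothetical 50-year series of annual daily-maximum precipitation (mm).
annual_max = genextreme.rvs(-0.12, loc=25, scale=8, size=50, random_state=rng)

c, loc, scale = genextreme.fit(annual_max)

# T-year return level = quantile with annual exceedance probability 1/T.
return_periods = (2, 5, 10, 30, 50, 100)
levels = [genextreme.ppf(1 - 1 / T, c, loc, scale) for T in return_periods]
for T, level in zip(return_periods, levels):
    print(f"{T:>3}-yr return level: {level:.1f} mm")
```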
NASA Astrophysics Data System (ADS)
Sun, Qiaohong; Miao, Chiyuan; Qiao, Yuanyuan; Duan, Qingyun
2017-12-01
The El Niño-Southern Oscillation (ENSO) and local temperature are important drivers of extreme precipitation. Understanding the impact of ENSO and temperature on the risk of extreme precipitation over global land will provide a foundation for risk assessment and climate-adaptive design of infrastructure in a changing climate. In this study, nonstationary generalized extreme value distributions were used to model extreme precipitation over global land for the period 1979-2015, with an ENSO indicator and temperature as covariates. Risk factors were estimated to quantify the contrast between the influence of different ENSO phases and temperature. The results show that extreme precipitation is dominated by ENSO over 22% of global land and by temperature over 26% of global land. With a warming climate, the risk of high-intensity daily extreme precipitation increases at high latitudes but decreases in tropical regions. For ENSO, large parts of North America, southern South America, and southeastern and northeastern China are shown to suffer greater risk in El Niño years, with more than double the chance of intense extreme precipitation in El Niño years compared with La Niña years. Moreover, regions with more intense precipitation are more sensitive to ENSO. Global climate models were used to investigate the changing relationship between extreme precipitation and the covariates. The risk of extreme, high-intensity precipitation increases across high latitudes of the Northern Hemisphere but decreases in middle and lower latitudes under a warming climate scenario, and will likely trigger increases in severe flooding and droughts across the globe. However, there is some uncertainty associated with the influence of ENSO on predictions of future extreme precipitation, with the spatial extent and risk varying among the different models.
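A minimal nonstationary GEV of the kind described can be fitted by maximum likelihood with the location parameter linear in a covariate; the data, covariate and parameter values below are invented for illustration.

```python
import numpy as np
from scipy.stats import genextreme
from scipy.optimize import minimize

rng = np.random.default_rng(3)
n = 200

# Hypothetical covariate (e.g. an ENSO index) and precipitation maxima
# whose GEV location shifts linearly with it.
x = rng.normal(size=n)
y = genextreme.rvs(-0.1, loc=30 + 4 * x, scale=5, size=n, random_state=rng)

def nll(p):
    """Negative log-likelihood with location = b0 + b1 * covariate."""
    b0, b1, log_scale, c = p
    return -genextreme.logpdf(y, c, loc=b0 + b1 * x,
                              scale=np.exp(log_scale)).sum()

fit = minimize(nll, x0=[y.mean(), 0.0, np.log(y.std()), -0.1],
               method="Nelder-Mead")
b0, b1, log_scale, c = fit.x
print(round(b1, 1))  # slope of location in the covariate, close to 4
```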
DOE Office of Scientific and Technical Information (OSTI.GOV)
Werth, D.; Shine, G.
Recent data sets for three meteorological phenomena with the potential to inflict damage on SRS facilities - tornadoes, straight winds, and heavy precipitation - are analyzed using appropriate statistical techniques to estimate occurrence probabilities for these events in the future. Summaries of the results for DOE-mandated return periods and comparisons to similar calculations performed in 1998 by Weber et al. are given. Using tornado statistics for the states of Georgia and South Carolina, we calculated the probability per year of any location within a 2° square area surrounding SRS being struck by a tornado (the ‘strike’ probability) and the probability that any point will experience winds above set thresholds. The strike probability was calculated to be 1.15E-3 (1 chance in 870) per year, and wind speeds for DOE-mandated return periods of 50,000 years, 125,000 years, and 1E+7 years (USDOE, 2012) were estimated to be 136 mph, 151 mph and 221 mph, respectively. In 1998 the strike probability for SRS was estimated to be 3.53E-4, and the return period wind speeds were 148 mph for 50,000 years and 180 mph for 125,000 years. A 1E+7-year tornado wind speed was not calculated in 1998; however, the 3E+6-year wind speed was 260 mph. The lower wind speeds resulting from this most recent analysis are largely due to new data since 1998 and, to a lesser degree, to differences in the models used. By contrast, default tornado wind speeds taken from ANSI/ANS-2.3-2011 are somewhat higher: 161 mph for a return period of 50,000 years, 173 mph for 125,000 years, and 230 mph for 1E+7 years (ANS, 2011). Although the ANS model and the SRS models are very similar, the region defined in ANS 2.3 that encompasses the SRS also includes areas of the Great Plains and lower Midwest, regions with much higher occurrence frequencies of strong tornadoes. 
The SRS straight-wind values associated with various return periods were calculated by fitting existing wind data to a Gumbel distribution and extrapolating the values for any return period from the tail of that function. For the DOE-mandated return periods, we expect straight winds of 123 mph every 2500 years and 132 mph every 6250 years at any point within the SRS. These values are similar to those from the W98 report (which also used the Gumbel distribution for wind speeds), which gave wind speeds of 115 mph and 122 mph for return periods of 2500 years and 6250 years, respectively. For the extreme precipitation accumulation periods, we compared the fits of three different theoretical extreme-value distributions and in the end decided to retain the Gumbel distribution for each period. The DOE-mandated 6-hr accumulated rainfall for return periods of 2500 years and 6250 years was estimated as 7.8 inches and 8.4 inches, respectively. For the 24-hr rainfall return periods of 10,000 years and 25,000 years, total rainfall estimates were 10.4 inches and 11.1 inches, respectively. These values are substantially lower than the comparable values provided in the W98 report, largely a consequence of W98's use of a different extreme value distribution with correspondingly higher extreme probabilities.
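The Gumbel-based extrapolation used for the straight winds can be sketched as follows; the wind data here are synthetic, not SRS observations.

```python
import numpy as np
from scipy.stats import gumbel_r

rng = np.random.default_rng(11)

# Hypothetical annual maximum straight-wind speeds (mph), 60 years.
annual_max_wind = gumbel_r.rvs(loc=55, scale=9, size=60, random_state=rng)

loc, scale = gumbel_r.fit(annual_max_wind)

# Extrapolate into the tail: T-yr wind = loc - scale*ln(-ln(1 - 1/T)).
return_periods = (100, 2500, 6250)
levels = [gumbel_r.ppf(1 - 1 / T, loc, scale) for T in return_periods]
for T, level in zip(return_periods, levels):
    print(f"{T:>5}-yr wind: {level:.0f} mph")
```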
The application of the statistical theory of extreme values to gust-load problems
NASA Technical Reports Server (NTRS)
Press, Harry
1950-01-01
An analysis is presented which indicates that the statistical theory of extreme values is applicable to the problems of predicting the frequency of encountering the larger gust loads and gust velocities, for specific test conditions as well as for commercial transport operations. The extreme-value theory provides an analytic form for the distributions of maximum values of gust load and velocity. Methods of fitting the distribution are given, along with a method of estimating the reliability of the predictions. The theory of extreme values is applied to available load data from commercial transport operations. The results indicate that the estimates of the frequency of encountering the larger loads are more consistent with the data and more reliable than those obtained in previous analyses.
NASA Astrophysics Data System (ADS)
Rieder, Harald E.; Staehelin, Johannes; Maeder, Jörg A.; Peter, Thomas; Ribatet, Mathieu; Davison, Anthony C.; Stübi, Rene; Weihs, Philipp; Holawe, Franz
2010-05-01
In this study, tools from extreme value theory (e.g. Coles, 2001; Ribatet, 2007) are applied for the first time in the field of stratospheric ozone research, as statistical analysis showed that previously used concepts assuming a Gaussian distribution of total ozone data (e.g. fixed deviations from mean values) do not adequately address the internal data structure concerning extremes. The study illustrates that tools based on extreme value theory are appropriate to identify ozone extremes and to describe the tails of the world's longest total ozone record (Arosa, Switzerland - for details see Staehelin et al., 1998a,b) (Rieder et al., 2010a). A daily moving threshold was implemented to account for the seasonal cycle in total ozone. The frequency of days with extreme low (termed ELOs) and extreme high (termed EHOs) total ozone, and the influence of these on mean values and trends, is analyzed for the Arosa total ozone time series. The results show (a) an increase in ELOs and (b) a decrease in EHOs during the last decades, and (c) that the overall trend during the 1970s and 1980s in total ozone is strongly dominated by changes in these extreme events. After removing the extremes, the time series shows a strongly reduced trend (a reduction by a factor of 2.5 for the trend in the annual mean). Furthermore, it is shown that the fitted model represents the tails of the total ozone data set with very high accuracy over the entire range (including absolute monthly minima and maxima). The frequency distribution of ozone mini-holes (using constant thresholds) can also be calculated with high accuracy. Analyzing the tails instead of a small fraction of days below constant thresholds provides deeper insight into time series properties. Excursions in the frequency of extreme events reveal "fingerprints" of dynamical factors, such as ENSO or NAO, and chemical factors, such as cold Arctic vortex ozone losses, as well as major volcanic eruptions of the 20th century (e.g. 
Gunung Agung, El Chichón, Mt. Pinatubo). Furthermore, the atmospheric loading of ozone-depleting substances leads to a continuous modification of column ozone in the northern hemisphere, also with respect to extreme values (partly again in connection with polar vortex contributions). It is shown that application of extreme value theory allows the identification of many more such fingerprints than conventional time series analysis of annual and seasonal mean values. In particular, the analysis shows the strong influence of dynamics, revealing that even moderate ENSO and NAO events have a discernible effect on total ozone (Rieder et al., 2010b). Overall, the presented new extremes concept provides new information on time series properties, variability, trends and the influence of dynamics and chemistry, complementing earlier analyses focusing only on monthly (or annual) mean values. References: Coles, S.: An Introduction to Statistical Modeling of Extreme Values, Springer Series in Statistics, ISBN:1852334592, Springer, Berlin, 2001. Ribatet, M.: POT: Modelling peaks over a threshold, R News, 7, 34-36, 2007. Rieder, H.E., Staehelin, J., Maeder, J.A., Ribatet, M., Stübi, R., Weihs, P., Holawe, F., Peter, T., and Davison, A.C. (2010): Extreme events in total ozone over Arosa - Part I: Application of extreme value theory, to be submitted to ACPD. Rieder, H.E., Staehelin, J., Maeder, J.A., Ribatet, M., Stübi, R., Weihs, P., Holawe, F., Peter, T., and Davison, A.C. (2010): Extreme events in total ozone over Arosa - Part II: Fingerprints of atmospheric dynamics and chemistry and effects on mean values and long-term changes, to be submitted to ACPD. Staehelin, J., Renaud, A., Bader, J., McPeters, R., Viatte, P., Hoegger, B., Bugnion, V., Giroud, M., and Schill, H.: Total ozone series at Arosa (Switzerland): Homogenization and data comparison, J. Geophys. Res., 103(D5), 5827-5842, doi:10.1029/97JD02402, 1998a. Staehelin, J., Kegel, R., and Harris, N.R.: Trend analysis of the homogenized total ozone series of Arosa (Switzerland), 1929-1996, J. Geophys. Res., 103(D7), 8389-8400, doi:10.1029/97JD03650, 1998b.
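The peaks-over-threshold machinery underlying such analyses can be sketched with a fixed high threshold (the study itself uses a daily moving threshold, and the data below are synthetic):

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(5)

# Hypothetical daily total-ozone anomalies (DU); a fixed high threshold
# stands in for the study's seasonally moving threshold.
daily = rng.normal(0, 15, 20_000)
u = np.quantile(daily, 0.98)
excess = daily[daily > u] - u            # excesses over the threshold

c, loc, scale = genpareto.fit(excess, floc=0)

# Exceedance probability of a level z > u:
# Pr(X > z) = Pr(X > u) * Pr(excess > z - u).
z = u + 15
p = (excess.size / daily.size) * genpareto.sf(z - u, c, loc=0, scale=scale)
print(f"Pr(anomaly > {z:.0f} DU) ~ {p:.1e}")
```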
NASA Technical Reports Server (NTRS)
Chao, Luen-Yuan; Shetty, Dinesh K.
1992-01-01
Statistical analysis and correlation between pore-size distribution and fracture-strength distribution using the theory of extreme-value statistics are presented for a sintered silicon nitride. The pore-size distribution on a polished surface of this material was characterized using an automatic optical image analyzer. The distribution measured on the two-dimensional plane surface was transformed to a population (volume) distribution using the Schwartz-Saltykov diameter method. The population pore-size distribution and the distribution of the pore size at the fracture origin were correlated by extreme-value statistics. The fracture-strength distribution was then predicted from the extreme-value pore-size distribution, using a linear elastic fracture mechanics model of an annular crack around a pore and the fracture toughness of the ceramic. The predicted strength distribution was in good agreement with strength measurements in bending. In particular, the extreme-value statistics analysis explained the nonlinear trend in the linearized Weibull plot of measured strengths without postulating a lower-bound strength.
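The extreme-value step, treating the largest pore in the stressed volume as the strength-controlling flaw and converting it to strength via linear elastic fracture mechanics, can be sketched as follows. The pore-size distribution, toughness and geometry factor are invented, and the simple penny-crack-style relation sigma = K_Ic / (Y * sqrt(pi * a)) stands in for the paper's annular-crack model.

```python
import numpy as np

rng = np.random.default_rng(9)

# Hypothetical lognormal pore-radius population (metres) in the stressed
# volume; the fracture-controlling pore is the largest of n_pores pores.
n_pores, n_samples = 10_000, 2_000
pores = rng.lognormal(mean=np.log(2e-6), sigma=0.4,
                      size=(n_samples, n_pores))
largest = pores.max(axis=1)              # extreme-value sample per specimen

# LEFM strength for a crack of size a: sigma = K_Ic / (Y * sqrt(pi * a)).
K_Ic, Y = 5.0e6, 1.13                    # Pa*sqrt(m); assumed geometry factor
strength = K_Ic / (Y * np.sqrt(np.pi * largest))

print(f"median strength: {np.median(strength) / 1e6:.0f} MPa")
```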
On alternative q-Weibull and q-extreme value distributions: Properties and applications
NASA Astrophysics Data System (ADS)
Zhang, Fode; Ng, Hon Keung Tony; Shi, Yimin
2018-01-01
Tsallis statistics and Tsallis distributions have been attracting a significant amount of research work in recent years. Importantly, Tsallis q-distributions have been applied in different disciplines. Yet, a relationship between existing q-Weibull and q-extreme value distributions parallel to the well-established relationship between the conventional Weibull and extreme value distributions through a logarithmic transformation had not been established. In this paper, we propose an alternative q-Weibull distribution that leads to a q-extreme value distribution via the q-logarithm transformation. Some important properties of the proposed q-Weibull and q-extreme value distributions are studied. Maximum likelihood and least squares estimation methods are used to estimate the parameters of the q-Weibull distribution, and their performance is investigated through a Monte Carlo simulation study. The methodologies and the usefulness of the proposed distributions are illustrated by fitting 2014 traffic fatality data from the National Highway Traffic Safety Administration.
Extreme values and the level-crossing problem: An application to the Feller process
NASA Astrophysics Data System (ADS)
Masoliver, Jaume
2014-04-01
We review the question of the extreme values attained by a random process. We relate it to level crossings of a single boundary (first-passage problems) as well as of two boundaries (escape problems). The extremes studied are the maximum, the minimum, the maximum absolute value, and the range or span. We specialize to diffusion processes and present detailed results for the Wiener and Feller processes.
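For the Wiener process, the distribution of the maximum follows from the reflection principle: P(max on [0, t] <= m) = erf(m / sqrt(2t)). A quick numerical check (discretization slightly biases the simulated value upward):

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(4)

# Maximum of a Wiener process on [0, t] by the reflection principle:
# P(max <= m) = erf(m / sqrt(2 t)).
t, n_steps, n_paths = 1.0, 400, 10_000
dt = t / n_steps
increments = rng.normal(0, sqrt(dt), size=(n_paths, n_steps))
running_max = increments.cumsum(axis=1).max(axis=1)

m = 1.0
print(round((running_max <= m).mean(), 2))   # simulated
print(round(erf(m / sqrt(2 * t)), 2))        # → 0.68
```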
NASA Technical Reports Server (NTRS)
Ko, William L.; Lung, Shun-Fat
2017-01-01
Non-classical stress concentration behavior in a stretched circular hyperelastic sheet (outer radius b = 10 in., thickness t = 0.0625 in.) containing a central hole (radius a = 0.5 in.) was analyzed. The hyperelastic sheet was subjected to different levels of remote radial stretching. Nastran large-strain, large-deformation analysis and the Blatz-Ko large-deformation theory were used to calculate the equal-biaxial stress concentration factors K. The results show that the values of K calculated from the Blatz-Ko theory and from Nastran are extremely close. Unlike classical linear elasticity theory, which gives the constant K = 2 for the equal-biaxial stress field, the hyperelastic K values were found to increase with increased stretching and can exceed K = 6 at a remote radial extension ratio of 2.35. The present K values compare fairly well with those obtained in previous works. The effect of hole size on the K values was investigated: the values of K start to decrease from a hole radius of a = 0.125 in., reaching K = 1 (no stress concentration) as a shrinks to 0 in. (no hole). The newly introduced stretch and strain magnification factors {K_λ, K_e} are also material- and deformation-dependent, increasing from the linear-theory levels of {1.0, 4.0} to {3.07, 4.61}, respectively, at a remote radial extension ratio of 2.35.
Efficient bootstrap estimates for tail statistics
NASA Astrophysics Data System (ADS)
Breivik, Øyvind; Aarnes, Ole Johan
2017-03-01
Bootstrap resamples can be used to investigate the tail of empirical distributions as well as return value estimates from the extremal behaviour of the sample. Specifically, the confidence intervals on return value estimates or bounds on in-sample tail statistics can be obtained using bootstrap techniques. However, non-parametric bootstrapping from the entire sample is expensive. It is shown here that it suffices to bootstrap from a small subset consisting of the highest entries in the sequence to make estimates that are essentially identical to bootstraps from the entire sample. Similarly, bootstrap estimates of confidence intervals of threshold return estimates are found to be well approximated by using a subset consisting of the highest entries. This has practical consequences in fields such as meteorology, oceanography and hydrology where return values are calculated from very large gridded model integrations spanning decades at high temporal resolution or from large ensembles of independent and identically distributed model fields. In such cases the computational savings are substantial.
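The subset idea can be illustrated directly: for a high quantile, bootstrapping only the top 1% of the sorted sample (and taking the corresponding within-subset quantile) gives confidence intervals close to a full-sample bootstrap. The data and quantile below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(6)

n = 100_000
sample = rng.exponential(size=n)          # hypothetical data
q = 0.999                                 # target quantile

# Full-sample bootstrap of the 99.9th percentile.
full_ci = np.percentile(
    [np.quantile(rng.choice(sample, n), q) for _ in range(200)], [2.5, 97.5])

# Subset bootstrap: keep only the top 1%; the 99.9th percentile of the
# full sample corresponds to the 90th percentile within this tail subset.
top = np.sort(sample)[-n // 100:]
m = top.size
sub_ci = np.percentile(
    [np.quantile(rng.choice(top, m), 0.90) for _ in range(200)], [2.5, 97.5])

print(np.round(full_ci, 2), np.round(sub_ci, 2))
```

The subset bootstrap resamples 1,000 values instead of 100,000 per replicate, which is where the computational savings described in the abstract come from.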
NASA Astrophysics Data System (ADS)
Lazoglou, Georgia; Anagnostopoulou, Christina; Tolika, Konstantia; Kolyva-Machera, Fotini
2018-04-01
The increasing trend in the intensity and frequency of temperature and precipitation extremes during the past decades has substantial environmental and socioeconomic impacts. Thus, the objective of the present study is to compare several statistical methods of extreme value theory (EVT) in order to identify which is the most appropriate for analyzing the behavior of extreme precipitation and high and low temperature events in the Mediterranean region. Extremes were selected using both the block maxima and the peaks-over-threshold (POT) techniques, and consequently both the generalized extreme value (GEV) and generalized Pareto (GPD) distributions were used to fit them. The results were compared in order to select the most appropriate distribution for characterizing extremes. Moreover, this study evaluates maximum likelihood estimation, the L-moments method, and the Bayesian method, based on both graphical and statistical goodness-of-fit tests. It is revealed that the GPD can accurately characterize both precipitation and temperature extreme events. Additionally, the GEV distribution with the Bayesian method proves appropriate, especially for the greatest extreme values. Another important objective of this investigation was the estimation of precipitation and temperature return levels for three return periods (50, 100, and 150 years), classifying the data into groups with similar characteristics. Finally, the return level values were estimated with both the GEV and GPD and with the three different estimation methods, revealing that the selected method can affect the return level values for both precipitation and temperature.
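The L-moment route to GEV parameters (using Hosking's rational approximation for the shape) can be sketched and compared with maximum likelihood on synthetic data; note that the shape sign convention below matches scipy's.

```python
import numpy as np
from scipy.stats import genextreme
from scipy.special import gamma

rng = np.random.default_rng(8)
x = np.sort(genextreme.rvs(-0.15, loc=20, scale=6, size=500, random_state=rng))
n = x.size

# Sample probability-weighted moments and L-moments (Hosking).
i = np.arange(1, n + 1)
b0 = x.mean()
b1 = np.sum((i - 1) / (n - 1) * x) / n
b2 = np.sum((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x) / n
l1, l2, l3 = b0, 2 * b1 - b0, 6 * b2 - 6 * b1 + b0

# Hosking's approximation for the GEV shape from the L-skewness t3.
t3 = l3 / l2
z = 2 / (3 + t3) - np.log(2) / np.log(3)
k = 7.8590 * z + 2.9554 * z**2                 # = scipy's shape c
alpha = l2 * k / ((1 - 2.0**(-k)) * gamma(1 + k))
xi = l1 - alpha * (1 - gamma(1 + k)) / k

print("L-moments:", round(k, 2), round(xi, 1), round(alpha, 1))
print("MLE:      ", *(round(v, 2) for v in genextreme.fit(x)))
```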
Basic Snow Pressure Calculation
NASA Astrophysics Data System (ADS)
Hao, Shouzhi; Su, Jian
2018-03-01
With extreme weather on the rise in recent years, weather-induced damage to large steel structures has become frequent in China. How to account for the effects of wind and snow loads on a structure during structural design has become a focus of attention in the engineering field. In this paper, based on the serious snow disasters of recent years and on comparative analyses by other researchers, the factors influencing snow load, the characteristic value of snow load, and the underlying probability model are described.
An Ethical Response to State-Sponsored Terrorism.
1987-04-17
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whitfield, R. G.; Buehring, W. A.; Bassett, G. W.
2011-04-08
Get a GRiP (Gravitational Risk Procedure) on risk by using an approach inspired by the physics of gravitational forces between body masses! In April 2010, U.S. Department of Homeland Security Special Events staff (Protective Security Advisors [PSAs]) expressed concern about how to calculate risk given measures of consequence, vulnerability, and threat. The PSAs believed that it is not 'right' to assign zero risk, as a multiplicative formula would imply, to cases in which the threat is reported to be extremely small, and perhaps could even be assigned a value of zero, but for which consequences and vulnerability are potentially high. They needed a different way to aggregate the components into an overall measure of risk. To address these concerns, GRiP was proposed and developed. The inspiration for GRiP is Sir Isaac Newton's Universal Law of Gravitation: the attractive force between two bodies is directly proportional to the product of their masses and inversely proportional to the squares of the distance between them. The total force on one body is the sum of the forces from 'other bodies' that influence that body. In the case of risk, the 'other bodies' are the components of risk (R): consequence, vulnerability, and threat (which we denote as C, V, and T, respectively). GRiP treats risk as if it were a body within a cube. Each vertex (corner) of the cube represents one of the eight combinations of minimum and maximum 'values' for consequence, vulnerability, and threat. The risk at each of the vertices is a variable that can be set. Naturally, maximum risk occurs when consequence, vulnerability, and threat are at their maximum values; minimum risk occurs when they are at their minimum values. Analogous to gravitational forces among body masses, the GRiP formula for risk states that the risk at any interior point of the box depends on the squares of the distances from that point to each of the eight vertices. 
The risk value at an interior (movable) point will be dominated by the value of one vertex as that point moves closer and closer to that vertex. GRiP is a visualization tool that helps analysts better understand risk and its relationship to consequence, vulnerability, and threat. Estimates of consequence, vulnerability, and threat are external to GRiP; however, the GRiP approach can be linked to models or data that provide such estimates. For example, the Enhanced Critical Infrastructure Program/Infrastructure Survey Tool produces a vulnerability index (scaled from 0 to 100) that can be used for the vulnerability component of GRiP. We recognize that the values used for risk components can be point estimates and that, in fact, there is uncertainty regarding the exact values of C, V, and T. When we use T = t_0 (where t_0 is a value of threat in its range), we mean that threat is believed to be in an interval around t_0. Hence, a value of t_0 = 0 indicates a 'best estimate' that the threat level is equal to zero, but still allows that it is not impossible for the threat to occur. When t_0 = 0 but the threat is potentially small and not exactly zero, there will be little impact on the overall risk value as long as the C and V components are not large. However, when C and/or V have large values, there can be large differences in risk given t_0 = 0 versus t_0 = epsilon (where epsilon is small but greater than zero). We believe this scenario explains the PSAs' intuition that risk is not equal to zero when t_0 = 0 and C and/or V have large values. (They may also be thinking that if C has an extremely large value, it is unlikely that T is equal to 0; in the terrorist context, T would likely be dependent on C when C is extremely large.) The PSAs are implicitly recognizing the potential that t_0 = epsilon. 
One way to take this possible scenario into account is to replace point estimates for risk with interval values that reflect the uncertainty in the risk components. In fact, one could argue that T never equals zero for a man-made hazard. This paper describes the thought process that led to the GRiP approach and the mathematical formula for GRiP, and presents a few examples that provide insights about how to use GRiP and interpret its results.
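The inverse-square aggregation can be made concrete with a small numerical sketch. Here C, V, and T are scaled to [0, 1] and the vertex risk values are one illustrative choice; in GRiP they are analyst-settable, and this is a schematic reading of the approach rather than the authors' exact formula:

```python
import numpy as np

def grip_risk(c, v, t, vertex_risks, eps=1e-9):
    """Inverse-square-distance aggregation of risk over the unit cube.

    c, v, t      : consequence, vulnerability, threat, each scaled to [0, 1]
    vertex_risks : preset risk at each of the 8 corners, {(i, j, k): value}
    """
    p = np.array([c, v, t], dtype=float)
    weights, values = [], []
    for corner, risk in vertex_risks.items():
        d2 = float(np.sum((p - np.array(corner)) ** 2))
        weights.append(1.0 / (d2 + eps))   # the 'gravitational' pull of a vertex
        values.append(risk)
    w = np.array(weights)
    return float(np.dot(w / w.sum(), values))

# One illustrative vertex setting (in GRiP these are analyst-settable):
# zero risk at (0,0,0), maximal risk at (1,1,1), intermediate elsewhere.
corners = {(i, j, k): (i + j + k) / 3.0
           for i in (0, 1) for j in (0, 1) for k in (0, 1)}
```

With this setting, grip_risk(1, 1, 0, corners) stays well above zero even though the reported threat is zero, which is exactly the behavior a multiplicative C x V x T formula cannot produce.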
Application of the Haines Index in the fire warning system
NASA Astrophysics Data System (ADS)
Kalin, Lovro; Marija, Mokoric; Tomislav, Kozaric
2016-04-01
Croatia, like all Mediterranean countries, is strongly affected by large wildfires, particularly in the coastal region. In the last two decades the number and intensity of fires have increased significantly, which is generally associated with climate change, i.e. global warming. More extreme fires are observed, and the fire-fighting season has expanded into June and September. Meteorological support for fire protection and planning is therefore ever more important. At the Meteorological and Hydrological Service of Croatia a comprehensive monitoring and warning system has been established. It includes standard components, such as short-term forecasts of the Fire Weather Index (FWI), as well as long-range forecasts. However, owing to more frequent hot and dry seasons, the FWI often provides no additional information about extremely high fire danger, since it regularly takes the highest values for long periods. Additional tools have therefore been investigated. One widely used meteorological product is the Haines Index (HI). It provides information on potential fire growth, taking into account only the vertical instability of the atmosphere and not the state of the fuel. Several analyses and studies carried out at the Service confirmed the correlation of high HI values with large and extreme fires. The Haines Index forecast has been used at the Service for several years, employing the European Centre for Medium-Range Weather Forecasts (ECMWF) global prediction model as well as the limited-area Aladin model. Verification shows that these forecasts are reliable when compared with radiosonde measurements. These results supported the introduction of additional fire warnings, which are issued by the Service's Forecast Department.
A Fast SVD-Hidden-nodes based Extreme Learning Machine for Large-Scale Data Analytics.
Deng, Wan-Yu; Bai, Zuo; Huang, Guang-Bin; Zheng, Qing-Hua
2016-05-01
Big dimensional data is a growing trend in many real-world contexts, extending from web mining, gene expression analysis, and protein-protein interaction to high-frequency financial data. Nowadays, there is a growing consensus that increasing dimensionality degrades the performance of classifiers, which is termed the "peaking phenomenon" in the field of machine intelligence. To address the issue, dimensionality reduction is commonly employed as a preprocessing step on Big dimensional data before building the classifiers. In this paper, we propose an Extreme Learning Machine (ELM) approach for large-scale data analytics. In contrast to existing approaches, we embed hidden nodes that are designed using singular value decomposition (SVD) into the classical ELM. These SVD nodes in the hidden layer are shown to capture the underlying characteristics of Big dimensional data well, exhibiting excellent generalization performance. The drawback of using SVD on the entire dataset, however, is the high computational complexity involved. To address this, a fast divide-and-conquer approximation scheme is introduced to maintain computational tractability on high-volume data. The resultant algorithm is labeled here as Fast Singular Value Decomposition-Hidden-nodes based Extreme Learning Machine, or FSVD-H-ELM in short. In FSVD-H-ELM, instead of identifying the SVD hidden nodes directly from the entire dataset, SVD hidden nodes are derived from multiple random subsets of data sampled from the original dataset. Comprehensive experiments and comparisons are conducted to assess FSVD-H-ELM against other state-of-the-art algorithms. The results demonstrate the superior generalization performance and efficiency of FSVD-H-ELM. Copyright © 2016 Elsevier Ltd. All rights reserved.
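The subset-SVD idea can be sketched as follows. This is a schematic reading of the approach with invented sizes and a tanh activation, not the authors' exact algorithm: hidden-node weights come from the right singular vectors of random row-subsets, and the output weights are the usual ELM least-squares solution:

```python
import numpy as np

rng = np.random.default_rng(0)

def svd_hidden_weights(X, n_nodes=8, n_subsets=4):
    """Hidden-node weights from SVDs of random row-subsets of X
    (divide-and-conquer stand-in for one SVD of the full data set)."""
    per = n_nodes // n_subsets
    cols = []
    for _ in range(n_subsets):
        idx = rng.choice(len(X), size=4 * per, replace=False)
        # right singular vectors = dominant directions of this subset
        _, _, Vt = np.linalg.svd(X[idx], full_matrices=False)
        cols.append(Vt[:per].T)
    return np.hstack(cols)                 # shape (n_features, n_nodes)

def elm_fit(X, Y, W):
    H = np.tanh(X @ W)                     # hidden layer with SVD-derived nodes
    return np.linalg.pinv(H) @ Y           # least-squares output weights

def elm_predict(X, W, beta):
    return np.tanh(X @ W) @ beta
```

Because the SVDs run on small subsets, the cost stays linear in the number of samples, which is the point of the divide-and-conquer scheme.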
A Fiducial Approach to Extremes and Multiple Comparisons
ERIC Educational Resources Information Center
Wandler, Damian V.
2010-01-01
Generalized fiducial inference is a powerful tool for many difficult problems. Based on an extension of R. A. Fisher's work, we used generalized fiducial inference for two extreme value problems and a multiple comparison procedure. The first extreme value problem is dealing with the generalized Pareto distribution. The generalized Pareto…
Reconstructing metabolic flux vectors from extreme pathways: defining the alpha-spectrum.
Wiback, Sharon J; Mahadevan, Radhakrishnan; Palsson, Bernhard Ø
2003-10-07
The move towards genome-scale analysis of cellular functions has necessitated the development of analytical (in silico) methods to understand such large and complex biochemical reaction networks. One such method is extreme pathway analysis, which uses stoichiometry and thermodynamic irreversibility to define mathematically unique, systemic metabolic pathways. These extreme pathways form the edges of a high-dimensional convex cone in the flux space that contains all the attainable steady state solutions, or flux distributions, for the metabolic network. By definition, any steady state flux distribution can be described as a nonnegative linear combination of the extreme pathways. To date, much effort has been focused on calculating, defining, and understanding these extreme pathways. However, little work has been performed to determine how these extreme pathways contribute to a given steady state flux distribution. This study represents an initial effort aimed at defining how physiological steady state solutions can be reconstructed from a network's extreme pathways. In general, there is not a unique set of nonnegative weightings on the extreme pathways that produces a given steady state flux distribution but rather a range of possible values. This range can be determined using linear optimization to maximize and minimize the weighting of a particular extreme pathway in the reconstruction, resulting in what we have termed the alpha-spectrum. The alpha-spectrum defines which extreme pathways can and cannot be included in the reconstruction of a given steady state flux distribution and to what extent they individually contribute to the reconstruction. It is shown that accounting for transcriptional regulatory constraints can considerably shrink the alpha-spectrum.
The alpha-spectrum is computed and interpreted for two cases: first, optimal states of a skeleton representation of core metabolism that include transcriptional regulation; and second, human red blood cell metabolism under various physiological, non-optimal conditions.
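The max/min weighting computation reduces to a pair of linear programs per pathway. A minimal sketch (the pathway matrix and flux vector below would come from a real metabolic network; any toy values satisfying v = P @ alpha, alpha >= 0 work):

```python
import numpy as np
from scipy.optimize import linprog

def alpha_spectrum(P, v):
    """Min and max weighting of each extreme pathway in reconstructions of v.

    P : (n_fluxes, n_pathways) matrix whose columns are the extreme pathways
    v : steady-state flux distribution, reconstructed as v = P @ alpha, alpha >= 0
    """
    n = P.shape[1]
    spectrum = []
    for i in range(n):
        c = np.zeros(n)
        c[i] = 1.0
        # minimize alpha_i, then maximize it (via minimizing -alpha_i),
        # subject to exact reconstruction and nonnegative weightings
        lo = linprog(c,  A_eq=P, b_eq=v, bounds=[(0, None)] * n)
        hi = linprog(-c, A_eq=P, b_eq=v, bounds=[(0, None)] * n)
        spectrum.append((lo.fun, -hi.fun))
    return spectrum
```

A pathway whose minimum weighting is positive must appear in every reconstruction; one whose maximum is zero can never appear.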
Large-Scale Meteorological Patterns Associated with Extreme Precipitation in the US Northeast
NASA Astrophysics Data System (ADS)
Agel, L. A.; Barlow, M. A.
2016-12-01
Patterns of daily large-scale circulation associated with Northeast US extreme precipitation are identified using both k-means clustering (KMC) and Self-Organizing Maps (SOM) applied to tropopause height. Tropopause height provides a compact representation of large-scale circulation patterns, as it is linked to mid-level circulation, low-level thermal contrasts, and low-level diabatic heating. Extreme precipitation is defined as the top 1% of daily wet-day observations at 35 Northeast stations, 1979-2008. KMC is applied on extreme precipitation days only, while the SOM algorithm is applied to all days in order to place the extreme results into a larger context. Six tropopause patterns are identified on extreme days: a summertime tropopause ridge, a summertime shallow trough/ridge, a summertime shallow eastern US trough, a deeper wintertime eastern US trough, and two versions of a deep cold-weather trough located across the east-central US. Thirty SOM patterns for all days are identified. Results show that six SOM patterns account for almost half of the extreme days, although extreme precipitation occurs in all SOM patterns. The same SOM patterns associated with extreme precipitation also routinely produce non-extreme precipitation; however, on extreme precipitation days the troughs, on average, are deeper and the downstream ridges more pronounced. Analysis of other fields associated with the large-scale patterns shows various degrees of anomalously strong upward motion during, and moisture transport preceding, extreme precipitation events.
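The KMC step is standard k-means applied to each day's tropopause-height field flattened to a vector. A minimal self-contained sketch (the data here are synthetic two-dimensional points, not tropopause fields, and the optional deterministic init is an illustration convenience):

```python
import numpy as np

def kmeans(X, k, n_iter=100, init=None, seed=0):
    """Plain k-means: assign each sample (e.g. a daily field flattened to a
    vector) to the nearest of k centroids, then recompute cluster means."""
    rng = np.random.default_rng(seed)
    if init is None:
        init = rng.choice(len(X), size=k, replace=False)
    centroids = X[init].astype(float)
    for _ in range(n_iter):
        # distance of every sample to every centroid, shape (n_samples, k)
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                        else centroids[j] for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return labels, centroids
```

The resulting centroids are the composite circulation patterns; each extreme day is characterized by the cluster it falls into.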
Comparison of design and torque measurements of various manual wrenches.
Neugebauer, Jörg; Petermöller, Simone; Scheer, Martin; Happe, Arndt; Faber, Franz-Josef; Zoeller, Joachim E
2015-01-01
Accurate torque application and determination of the applied torque during surgical and prosthetic treatment is important to reduce complications. A study was performed to determine and compare the accuracy of manual wrenches, which are available in different designs with a large range of preset torques. Thirteen different wrench systems with a variety of preset torques ranging from 10 to 75 Ncm were evaluated. Three different designs were available, with a spring-in-coil or toggle design as an active mechanism or a beam as a passive mechanism, to select the preset torque. To provide a clinically relevant analysis, a total of 1,170 torque measurements in the range of 10 to 45 Ncm were made in vitro using an electronic torque measurement device. The absolute deviations in Ncm and percent deviations across all wrenches were small, with a mean of -0.24 ± 2.15 Ncm and -0.84% ± 11.72% as a shortfall relative to the preset value. The greatest overage was 8.2 Ncm (82.5%), and the greatest shortfall was 8.47 Ncm (46%). However, extreme values were rare, with 95th-percentile values of -1.5% (lower value) and -0.16% (upper value). A comparison with respect to wrench design revealed significantly higher deviations for coil and toggle-style wrenches than for beam wrenches. Beam wrenches were associated with a lower risk of rare extreme values thanks to their passive mechanism of achieving the selected preset torque, which minimizes the risk of harming screw connections.
Influence of hurricane-related activity on North American extreme precipitation
NASA Astrophysics Data System (ADS)
Barlow, Mathew
2010-05-01
Individual hurricanes and their remnants can produce exceptionally intense rainfall, and the associated flooding, even independent of storm surge, is one of the leading causes of hurricane-related death in the U.S. Despite the catastrophic societal costs of hurricanes and the considerable recent attention to possible trends in strength and number, little is known about the general contribution of hurricane-related activity to extreme precipitation over North America and the underlying dynamical mechanisms. Here we show, based on a 25-year observational analysis, that there are important contributions to the occurrence of extreme precipitation events over more than half of North America, including a pronounced signal over northern and inland areas, associated with an average span of influence that extends to several hundred kilometers. Large-scale vertical velocity, maximum wind speed, and tropical/extratropical character are important factors in the strength and range of influence, and the pattern of influence depends on whether an absolute or relative measure of precipitation is considered. Associated changes in stability, moisture, and vertical motion are analyzed to investigate the dynamics of the influence: the largest changes are in vertical motion, with the hurricane-related activity bringing deep tropical values even to inland and high latitude areas, consistent with the occurrence of very heavy, tropical-like precipitation. While the maximum contribution of hurricane-related activity to mean precipitation is generally less than 25% even for the most-affected coastal regions, the contribution to extreme events is much larger: well over 50% for several regions and exceeding 25% for large swaths of the continent. Typical track density plots do not capture the activity's influence on extreme precipitation.
NASA Astrophysics Data System (ADS)
Corman, J. R.; Loken, L. C.; Oliver, S. K.; Collins, S.; Butitta, V.; Stanley, E. H.
2017-12-01
Extreme events can play powerful roles in shifting ecosystem processes. In lakes, heavy rainfall can transport large amounts of particulates and dissolved nutrients into the water column and, potentially, alter biogeochemical cycling. However, the impacts of extreme rainfall events are often difficult to study due to a lack of long-term records. In this paper, we combine daily discharge records with long-term lake water quality information collected by the North Temperate Lakes Long-Term Ecological Research (NTL LTER) site to investigate the impacts of extreme events on nutrient cycling in lakes. We focus on Lake Mendota, an urban lake within the Yahara River Watershed in Madison, Wisconsin, USA, where nutrient data are available at least seasonally from 1995 to the present. In June 2008, precipitation in the Yahara watershed was 400% above normal, triggering the largest discharge event in the 40-year record at the streamgage station; hence, we are able to compare water quality records before and after this event as a case study of how extreme rain events couple or decouple lake nutrient cycling. Following the extreme event, the lake-wide mass of nitrogen and phosphorus in the summer of 2008 increased by 35% and 21%, respectively, shifting lake stoichiometry by increasing N:P ratios (Figure 1). Nitrogen concentrations remained elevated longer than phosphorus, suggesting (1) that nitrogen inputs into the lake were sustained longer than phosphorus inputs (i.e., a "smear" versus "pulse" loading of nitrogen versus phosphorus, respectively, in response to the extreme event) and/or (2) that in-lake biogeochemical processing was more efficient at removing phosphorus than nitrogen.
While groundwater loading data are currently unavailable to test the former hypothesis, preliminary data on surficial nitrogen and phosphorus loading to Lake Mendota (available for 2011-2013) suggest that nitrogen removal efficiency is lower than that of phosphorus, supporting the latter hypothesis. As climate change is expected to increase the frequency of extreme events, continued monitoring of lakes is needed to understand biogeochemical responses and when and how water quality threats may occur.
NASA Astrophysics Data System (ADS)
Yin, Yixing; Chen, Haishan; Xu, Chong-Yu; Xu, Wucheng; Chen, Changchun; Sun, Shanlei
2016-05-01
Regionalization methods, which "trade space for time" by pooling information from different locations in frequency analysis, are efficient tools for enhancing the reliability of extreme quantile estimates. This paper aims at improving the understanding of the regional frequency of extreme precipitation by using regionalization methods, and at providing scientific background and practical assistance in formulating regional development strategies for water resources management in one of the most developed and flood-prone regions in China, the Yangtze River Delta (YRD) region. To achieve these goals, the L-moment-based index-flood (LMIF) method, one of the most popular regionalization methods, is used in the regional frequency analysis of extreme precipitation, with special attention paid to inter-site dependence and its influence on the accuracy of quantile estimates, which has not been considered by most studies using the LMIF method. Extensive data screening for stationarity, serial dependence, and inter-site dependence was carried out first. The entire YRD region was then categorized into four homogeneous regions through cluster analysis and homogeneity analysis. Based on goodness-of-fit statistics and L-moment ratio diagrams, the generalized extreme-value (GEV) and generalized normal (GNO) distributions were identified as the best-fitting distributions for most of the sub-regions, and estimated quantiles for each region were obtained. Monte Carlo simulation was used to evaluate the accuracy of the quantile estimates, taking inter-site dependence into consideration. The results showed that the root-mean-square errors (RMSEs) were larger and the 90% error bounds wider with inter-site dependence than without it, for both the regional growth curve and the quantile curve.
The spatial patterns of extreme precipitation with a return period of 100 years were finally obtained, indicating that there are two regions with the highest precipitation extremes and a large region with low precipitation extremes. However, the regions with low precipitation extremes are among the most developed and densely populated regions of the country, where floods can cause great loss of human life and property damage due to high vulnerability. The methods and procedure demonstrated in this paper provide a useful reference for frequency analysis of precipitation extremes in large regions, and the findings will be beneficial for flood control and management in the study area.
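As an illustration of the quantile-estimation step, a GEV can be fitted to annual maxima and inverted for a return level. A minimal sketch with synthetic data (the parameter values and sample are invented for the example; the paper's LMIF procedure additionally pools sites and uses L-moments rather than maximum likelihood):

```python
import numpy as np
from scipy.stats import genextreme

# Synthetic annual-maximum precipitation (mm) standing in for one
# homogeneous sub-region; parameters are illustrative only.
# Note scipy's shape c has the opposite sign to the usual EVT xi.
annual_max = genextreme.rvs(c=-0.1, loc=80, scale=20, size=500, random_state=42)

shape, loc, scale = genextreme.fit(annual_max)   # maximum-likelihood fit
T = 100                                          # return period in years
return_level = genextreme.ppf(1 - 1 / T, shape, loc=loc, scale=scale)
```

The same ppf call with 1 - 1/T gives any other return level; in the index-flood framework the regional growth curve rescales such quantiles by a site-specific index value.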
Evolution of precipitation extremes in two large ensembles of climate simulations
NASA Astrophysics Data System (ADS)
Martel, Jean-Luc; Mailhot, Alain; Talbot, Guillaume; Brissette, François; Ludwig, Ralf; Frigon, Anne; Leduc, Martin; Turcotte, Richard
2017-04-01
Recent studies project significant changes in the future distribution of precipitation extremes due to global warming. It is likely that extreme precipitation intensity will increase in a future climate and that extreme events will be more frequent. In this work, annual-maximum daily precipitation series from the Canadian Earth System Model (CanESM2) 50-member large ensemble (spatial resolution of 2.8°x2.8°) and the Community Earth System Model (CESM1) 40-member large ensemble (spatial resolution of 1°x1°) are used to investigate extreme precipitation over the historical (1980-2010) and future (2070-2100) periods. The use of these ensembles yields respectively 1,500 (30 years x 50 members) and 1,200 (30 years x 40 members) simulated years over both the historical and future periods. These large datasets allow the computation of empirical daily extreme precipitation quantiles for large return periods. Using the CanESM2 and CESM1 large ensembles, extreme daily precipitation with return periods ranging from 2 to 100 years is computed in the historical and future periods to assess the impact of climate change. Results indicate that daily precipitation extremes generally increase in the future over most land grid points and that these increases will also affect the 100-year extreme daily precipitation. Considering that many public infrastructures have lifespans exceeding 75 years, the increase in extremes has important implications for service levels of water infrastructures and public safety. Estimated increases in precipitation associated with very extreme events (e.g. 100-year) will drastically change the likelihood and extent of flooding in a future climate. These results, although interesting, need to be extended to sub-daily durations, which are relevant for urban flooding protection and urban infrastructure design (e.g. sewer networks, culverts). Models and simulations at finer spatial and temporal resolution are therefore needed.
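Pooling members is what makes empirical estimates of long return periods possible: 1,500 pooled annual maxima contain, in expectation, about 15 values beyond the true 100-year level. A toy sketch (Gumbel-distributed synthetic maxima stand in for the ensemble output at one grid point):

```python
import numpy as np

rng = np.random.default_rng(1)
# Toy stand-in for one grid point: 50 members x 30 years of annual maxima (mm)
annual_max = rng.gumbel(loc=60, scale=15, size=(50, 30))
pooled = np.sort(annual_max.ravel())             # 1,500 pooled annual maxima

def empirical_return_level(sorted_maxima, T):
    """Weibull plotting position: the i-th smallest of n values estimates the
    quantile with non-exceedance probability i / (n + 1)."""
    n = sorted_maxima.size
    i = int(np.ceil((1 - 1 / T) * (n + 1))) - 1
    return sorted_maxima[min(i, n - 1)]

rl100 = empirical_return_level(pooled, 100)      # empirical 100-year level
```

Comparing rl100 computed from the historical and future 1,500-year pools gives the climate-change signal directly, without fitting a distribution.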
NASA Astrophysics Data System (ADS)
Nursamsiah; Nugroho Sugianto, Denny; Suprijanto, Jusup; Munasik; Yulianto, Bambang
2018-02-01
Information on extreme wave height return levels is required for maritime planning and management. Extreme waves are commonly modeled with the Generalized Pareto Distribution (GPD). Seasonal variation is often considered in extreme wave models. This research aims to identify the best GPD model by considering the seasonal variation of extreme waves. Using the 95th percentile as the threshold for extreme significant wave height, seasonal and non-seasonal GPD models were fitted. The Kolmogorov-Smirnov test was applied to assess the goodness of fit of each GPD model. The return values from the seasonal and non-seasonal GPD were compared using the definition of the return value as the criterion. The Kolmogorov-Smirnov test shows that the GPD fits the data very well for both the seasonal and non-seasonal models. The seasonal return values give better information about the wave height characteristics.
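The peaks-over-threshold procedure can be sketched as follows; the wave-height series here is a synthetic stand-in for the observed record, and the seasonal split is omitted:

```python
import numpy as np
from scipy.stats import genpareto, kstest

rng = np.random.default_rng(7)
# Toy significant-wave-height record (m); a stand-in for the observed series
hs = 1.5 * rng.weibull(2.0, size=5000)

u = np.quantile(hs, 0.95)            # 95th-percentile threshold, as in the paper
excess = hs[hs > u] - u              # peaks over threshold

c, loc, scale = genpareto.fit(excess, floc=0)    # location fixed at 0 for POT
ks = kstest(excess, 'genpareto', args=(c, loc, scale))   # goodness-of-fit check
rl = u + genpareto.ppf(0.99, c, loc=0, scale=scale)      # high return level
```

For the seasonal model, the same fit is simply repeated on the excesses of each season, and the per-season return levels are compared with the all-data fit.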
Yildirim, Adem; Sürücü, Gülseren Dost; Karamercan, Ayşe; Gedik, Dilay Eken; Atci, Nermin; Dülgeroǧlu, Deniz; Özgirgin, Neşe
2016-11-21
A number of exercises to strengthen the upper extremities are recommended to increase functional independence and quality of life (QoL) in patients with paraplegia. Circuit resistance training (CRT) is a type of progressive resistive exercise performed repeatedly at fixed mechanical exercise stations. The aim of this study was to investigate the potential benefits of CRT for upper extremity muscle strength, functional independence, and QoL in patients with paraplegia. Twenty-six patients with paraplegia who were participating in a conventional rehabilitation program at a tertiary education and research hospital were enrolled in this study. The participants were randomly assigned to two groups. The exercise group participated in the CRT program, which consisted of repetitive exercises for the upper extremities performed at fixed mechanical stations 5 sessions per week for 6 weeks, in addition to conventional rehabilitation. Participants in the control group received only conventional rehabilitation over the same period. We compared the groups with respect to QoL, as well as isokinetic muscle test outcomes in the upper extremities, using the Functional Independence Measure (FIM) and Borg's scale. We observed significant increases in scores on the physical component of the FIM, Borg's scale, and QoL in both the exercise and control groups. Furthermore, most isokinetic values improved significantly more in the exercise group than in the control group. When post-treatment outcomes were compared between the groups, improvements in scores on the physical component of the FIM and in most isokinetic values were significantly greater in the exercise group. This study showed that CRT has positive effects on muscle strength in the upper extremities and the physical disability components of the FIM when added to conventional rehabilitation programs for paraplegic patients.
However, we observed no significant improvement in QoL scores after adding CRT to a conventional treatment regime. Randomized trial (Level II).
Statistical methods for the analysis of climate extremes
NASA Astrophysics Data System (ADS)
Naveau, Philippe; Nogaj, Marta; Ammann, Caspar; Yiou, Pascal; Cooley, Daniel; Jomelli, Vincent
2005-08-01
Currently there is increasing research activity in the area of climate extremes because they represent a key manifestation of non-linear systems and have an enormous impact on economic and social human activities. Our understanding of the mean behavior of climate and its 'normal' variability has improved significantly during the last decades. In comparison, climate extreme events have been hard to study and even harder to predict because they are, by definition, rare and obey different statistical laws than averages. In this context, the motivation for this paper is twofold. Firstly, we recall the basic principles of Extreme Value Theory, which is used on a regular basis in finance and hydrology but still does not enjoy the same success in climate studies. More precisely, the theoretical distributions of maxima and large peaks are recalled. The parameters of such distributions are estimated with the maximum likelihood estimation procedure, which offers the flexibility to take explanatory variables into account in our analysis. Secondly, we detail three case studies to show that this theory can provide a solid statistical foundation, especially when assessing the uncertainty associated with extreme events in a wide range of applications linked to the study of our climate. To cite this article: P. Naveau et al., C. R. Geoscience 337 (2005).
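The two theoretical distributions recalled here are standard results and can be written explicitly, with location \mu, scale \sigma, and shape \xi (\xi > 0 heavy-tailed Fréchet case, \xi = 0 Gumbel limit, \xi < 0 bounded Weibull case), where [a]_+ = \max(a, 0):

```latex
% GEV distribution of block maxima (e.g. annual maxima):
G(z) = \exp\!\left\{ -\left[ 1 + \xi\,\frac{z-\mu}{\sigma} \right]_{+}^{-1/\xi} \right\}

% Generalized Pareto distribution of excesses y = x - u over a high threshold u:
H(y) = 1 - \left( 1 + \frac{\xi\, y}{\tilde{\sigma}} \right)_{+}^{-1/\xi},
\qquad \tilde{\sigma} = \sigma + \xi\,(u - \mu)
```

Covariates enter by letting \mu, \sigma, or \xi depend on explanatory variables (e.g. \mu(t) = \beta_0 + \beta_1 t), with all parameters estimated jointly by maximum likelihood.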
NASA Astrophysics Data System (ADS)
Yucel, Ismail; Onen, Alper
2013-04-01
Evidence shows that global warming, or climate change, has a direct influence on changes in precipitation and the hydrological cycle. Extreme weather events such as heavy rainfall and flooding are projected to become much more frequent as the climate warms. Regional hydrometeorological system models, which couple the atmosphere with physically based, gridded surface hydrology, provide efficient predictions for extreme hydrological events. Such a modeling system can be used for flood forecasting and warning, as it provides continuous monitoring of precipitation over large areas at high spatial resolution. This study examines the performance of the Weather Research and Forecasting hydrological extension (WRF-Hydro), which performs terrain, sub-terrain, and channel routing, in producing streamflow from WRF-derived forcing of extreme precipitation events. The capability of the system with different options, such as data assimilation, is tested for a number of flood events observed in basins of the western Black Sea region of Turkey. Rainfall event structures and associated flood responses are evaluated against gauge and satellite-derived precipitation and measured streamflow values. The modeling system shows skill in capturing the spatial and temporal structure of extreme rainfall events and the resulting flood hydrographs. High-resolution routing modules activated in the model enhance the simulated discharges.
NASA Astrophysics Data System (ADS)
Felder, Guido; Zischg, Andreas; Weingartner, Rolf
2015-04-01
Estimating peak discharges with very low probabilities is still accompanied by large uncertainties. Common estimation methods are usually based on extreme value statistics applied to observed time series or to hydrological model outputs. However, such methods assume the system to be stationary and do not specifically consider non-stationary effects. Observed time series may exclude events where peak discharge is damped by retention effects, as this process does not occur until specific thresholds, possibly beyond those of the highest measured event, are exceeded. Hydrological models can be complemented and parameterized with non-linear functions. However, in such cases calibration depends on observed data and non-stationary behaviour is not deterministically calculated. Our study discusses the option of considering retention effects on extreme peak discharges by coupling hydrological and hydraulic models. This possibility is tested by forcing the semi-distributed deterministic hydrological model PREVAH with randomly generated, physically plausible extreme precipitation patterns. The resulting hydrographs are then used to force the hydraulic model BASEMENT-ETH (riverbed in 1D, potential inundation areas in 2D). The procedure ensures that the estimated extreme peak discharge does not exceed the physical limit given by the riverbed capacity and that the dampening effect of inundation processes on peak discharge is considered.
Optimal control of orientation and entanglement for two dipole-dipole coupled quantum planar rotors.
Yu, Hongling; Ho, Tak-San; Rabitz, Herschel
2018-05-09
Optimal control simulations are performed for orientation and entanglement of two dipole-dipole coupled identical quantum rotors. The rotors, at various fixed separations, lie on a model non-interacting plane with an applied control field. It is shown that optimal control of orientation and of entanglement represent two contrasting control scenarios. In particular, the maximally oriented state (MOS) of the two rotors has zero entanglement entropy and is readily attainable at all rotor separations. In contrast, the maximally entangled state (MES) has a zero orientation expectation value and is most conveniently attainable at small separations, where the dipole-dipole coupling is strong. It is demonstrated that the peak orientation expectation value attained by the MOS at large separations exhibits a long-time revival pattern due to the small energy splittings arising from the extremely weak dipole-dipole coupling between the degenerate product states of the two free rotors. Moreover, it is found that the peak entanglement entropy attained by the MES remains largely unchanged as the two rotors are transported to large separations after the control field is turned off. Finally, optimal control simulations of transition dynamics between the MOS and the MES reveal the intricate interplay between orientation and entanglement.
NASA Astrophysics Data System (ADS)
Martel, J. L.; Brissette, F.; Mailhot, A.; Wood, R. R.; Ludwig, R.; Frigon, A.; Leduc, M.; Turcotte, R.
2017-12-01
Recent studies indicate that the frequency and intensity of extreme precipitation will increase in a future climate due to global warming. In this study, we compare annual-maxima precipitation series from three large ensembles of climate simulations at various spatial and temporal resolutions. The first two are at the global scale: the Canadian Earth System Model (CanESM2) 50-member large ensemble (CanESM2-LE) at a 2.8° resolution and the Community Earth System Model (CESM1) 40-member large ensemble (CESM1-LE) at a 1° resolution. The third ensemble is at the regional scale over both eastern North America and Europe: the Canadian Regional Climate Model (CRCM5) 50-member large ensemble (CRCM5-LE) at a 0.11° resolution, driven at its boundaries by the CanESM2-LE. The CRCM5-LE is a new ensemble produced within the ClimEx project (http://www.climex-project.org), a Québec-Bavaria collaboration. Using these three large ensembles, changes in extreme precipitation over the historical (1980-2010) and future (2070-2100) periods are investigated. This results in 1,500 (30 years x 50 members for CanESM2-LE and CRCM5-LE) and 1,200 (30 years x 40 members for CESM1-LE) simulated years over both the historical and future periods. Using these large datasets, the empirical daily (and, for CRCM5-LE, sub-daily) extreme precipitation quantiles for return periods ranging from 2 to 100 years are computed. Results indicate that daily extreme precipitation will generally increase over most land grid points of both domains according to the three large ensembles. For the CRCM5-LE, the increase in sub-daily extreme precipitation is even larger than that found for daily extreme precipitation. Considering that many public infrastructures have lifespans exceeding 75 years, the increase in extremes has important implications for service levels of water infrastructures and public safety.
Quintela-del-Río, Alejandro; Francisco-Fernández, Mario
2011-02-01
The study of extreme values and the prediction of ozone data is an important topic of research when dealing with environmental problems. Classical extreme value theory is usually used in air-pollution studies. It consists of fitting a parametric generalised extreme value (GEV) distribution to a data set of extreme values, and using the estimated distribution to compute return levels and other quantities of interest. Here, we propose to estimate these values using nonparametric functional data methods. Functional data analysis is a relatively new statistical methodology that generally deals with data consisting of curves or multi-dimensional variables. In this paper, we use this technique, jointly with nonparametric curve estimation, to provide alternatives to the usual parametric statistical tools. The nonparametric estimators are applied to real samples of maximum ozone values obtained from several monitoring stations belonging to the Automatic Urban and Rural Network (AURN) in the UK. The results show that the nonparametric estimators work satisfactorily, outperforming classical parametric estimators. Functional data analysis is also used to predict stratospheric ozone concentrations. We show an application using the data set of mean monthly ozone concentrations in Arosa, Switzerland, and the results are compared with those obtained by classical time series (ARIMA) analysis. Copyright © 2010 Elsevier Ltd. All rights reserved.
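One simple nonparametric alternative of the kind advocated here is to smooth the empirical CDF with a kernel and invert it numerically for a return level. This sketch is illustrative only (the bandwidth rule, Gaussian kernel, and bisection scheme are assumptions for the example, not the paper's exact estimator):

```python
import numpy as np
from scipy.special import erf

def kernel_cdf(x, data, h):
    """Smoothed empirical CDF: average of Gaussian kernel CDFs at the data."""
    return float(np.mean(0.5 * (1 + erf((x - data) / (h * np.sqrt(2))))))

def np_return_level(data, T, h=None):
    """Nonparametric return level: invert the smoothed CDF at p = 1 - 1/T."""
    data = np.asarray(data, dtype=float)
    if h is None:                              # Silverman's rule-of-thumb bandwidth
        h = 1.06 * data.std() * data.size ** (-1 / 5)
    p = 1 - 1 / T
    lo, hi = data.min(), data.max() + 6 * h    # bracket containing the quantile
    for _ in range(80):                        # the CDF is monotone, so bisect
        mid = 0.5 * (lo + hi)
        if kernel_cdf(mid, data, h) < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

Unlike the GEV approach, no distributional shape is imposed; the price is that extrapolation far beyond the observed maxima is not meaningful.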
Changing Pattern of Indian Monsoon Extremes: Global and Local Factors
NASA Astrophysics Data System (ADS)
Ghosh, Subimal; Shastri, Hiteshri; Pathak, Amey; Paul, Supantha
2017-04-01
Indian Summer Monsoon Rainfall (ISMR) extremes have remained a major topic of discussion in global change and hydro-climatology over the last decade. This is due to conflicting conclusions about the changing pattern of extremes, along with poor understanding of the multiple processes at global and local scales associated with monsoon extremes. At a spatially aggregate scale, when the number of extremes in the grids is summed, a statistically significant increasing trend is observed for both Central India (Goswami et al., 2006) and all India (Rajeevan et al., 2008). However, the result over Central India does not satisfy a field significance test of increase and no decrease (Krishnamurthy et al., 2009). Statistically rigorous extreme value analysis that deals with the tail of the distribution reveals a spatially non-uniform trend of extremes over India (Ghosh et al., 2012). This results in a statistically significant increasing trend in spatial variability. Such an increase in spatial variability points to the importance of local factors such as deforestation and urbanization. We hypothesize that the increase in the spatial average of extremes is associated with an increase in events occurring over large regions, while the increase in spatial variability is attributable to local factors. A Lagrangian-approach-based dynamic recycling model reveals that the major contributor of moisture to widespread extremes is the Western Indian Ocean, while the land surface also contributes around 25-30% of the moisture during extremes in Central India. We further test the impacts of local urbanization on extremes and find that the impacts are most visible over West-central, Southern and North-East India. Regional atmospheric simulations coupled with an Urban Canopy Model (UCM) show that urbanization intensifies extremes in city areas, but not uniformly over the whole city. The intensification occurs over specific pockets of the urban region, resulting in an increase in spatial variability even within the city.
This also points to the need for multiple weather stations over a city at a finer resolution for better understanding of urban extremes. We conclude that the conventional approach of considering only large-scale factors is not sufficient for analysing monsoon extremes; their characterization needs a blending of both global and local factors. References: Ghosh, S., Das, D., Kao, S.-C. & Ganguly, A. R. Lack of uniform trends but increasing spatial variability in observed Indian rainfall extremes. Nature Clim. Change 2, 86-91 (2012). Goswami, B. N., Venugopal, V., Sengupta, D., Madhusoodanan, M. S. & Xavier, P. K. Increasing trend of extreme rain events over India in a warming environment. Science 314, 1442-1445 (2006). Krishnamurthy, C. K. B., Lall, U. & Kwon, H.-H. Changing frequency and intensity of rainfall extremes over India from 1951 to 2003. J. Clim. 22, 4737-4746 (2009). Rajeevan, M., Bhate, J. & Jaswal, A. K. Analysis of variability and trends of extreme rainfall events over India using 104 years of gridded daily rainfall data. Geophys. Res. Lett. 35, L18707 (2008).
Data Mining of Extremely Large Ad-Hoc Data Sets to Produce Reverse Web-Link Graphs
2017-03-01
Data mining can be a valuable tool, particularly in the acquisition of military intelligence. As the second study within a larger Naval research effort, this work uses the open web-crawler data set Common Crawl. Similar to previous studies, this research employs MapReduce (MR) for sorting and categorizing output values. From these studies, we also learned that compute-optimized instances should be chosen for serialized/compressed input data in most of the MR cases.
Vacuum statistics and stability in axionic landscapes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Masoumi, Ali; Vilenkin, Alexander, E-mail: ali@cosmos.phy.tufts.edu, E-mail: vilenkin@cosmos.phy.tufts.edu
2016-03-01
We investigate vacuum statistics and stability in random axionic landscapes. For this purpose we developed an algorithm for a quick evaluation of the tunneling action, which in most cases is accurate within 10%. We find that the stability of a vacuum is strongly correlated with its energy density, with the lifetime rapidly growing as the energy density is decreased. On the other hand, the probability P(B) for a vacuum to have a tunneling action B greater than a given value declines as a slow power law in B. This is in sharp contrast with studies of random quartic potentials, which found a fast exponential decline of P(B). Our results suggest that the total number of relatively stable vacua (say, with B > 100) grows exponentially with the number of fields N and can get extremely large for N ≳ 100. The problem with this kind of model is that the stable vacua are concentrated near the absolute minimum of the potential, so the observed value of the cosmological constant cannot be explained without fine-tuning. To address this difficulty, we consider a modification of the model, where the axions acquire a quadratic mass term due to their mixing with 4-form fields. This results in a larger landscape with a much broader distribution of vacuum energies. The number of relatively stable vacua in such models can still be extremely large.
NASA Astrophysics Data System (ADS)
Pineda, Luis E.; Willems, Patrick
2017-04-01
Weather and climatic characterization of rainfall extremes is of both scientific and societal value for hydrometeorological risk management, yet discrimination of local and large-scale forcing remains challenging in data-scarce and complex-terrain environments. Here, we present an analysis framework that separates weather (seasonal) regimes and climate (inter-annual) influences using data-driven process identification. The approach is based on signal-to-noise separation methods and extreme value (EV) modeling of multisite rainfall extremes. The EV models use semi-automatic parameter learning [1] for model identification across temporal scales. At the weather scale, the EV models are combined with a state-based hidden Markov model [2] to represent the spatio-temporal structure of rainfall as persistent weather states. At the climatic scale, the EV models are used to decode the drivers leading to shifts of the weather patterns. The decoding is performed in a climate-to-weather signal subspace, built via dimension reduction of climate model proxies (e.g. sea surface temperature and atmospheric circulation). We apply the framework to the Western Andean Ridge (WAR) in Ecuador and Peru (0-6°S) using ground data from the second half of the 20th century. We find that the meridional component of the winds is what matters for the in-year and inter-annual variability of high rainfall intensities along the northern WAR (0-2.5°S). There, low-level southerly winds act as advection drivers of oceanic moisture during the normal rainy season and during weak/moderate El Niño (EN) events; during the strong EN type, however, the unique moisture surplus is locally advected to the lowlands of the central WAR. Moreover, the coastal ridges south of 3°S dampen meridional airflows, leaving local hygrothermal gradients to control the in-year distribution of rainfall extremes and their anomalies.
Overall, we show that the framework, which makes no prior assumption on the explanatory power of the weather and climate drivers, allows identification of well-known features of the regional climate in a purely data-driven fashion. This approach therefore shows potential for characterization of precipitation extremes in data-scarce and orographically complex regions in which model reconstructions are the only climate proxies. References: [1] Mínguez, R., F.J. Méndez, C. Izaguirre, M. Menéndez, and I.J. Losada (2010), Pseudo-optimal parameter selection of non-stationary generalized extreme value models for environmental variables, Environ. Modell. Softw. 25, 1592-1607. [2] Pineda, L., P. Willems (2016), Multisite downscaling of seasonal predictions to daily rainfall characteristics over Pacific-Andean river basins in Ecuador and Peru using a non-homogeneous hidden Markov model, J. Hydrometeor. 17(2), 481-498, doi:10.1175/JHM-D-15-0040.1.
Global, Energy-Dependent Ring Current Response During Two Large Storms
NASA Astrophysics Data System (ADS)
Goldstein, J.; Angelopoulos, V.; Burch, J. L.; De Pascuale, S.; Fuselier, S. A.; Genestreti, K. J.; Kurth, W. S.; LLera, K.; McComas, D. J.; Reeves, G. D.; Spence, H. E.; Valek, P. W.
2015-12-01
Two recent large (~200 nT) geomagnetic storms occurred during 17-18 March 2015 and 22-23 June 2015. The global, energy-dependent ring current response to these two extreme events is investigated using both global imaging and multi-point in situ observations. Energetic neutral atom (ENA) imaging by the Two Wide-angle Imaging Neutral-atom Spectrometers (TWINS) mission provides a global view of ring current ions. Local measurements are provided by two multi-spacecraft missions. The two Van Allen Probes measure in situ plasma (including ion composition) and fields at ring current and plasmaspheric L values. The recently launched Magnetospheric Multiscale (MMS) mission comprises four spacecraft that have just begun to measure particles (including ion composition) and fields at outer magnetospheric L values. We analyze the timing and energetics of the stormtime evolution of ring current ions, both trapped and precipitating, using TWINS ENA images and in situ data from the Van Allen Probes and MMS.
NASA Astrophysics Data System (ADS)
Zhou, Weijie; Dang, Yaoguo; Gu, Rongbao
2013-03-01
We apply multifractal detrending moving average (MFDMA) analysis to investigate and compare the efficiency and multifractality of the 5-min high-frequency China Securities Index 300 (CSI 300). The results show that the CSI 300 market became closer to weak-form efficiency after the introduction of the CSI 300 index futures. We find that the CSI 300 is characterized by multifractality, and that there is less complexity and risk after the index futures were introduced. With shuffling, surrogating and extreme-value-removal procedures, we show that extreme events and the fat-tailed distribution are the main origins of the multifractality. Besides, we discuss the knotting phenomenon in multifractality, and find that the scaling range and the irregular fluctuations at large scales in the Fq(s) vs. s plot can cause a knot.
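The shuffling and extreme-value-removal procedures mentioned above can be sketched as simple preprocessing steps; the series below is a synthetic fat-tailed stand-in, not CSI 300 data.

```python
import numpy as np

rng = np.random.default_rng(4)
# Synthetic fat-tailed stand-in for 5-min returns (Student's t, 3 d.o.f.).
returns = rng.standard_t(df=3, size=5000)

# Shuffling destroys temporal correlations while preserving the
# distribution: comparing multifractal spectra of the original and the
# shuffled series isolates correlation-driven multifractality.
shuffled = rng.permutation(returns)

# Removing extreme values (here the largest 1% in absolute value) tests
# how much of the multifractality stems from extreme events.
cut = np.quantile(np.abs(returns), 0.99)
trimmed = returns[np.abs(returns) < cut]
print(len(returns), len(trimmed))
```

Re-estimating the multifractal spectrum on each transformed series then attributes the multifractality to correlations, to the fat-tailed distribution, or to extreme events.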
NASA Astrophysics Data System (ADS)
Zhang, Yin; Xia, Jun; She, Dunxian
2018-01-01
In recent decades, extreme precipitation events have been a research hotspot worldwide. Based on 12 extreme precipitation indices, the spatiotemporal variation and statistical characteristics of precipitation extremes in the middle reaches of the Yellow River Basin (MRYRB) during 1960-2013 were investigated. The results showed that the values of most extreme precipitation indices (except consecutive dry days (CDD)) increased from the northwest to the southeast of the MRYRB, reflecting that the southeast was the wettest region in the study area. Temporally, the precipitation extremes presented a drying trend with less frequent precipitation events. The generalized extreme value (GEV) distribution was selected to fit the time series of all indices, and the quantile values for the 50-year return period showed a spatial pattern similar to that of the corresponding extreme precipitation indices during 1960-2013, indicating a higher risk of extreme precipitation in the southeast of the MRYRB. Furthermore, the changes in the probability distribution functions of the indices between the periods 1960-1986 and 1987-2013 revealed a drying tendency in the study area. Both the El Niño-Southern Oscillation (ENSO) and the Pacific Decadal Oscillation (PDO) were shown to have a strong influence on precipitation extremes in the MRYRB. The results of this study are useful for understanding the changing behavior of local precipitation extremes, which will help in preventing the natural hazards they cause.
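One standard way to test the kind of between-period distribution change described above is a two-sample Kolmogorov-Smirnov test; the index values below are synthetic illustrations, not MRYRB data.

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(2)
# Synthetic annual values of an extreme-precipitation index for two
# 27-year sub-periods (illustrative only).
period1 = rng.gamma(shape=2.0, scale=20.0, size=27)   # e.g. 1960-1986
period2 = rng.gamma(shape=2.0, scale=16.0, size=27)   # e.g. 1987-2013, assumed drier

# Two-sample Kolmogorov-Smirnov test for a shift in the distribution.
stat, p_value = ks_2samp(period1, period2)
print(stat, p_value)
```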
Extreme value analysis in biometrics.
Hüsler, Jürg
2009-04-01
We review some approaches to extreme value analysis in the context of biometrical applications. Classical extreme value analysis is based on iid random variables. Two different general methods are applied, which are discussed together with biometrical examples. Different estimation, testing, and goodness-of-fit procedures for applications are discussed. Furthermore, some non-classical situations are considered where the data are possibly dependent, where non-stationary behavior is observed in the data, or where the observations are not univariate. A few open problems are also stated.
A hadronic origin for ultra-high-frequency-peaked BL Lac objects
NASA Astrophysics Data System (ADS)
Cerruti, M.; Zech, A.; Boisson, C.; Inoue, S.
2015-03-01
Current Cherenkov telescopes have identified a population of ultra-high-frequency-peaked BL Lac objects (UHBLs), also known as extreme blazars, that exhibit exceptionally hard TeV spectra, including 1ES 0229+200, 1ES 0347-121, RGB J0710+591, 1ES 1101-232, and 1ES 1218+304. Although one-zone synchrotron-self-Compton (SSC) models have been generally successful in interpreting the high-energy emission observed in other BL Lac objects, they are problematic for UHBLs, necessitating very large Doppler factors and/or extremely high minimum Lorentz factors of the emitting leptonic population. In this context, we have investigated alternative scenarios where hadronic emission processes are important, using a newly developed (lepto-)hadronic numerical code to systematically explore the physical parameters of the emission region that reproduce the observed spectra while avoiding the extreme values encountered in pure SSC models. Assuming a fixed Doppler factor δ = 30, two principal parameter regimes are identified, where the high-energy emission is due to: (1) proton-synchrotron radiation, with magnetic fields B ~ 1-100 G and maximum proton energies E_p,max ≲ 10^19 eV; and (2) synchrotron emission from p-γ-induced cascades as well as SSC emission from primary leptons, with B ~ 0.1-1 G and E_p,max ≲ 10^17 eV. This can be realized with plausible, sub-Eddington values for the total (kinetic plus magnetic) power of the emitting plasma, in contrast to hadronic interpretations for other blazar classes that often require highly super-Eddington values.
Extreme rainfall events: Learning from raingauge time series
NASA Astrophysics Data System (ADS)
Boni, G.; Parodi, A.; Rudari, R.
2006-08-01
This study analyzes the historical records of annual rainfall maxima recorded in Northern Italy, cumulated over time windows (durations) of 1 and 24 h and considered paradigmatic descriptions of storms of short and long duration, respectively. Three large areas are studied: Liguria, Piedmont and Triveneto (Triveneto includes the regions of Veneto, Trentino Alto Adige and Friuli Venezia Giulia). A regional frequency analysis of annual rainfall maxima is carried out through the Two-Component Extreme Value (TCEV) distribution. A hierarchical approach is used to define statistically homogeneous areas so that the definition of a regional distribution becomes possible. Thanks to the peculiar nature of the TCEV distribution, a frequency-based threshold criterion is proposed that allows the observed ordinary values of annual rainfall maxima to be distinguished from the extra-ordinary ones. A second step of the study focuses on the probability of occurrence of extra-ordinary events over a period of one year. Results show the existence of a four-month dominant season that maximizes the number of occurrences of annual rainfall maxima; they also show how the seasonality of extra-ordinary events changes when a different event duration is considered. The joint probability of occurrence of extreme storms of short and long duration is also analyzed, demonstrating that this joint probability changes significantly depending on whether all rainfall maxima or only extra-ordinary maxima are used. A critical discussion of the results suggests that the identified statistical characteristics may be the landmark of the mechanisms causing heavy precipitation in the analyzed regions.
Extreme-event geoelectric hazard maps: Chapter 9
Love, Jeffrey J.; Bedrosian, Paul A.
2018-01-01
Maps of geoelectric amplitude, covering about half the continental United States, are presented for amplitudes that will be exceeded, on average, once per century in response to an extreme-intensity geomagnetic disturbance. These maps are constructed using an empirical parameterization of induction: convolving latitude-dependent statistical maps of extreme-value geomagnetic disturbance, obtained from decades of 1-minute magnetic-observatory data, with local estimates of Earth-surface impedance obtained at discrete geographic sites from magnetotelluric surveys. Geoelectric amplitudes are estimated for geomagnetic waveforms having a 240-s (and 1200-s) sinusoidal period and amplitudes over 10 min (1 h) that exceed a once-per-century threshold. As a result of the combination of geographic differences in geomagnetic variation and Earth-surface impedance, once-per-century geoelectric amplitudes span more than two orders of magnitude and are a highly granular function of location. Specifically, for north-south 240-s induction, once-per-century geoelectric amplitudes across large parts of the United States have a median value of 0.34 V/km; for east-west variation, they have a median value of 0.23 V/km. In northern Minnesota, amplitudes exceed 14.00 V/km for north-south geomagnetic variation (23.34 V/km for east-west variation), while just over 100 km away, amplitudes are only 0.08 V/km (0.02 V/km). At some sites in the northern-central United States, once-per-century geoelectric amplitudes exceed the 2 V/km realized in Québec during the March 1989 storm.
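For a single sinusoidal waveform, the convolution of geomagnetic disturbance with surface impedance reduces to E = |Z|·H with H = B/μ0. A minimal sketch with illustrative numbers (the field amplitude and impedance magnitude are assumptions, not values from the chapter):

```python
import math

MU0 = 4e-7 * math.pi              # vacuum permeability (H/m)

def geoelectric_amplitude(B_nT, Z_ohm):
    """Geoelectric amplitude E = |Z| * H (in V/km) for a sinusoidal
    geomagnetic variation of amplitude B_nT (nT) over ground with
    surface-impedance magnitude Z_ohm (ohms)."""
    H = B_nT * 1e-9 / MU0         # magnetic field strength, A/m
    return Z_ohm * H * 1000.0     # V/m -> V/km

# Illustrative: a 500 nT variation over ground with |Z| = 1e-3 ohm gives
# an amplitude of the same order as the reported median values.
print(geoelectric_amplitude(500.0, 1e-3))
```

The two-orders-of-magnitude spread in the maps then follows directly from the geographic spread of |Z| between conductive and resistive terrains.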
NASA Astrophysics Data System (ADS)
Avanzi, Francesco; De Michele, Carlo; Gabriele, Salvatore; Ghezzi, Antonio; Rosso, Renzo
2015-04-01
Here, we show how atmospheric circulation and topography rule the variability of depth-duration-frequency (DDF) curve parameters, and we discuss the physical implications of this variability for the formation of extreme precipitation at high elevations. A DDF curve gives the value of the maximum annual precipitation H as a function of duration D and probability level F. We consider around 1500 stations over the Italian territory, with at least 20 years of data on maximum annual precipitation depth at different durations. We estimated the DDF parameters at each location by using the asymptotic distribution of extreme values, i.e. the Generalized Extreme Value (GEV) distribution, under a statistical simple-scale-invariance hypothesis. Consequently, a DDF curve depends on five parameters. A first set relates H to the duration (namely, the mean value of annual maximum precipitation depth for unit duration and the scaling exponent), while a second set links H to F (namely, a scale, position and shape parameter). The value of the shape parameter determines the type of random variable (unbounded, upper bounded or lower bounded). This extensive analysis shows that the variability of the mean value of annual maximum precipitation depth for unit duration reflects the coupled effect of topography and the modal direction of moisture flux during extreme events. Median values of this parameter decrease with elevation. We call this phenomenon the "reverse orographic effect" on extreme precipitation of short duration, since it contrasts with general knowledge about the orographic effect on mean precipitation. Moreover, the scaling exponent is mainly driven by topography alone (with values increasing with elevation). Therefore, the quantiles of H(D,F) at durations greater than the unit duration turn out to be more variable at high elevations than at low elevations.
Additionally, the analysis of the variability of the shape parameter with elevation shows that extreme events at high elevations appear to be distributed according to an upper-bounded probability distribution. This evidence could be a characteristic sign of how extreme precipitation events form at high elevations.
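Under the simple-scaling hypothesis described above, a DDF curve can be sketched as H(D, F) = m1 · D^n · K(F), where K(F) is a dimensionless GEV growth factor. All parameter values below are illustrative assumptions, not estimates from the Italian data set.

```python
from scipy.stats import genextreme

# Illustrative parameters (not estimated from the study's stations):
m1, n = 30.0, 0.4                     # mean 1-h annual maximum (mm); scaling exponent
shape, loc, scale = -0.1, 0.9, 0.25   # dimensionless GEV growth-curve parameters

def ddf(D, F):
    """Quantile of maximum annual precipitation depth (mm) for duration
    D (hours) and non-exceedance probability F, under simple scaling."""
    K = genextreme.ppf(F, shape, loc=loc, scale=scale)  # growth factor K(F)
    return m1 * D**n * K

print(ddf(1.0, 0.99), ddf(24.0, 0.99))
```

In this formulation the first parameter set is (m1, n) and the second is (shape, loc, scale), matching the five-parameter count given in the abstract.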
Extreme events in total ozone over Arosa - Part 1: Application of extreme value theory
NASA Astrophysics Data System (ADS)
Rieder, H. E.; Staehelin, J.; Maeder, J. A.; Peter, T.; Ribatet, M.; Davison, A. C.; Stübi, R.; Weihs, P.; Holawe, F.
2010-10-01
In this study, ideas from extreme value theory are applied for the first time in the field of stratospheric ozone research, because statistical analysis showed that previously used concepts assuming a Gaussian distribution (e.g. fixed deviations from mean values) of total ozone data do not adequately address the structure of the extremes. We show that statistical extreme value methods are appropriate for identifying ozone extremes and describing the tails of the Arosa (Switzerland) total ozone time series. In order to accommodate the seasonal cycle in total ozone, a daily moving threshold was determined and used, with tools from extreme value theory, to analyse the frequency of days with extremely low (termed ELOs) and extremely high (termed EHOs) total ozone at Arosa. The analysis shows that the generalized Pareto distribution (GPD) provides an appropriate model for the frequency distribution of total ozone above or below a mathematically well-defined threshold, thus providing a statistical description of ELOs and EHOs. The results show an increase in ELOs and a decrease in EHOs during the last decades. The fitted model represents the tails of the total ozone data set with high accuracy over the entire range (including absolute monthly minima and maxima), and enables a precise computation of the frequency distribution of ozone mini-holes (using constant thresholds). Analyzing the tails instead of a small fraction of days below constant thresholds provides deeper insight into the time-series properties. Fingerprints of dynamical (e.g. ENSO, NAO) and chemical features (e.g. strong polar-vortex ozone loss), and of major volcanic eruptions, can be identified in the observed frequency of extreme events throughout the time series. Overall, the new approach to the analysis of extremes provides more information on time-series properties and variability than previous approaches that use only monthly averages and/or mini-holes and mini-highs.
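The peaks-over-threshold step behind the GPD fit can be sketched as follows; the anomalies are synthetic, and a constant high quantile stands in for the paper's daily moving threshold.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(3)
# Synthetic deseasonalized daily total-ozone anomalies (Dobson units).
anom = rng.normal(0.0, 20.0, size=20000)

u = np.quantile(anom, 0.97)        # high threshold (stand-in for EHO detection)
exceedances = anom[anom > u] - u   # excesses over the threshold

# Fit a generalized Pareto distribution to the excesses (location fixed at 0).
shape, _, scale = genpareto.fit(exceedances, floc=0.0)
print(shape, scale)
```

The ELO side works the same way after negating the series, so that extreme lows become threshold exceedances.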
Data Mining of Extremely Large Ad Hoc Data Sets to Produce Inverted Indices
2016-06-01
Naval Postgraduate School, Monterey, California. Master's thesis: Data Mining of Extremely Large Ad Hoc Data Sets to Produce Inverted Indices. Approved for public release; distribution is unlimited.
Nearly extremal apparent horizons in simulations of merging black holes
NASA Astrophysics Data System (ADS)
Lovelace, Geoffrey; Scheel, Mark; Owen, Robert; Giesler, Matthew; Katebi, Reza; Szilagyi, Bela; Chu, Tony; Demos, Nicholas; Hemberger, Daniel; Kidder, Lawrence; Pfeiffer, Harald; Afshari, Nousha; SXS Collaboration
2015-04-01
The spin S of a Kerr black hole is bounded by the surface area A of its apparent horizon: 8πS ≤ A. We present recent results (arXiv:1411.7297) for the extremality of apparent horizons for merging, rapidly rotating black holes with equal masses and equal spins aligned with the orbital angular momentum. Measuring the area and (using approximate Killing vectors) the spin on the individual and common apparent horizons, we find that the inequality 8πS < A is satisfied but is very close to equality on the common apparent horizon at the instant it first appears, even for initial spins as large as S/M^2 = 0.994. We compute the smallest value e_0 that Booth and Fairhurst's extremality parameter can take for any scaling of the horizon's null normal vectors, concluding that the common horizons are at least moderately close to extremal just after they appear. We construct binary-black-hole initial data with marginally trapped surfaces with 8πS > A and e_0 > 1, but these surfaces are always surrounded by apparent horizons with 8πS < A and e_0 < 1.
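For an isolated Kerr hole, the bound 8πS ≤ A can be checked directly from the standard formulas A = 8πM²(1 + √(1 − χ²)) and S = χM² in geometric units (G = c = 1). This is a textbook check, not the paper's numerical-relativity measurement on dynamical horizons.

```python
import math

def kerr_area(M, chi):
    """Horizon area of a Kerr black hole: A = 8*pi*M^2*(1 + sqrt(1 - chi^2))."""
    return 8.0 * math.pi * M**2 * (1.0 + math.sqrt(1.0 - chi**2))

def kerr_spin(M, chi):
    """Spin angular momentum: S = chi * M^2."""
    return chi * M**2

M, chi = 1.0, 0.994            # near-extremal spin, as in the simulations
gap = kerr_area(M, chi) - 8.0 * math.pi * kerr_spin(M, chi)
print(gap)                      # positive; shrinks to zero as chi -> 1
```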
Trend of annual temperature and frequency of extreme events in the MATOPIBA region of Brazil
NASA Astrophysics Data System (ADS)
Salvador, Mozar de A.; de Brito, J. I. B.
2017-06-01
During the 1980s, a new agricultural frontier arose in Brazil, occupying parts of the states of Maranhão, Tocantins, Piauí, and Bahia. This frontier is currently known as the MATOPIBA region. The region has gone through intense transformations in its social and environmental characteristics, with the emergence of extensive areas of intensive agriculture and large herds. The purpose of this research was to study the climatic variability of temperature in the MATOPIBA region through the extreme climate indices of the ClimAp tool. Data from 11 weather stations were analyzed for yearly air temperature (maximum and minimum) over the period 1970 to 2012. To verify trends in the series, we used linear regression analysis and the Kendall tau test. The annual analysis of maximum and minimum temperatures and of the temperature extreme indices showed a strong positive trend in practically every series (with p-values less than 0.05). These results indicate that the region underwent a significant warming process over the last three decades. The extreme indices also showed a significant positive trend at most of the analyzed stations, indicating a higher frequency of warm days during the year.
Intensification of seasonal temperature extremes prior to the 2°C global warming target
NASA Astrophysics Data System (ADS)
Anderson, B. T.
2011-12-01
Given current international efforts to limit human-induced global-mean near-surface temperature increases to 2°C, relative to the pre-industrial era, there is an interest in determining what unavoidable impacts to physical, biological, and socio-economic systems might occur even if this target were met. In our research we show that substantial fractions of the globe could experience seasonal-mean temperature extremes with unprecedented regularity, even if the global-mean temperature remains below the 2°C target currently envisioned. These results have significant implications for agriculture and crop yield; disease and human health; and ecosystems and biodiversity. To obtain these results, we first develop a novel method for combining numerical-model estimates of near-term increases in grid-point temperatures with stochastically generated anomalies derived from high-resolution observations during the last half of the 20th century. This method has practical advantages because it generates results at fine spatial resolution without relying on computationally intensive regional-model experiments; it explicitly incorporates information derived from the observations regarding interannual-to-decadal variations in seasonal-mean temperatures; and it includes the generation of thousands of realizations of the possible impacts of a global-mean temperature increase on local occurrences of hot extremes. Using this method we find that, even given the "committed" future global-mean temperature increase of 0.6°C (1.4°C relative to the pre-industrial era), historical seasonal-mean temperature extremes will be exceeded in at least half of all years (equivalently, the historical extreme values will become the norm) for much of Africa, the southeastern and central portions of Asia, Indonesia, and the Amazon.
Should the global-mean temperature increase reach 2°C (relative to the pre-industrial era), it is more likely than not that these same regions, along with large portions of western North America, will experience historical seasonal-mean temperature extremes every single year. Further, the current historical extreme values will effectively become the norm for approximately 70-80% of the Earth's land surface.
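The counting step behind these exceedance statistics can be sketched as follows; the Gaussian anomaly model and the 1.4°C shift are illustrative assumptions, not the study's calibrated fields.

```python
import numpy as np

rng = np.random.default_rng(5)
# Synthetic historical seasonal-mean temperature anomalies (deg C) at one
# grid point, and their observed extreme.
historical = rng.normal(0.0, 0.5, size=50)
hist_extreme = historical.max()

# Shift stochastically generated anomalies by an assumed mean warming and
# count how often the historical extreme is exceeded.
warming = 1.4
future = warming + rng.normal(0.0, 0.5, size=10000)
frac_exceed = float(np.mean(future > hist_extreme))
print(frac_exceed)
```

A fraction above 0.5 at a grid point corresponds to the "historical extremes become the norm" criterion used in the study.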
NASA Astrophysics Data System (ADS)
Kaba, M.; Zhou, F. C.; Lim, A.; Decoster, D.; Huignard, J.-P.; Tonda, S.; Dolfi, D.; Chazelas, J.
2007-11-01
The applications of microwave optoelectronics are extremely broad, extending from Radio-over-Fibre to homeland security and defence systems. The improved maturity of optoelectronic components operating up to 40 GHz makes it possible to consider new optical processing functions (filtering, beamforming, ...) that can operate over very wideband microwave analogue signals. Specific performance is required, implying optical delay lines able to exhibit large time-bandwidth products. We propose to evaluate a slow-light approach based on highly dispersive structures built from either uniform or chirped Bragg gratings. We highlight the impact of the major parameters of such structures (index modulation depth, grating length, grating period, chirp coefficient) and demonstrate the high potential of Bragg gratings for processing large-bandwidth RF signals under slow-light propagation.
Reference values for physical performance measures in the aging working population.
Cote, Mark P; Kenny, Anne; Dussetschleger, Jeffrey; Farr, Dana; Chaurasia, Ashok; Cherniack, Martin
2014-02-01
The aim of this study was to determine reference physical performance values in older aging workers. Cross-sectional physical performance measures were collected for 736 manufacturing workers to assess effects of work and nonwork factors on age-related changes in musculoskeletal function and health. Participants underwent surveys and physical testing that included bioelectrical impedance analysis, range-of-motion measures, exercise testing, and dynamic assessment. Physical characteristics, such as blood pressure and body fat percentage, were comparable to published values. Dynamic and range-of-motion measurements differed from published normative results. Women had age-related decreases in cervical extension and lateral rotation. Older men had better spinal flexion than expected. Predicted age-related decline in lower-extremity strength and shoulder strength in women was not seen. Men declined in handgrip, lower-extremity strength, and knee extension strength, but not trunk strength, across age groups. There was no appreciable decline in muscle fatigue at the trunk, shoulder, and knee with aging for either gender, except for the youngest age group of women. Normative values may underestimate physical performance in "healthy" older workers, thereby underappreciating declines in less healthy older workers. Work may be preservative of function for a large group of selected individuals. A "healthy worker effect" may be greater for musculoskeletal disease and function than for heart disease and mortality. Clinicians and researchers studying musculoskeletal function in older workers can use a more specific set of reference values.
Estimation of muscle torque in various combat sports.
Pędzich, Wioletta; Mastalerz, Andrzej; Sadowski, Jerzy
2012-01-01
The purpose of the research was to compare the muscle torques of elite combat sport groups. Twelve taekwondo WTF athletes, twelve taekwondo ITF athletes and nine boxers participated in the study. Muscle torques were measured under static conditions on a dedicated stand belonging to the Department of Biomechanics. The sum of relative muscle torques of the right and left lower extremities was significantly higher for taekwondo WTF athletes than for boxers (16%, p < 0.001 for the right and 10%, p < 0.05 for the left extremities) and taekwondo ITF athletes (10%, p < 0.05 for the right and 8% for the left extremities). Taekwondo ITF athletes attained significantly higher absolute muscle torque values than boxers for the elbow flexors (20%, p < 0.05 for the right and 11% for the left extremities) and extensors (14% for the right and 18%, p < 0.05 for the left extremities), and for the shoulder flexors (10% for the right and 12%, p < 0.05 for the left extremities) and extensors (11% for the right and 1% for the left extremities). Taekwondo WTF and taekwondo ITF athletes obtained significantly different relative values of muscle torque for the hip flexors (16%, p < 0.05) and extensors (11%, p < 0.05) of the right extremities.
Capesius, Joseph P.; Arnold, L. Rick
2012-01-01
The Mass Balance results were quite variable over time, to a degree that appeared inconsistent with the concept of groundwater flow as gradual and slow. The large day-to-day and month-to-month variability in the Mass Balance results is likely the product of many factors, which could include ungaged stream inflows or outflows, short-term streamflow losses to and gains from temporary bank storage, and any lag in streamflow accounting owing to the travel time of flow within a reach. The Pilot Point time-series results were much less variable than the Mass Balance results, and extreme values were effectively constrained. Less day-to-day variability, smaller-magnitude extreme values, and smoother transitions in base-flow estimates make the Pilot Point method more consistent with a conceptual model of groundwater flow as gradual and slow. The Pilot Point method provided a better fit to the conceptual model of groundwater flow and appeared to provide reasonable estimates of base flow.
NASA Astrophysics Data System (ADS)
Darko, Deborah; Adjei, Kwaku A.; Appiah-Adjei, Emmanuel K.; Odai, Samuel N.; Obuobie, Emmanuel; Asmah, Ruby
2018-06-01
The extent to which statistically bias-adjusted outputs of two regional climate models alter the projected change signals for mean (and extreme) rainfall and temperature over the Volta Basin is evaluated. The outputs from two regional climate models in the Coordinated Regional Climate Downscaling Experiment for Africa (CORDEX-Africa) are bias adjusted using the quantile mapping technique. Annual maxima of rainfall and temperature, with their 10- and 20-year return values for the present (1981-2010) and future (2051-2080) climates, are estimated using extreme value analyses. Moderate extremes are evaluated using extreme indices (viz. percentile-based, duration-based, and intensity-based). Bias adjustment of the original (bias-unadjusted) models improves the reproduction of mean rainfall and temperature for the present climate. However, the bias-adjusted models poorly reproduce the 10- and 20-year return values for rainfall and maximum temperature, whereas the extreme indices are reproduced satisfactorily for the present climate. Consequently, projected changes in rainfall and temperature extremes are weak. The bias adjustment reduces the change signals for mean rainfall, while the mean temperature signals are rather magnified. The projected changes for the original mean climate and extremes are not conserved after bias adjustment, with the exception of the duration-based extreme indices.
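The quantile mapping technique used for bias adjustment matches the modeled distribution to the observed one quantile by quantile. Below is a minimal empirical sketch with synthetic data; the study's actual implementation may differ in details such as the treatment of distribution tails and dry days.

```python
import numpy as np

def quantile_map(model_hist, obs_hist, model_fut, n_q=100):
    """Empirical quantile mapping: build a correction that maps the historical
    model distribution onto the observed one, then apply it to model values."""
    q = np.linspace(0, 1, n_q)
    mq = np.quantile(model_hist, q)   # model quantiles (historical period)
    oq = np.quantile(obs_hist, q)     # observed quantiles (same period)
    # Each value is mapped via its historical model quantile to the
    # corresponding observed quantile (piecewise-linear interpolation).
    return np.interp(model_fut, mq, oq)

rng = np.random.default_rng(1)
obs = rng.gamma(2.0, 5.0, 3000)     # "observed" daily rainfall (synthetic)
model = rng.gamma(2.0, 7.0, 3000)   # biased model output: too wet
adjusted = quantile_map(model, obs, model)
```

Applied to the historical period itself, as here, the adjusted distribution closely reproduces the observed one; the caveat noted in the abstract is that this does not guarantee faithful return levels or preserved change signals.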
Invariant Imbedded T-Matrix Method for Axial Symmetric Hydrometeors with Extreme Aspect Ratios
NASA Technical Reports Server (NTRS)
Pelissier, Craig; Kuo, Kwo-Sen; Clune, Thomas; Adams, Ian; Munchak, Stephen
2017-01-01
The single-scattering properties (SSPs) of hydrometeors are the fundamental quantities for physics-based precipitation retrievals. Thus, efficient computation of their electromagnetic scattering is of great value. Whereas the semi-analytical T-Matrix methods are likely the most efficient for nonspherical hydrometeors with axial symmetry, they are not suitable for arbitrarily shaped hydrometeors lacking any significant symmetry, for which volume integral methods such as those based on the Discrete Dipole Approximation (DDA) are required. Currently the two leading T-matrix methods are the Extended Boundary Condition Method (EBCM) and the Invariant Imbedding T-matrix Method incorporating Lorentz-Mie Separation of Variables (IITM+SOV). EBCM is known to outperform IITM+SOV for hydrometeors with modest aspect ratios. However, in cases when aspect ratios become extreme, such as needle-like particles with large height-to-diameter values, EBCM fails to converge. Hydrometeors with such extreme aspect ratios are known to be present in solid precipitation, and their SSPs are required to model the radiative responses accurately. In these cases, IITM+SOV is shown to converge. An efficient, parallelized C++ implementation of both EBCM and IITM+SOV has been developed to conduct a performance comparison between EBCM, IITM+SOV, and DDSCAT (a popular implementation of DDA). We present the comparison results and discuss the details. Our intent is to release the combined EBCM and IITM+SOV software to the community under an open source license.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mor, Rivay; Netzer, Hagai; Trakhtenbrot, Benny
We report new Herschel observations of 25 z ≈ 4.8 extremely luminous optically selected active galactic nuclei (AGNs). Five of the sources have extremely large star-forming (SF) luminosities, L_SF, corresponding to SF rates (SFRs) of 2800-5600 M_Sun yr^-1 assuming a Salpeter initial mass function. The remaining sources have only upper limits on their SFRs, but stacking their Herschel images results in a mean SFR of 700 ± 150 M_Sun yr^-1. The higher SFRs in our sample are comparable to the highest values observed so far at any redshift. Our sample does not contain obscured AGNs, which enables us to investigate several evolutionary scenarios connecting supermassive black holes and SF activity in the early universe. The most probable scenario is that we are witnessing the peak of SF activity in some sources and the beginning of the post-starburst decline in others. We suggest that all 25 sources, which are at their peak AGN activity, are in large mergers. AGN feedback may be responsible for diminishing the SF activity in 20 of them, but is not operating efficiently in 5 others.
VizieR Online Data Catalog: 2nd and 3rd parameters of HB of globular clusters (Gratton+, 2010)
NASA Astrophysics Data System (ADS)
Gratton, R. G.; Carretta, E.; Bragaglia, A.; Lucatello, S.; D'Orazi, V.
2010-05-01
The second parameter (the first being metallicity) defining the distribution of stars on the horizontal branch (HB) of globular clusters (GCs) has long been one of the major open issues in our understanding of the evolution of normal stars. Large photometric and spectroscopic databases are now available: they include large and homogeneous sets of colour-magnitude diagrams, cluster ages, and homogeneous data on chemical compositions from our FLAMES survey. We use these databases to re-examine this issue. We use the photometric data to derive median and extreme (i.e., the values including 90% of the distribution) colours and magnitudes of stars along the HB for about a hundred GCs. We transform these into median and extreme masses of stars on the HB, using the models developed by the Pisa group and taking evolutionary effects into account. We compare these masses with those expected at the tip of the red giant branch (RGB) to derive the total mass lost by the stars. (11 data files).
Thermoelectric Properties of Nanograined Si-Ge-Au Thin Films Grown by Molecular Beam Deposition
NASA Astrophysics Data System (ADS)
Nishino, Shunsuke; Ekino, Satoshi; Inukai, Manabu; Omprakash, Muthusamy; Adachi, Masahiro; Kiyama, Makoto; Yamamoto, Yoshiyuki; Takeuchi, Tsunehiro
2018-06-01
Conditions for achieving an extremely large Seebeck coefficient and extremely small thermal conductivity in Si-Ge-Au thin films formed of nanosized grains precipitated in an amorphous matrix have been investigated. We employed molecular beam deposition to prepare Si1-xGexAuy thin films on sapphire substrates. The deposited films were annealed under a nitrogen gas atmosphere at 300°C to 500°C for 15 min to 30 min. Nanocrystals dispersed in the amorphous matrix were clearly observed by transmission electron microscopy. We did not observe an anomalously large Seebeck coefficient, but a very low thermal conductivity of nearly 1.0 W K-1 m-1 was found at around 0.2 < x < 0.6. The compositional dependence of the thermal conductivity was well accounted for by the compositional dependence of the mixing entropy. Some of these values agree exactly with the amorphous limit predicted by theoretical calculations. The smallest lattice thermal conductivity found for the present samples is lower than that of nanostructured Si-Ge bulk material, for which a dimensionless figure of merit of ZT ≈ 1 was reported at high temperature.
Closed-Form Evaluation of Mutual Coupling in a Planar Array of Circular Apertures
NASA Technical Reports Server (NTRS)
Bailey, M. C.
1996-01-01
The integral expression for the mutual admittance between circular apertures in a planar array is evaluated in closed form. Very good accuracy is realized when compared with values that were obtained by numerical integration. Utilization of this closed-form expression, for all element pairs that are separated by more than one element spacing, yields extremely accurate results and significantly reduces the computation time that is required to analyze the performance of a large electronically scanning antenna array.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lis, Jakub
In this paper, we investigate the Q-ball Ansatz in the baby Skyrme model. First, the appearance of peakons, i.e. solutions with extremely large absolute values of the second derivative at maxima, is analyzed. It is argued that such solutions are intrinsic to the baby Skyrme model and do not depend on the detailed form of a potential used in calculations. Next, we concentrate on compact nonspinning Q-balls. We show the failure of a small parameter expansion in this case. Finally, we explore the existence and parameter dependence of Q-ball solutions.
Fontanesi, Luca; Bertolini, Francesca; Scotti, Emilio; Schiavo, Giuseppina; Colombo, Michela; Trevisi, Paolo; Ribani, Anisa; Buttazzoni, Luca; Russo, Vincenzo; Dall'Olio, Stefania
2015-01-01
The GPR120 gene (also known as FFAR4 or O3FAR1) encodes a functional omega-3 fatty acid receptor/sensor that mediates potent insulin-sensitizing effects by repressing macrophage-induced tissue inflammation. Given its functional role, GPR120 could be considered a potential target gene in animal nutrigenetics. In this work we resequenced the porcine GPR120 gene by high-throughput Ion Torrent semiconductor sequencing of amplified fragments obtained from 8 DNA pools derived, in total, from 153 pigs of different breeds/populations (two Italian Large White pools, Italian Duroc, Italian Landrace, Casertana, Pietrain, Meishan, and wild boars). Three single nucleotide polymorphisms (SNPs), two synonymous substitutions and one in the putative 3'-untranslated region (g.114765469C > T), were identified, and their allele frequencies were estimated from sequencing read counts. The g.114765469C > T SNP was also genotyped by PCR-RFLP, confirming the estimated frequency in the Italian Large White pools. This SNP was then analyzed in two Italian Large White cohorts using a selective genotyping approach based on pigs with extreme and divergent estimated breeding values (EBVs) for back fat thickness (BFT) and average daily gain (ADG). Significant differences in allele and genotype frequency distributions were observed between the extreme ADG-EBV groups (P < 0.001), whereas this marker was not associated with BFT-EBV.
Compressible Fabry-Perot refractometer.
Andersson, M; Eliasson, L; Pendrill, L R
1987-11-15
The use of a long, thermally stable Fabry-Perot etalon as a refractometer is considered in detail in this study of the refractive index of air. The etalon consists of two flat plates of fused silica 60 mm in diameter, with a cylindrical spacer made of Zerodur (a polycrystalline glass ceramic of extremely low thermal expansion) 200 mm long. The interferogram of light from a frequency-stabilized He-Ne laser is imaged with large-diameter mirror optics. The principal result is a demonstration of the effects of changes in atmospheric pressure on the etalon. The measured refractive-index values deviate by 2 parts in 10^7 from calculated values. Possible causes of error are considered in detail.
Heidelberg Retina Tomography Analysis in Optic Disks with Anatomic Particularities
Alexandrescu, C; Pascu, R; Ilinca, R; Popescu, V; Ciuluvica, R; Voinea, L; Celea, C
2010-01-01
Due to its objectivity, reproducibility, and predictive value, confirmed by many large-scale statistical clinical studies, Heidelberg Retina Tomography has become one of the most widely used computerized image analyses of the optic disc in glaucoma. It has been noted, however, that the diagnostic value of Moorfields Regression Analysis and the Glaucoma Probability Score decreases when analyzing optic discs of extreme size. The number of false positive results increases in cases of megalopapillae, and the number of false negative results increases in cases of small optic discs. The present paper is a review of the aspects one should take into account when analyzing an HRT result for an optic disc with anatomic particularities. PMID:21254731
Robust, nonlinear, high angle-of-attack control design for a supermaneuverable vehicle
NASA Technical Reports Server (NTRS)
Adams, Richard J.
1993-01-01
High angle-of-attack flight control laws are developed for a supermaneuverable fighter aircraft. The methods of dynamic inversion and structured singular value synthesis are combined into an approach which addresses both the nonlinearity and robustness problems of flight at extreme operating conditions. The primary purpose of the dynamic inversion control elements is to linearize the vehicle response across the flight envelope. Structured singular value synthesis is used to design a dynamic controller which provides robust tracking of pilot commands. The resulting control system achieves desired flying qualities and guarantees a large margin of robustness to uncertainties for high angle-of-attack flight conditions. The results of linear simulation and structured singular value stability analysis are presented to demonstrate satisfaction of the design criteria. High fidelity nonlinear simulation results show that the combined dynamic inversion/structured singular value synthesis control law achieves a high level of performance in a realistic environment.
Correlation dimension and phase space contraction via extreme value theory
NASA Astrophysics Data System (ADS)
Faranda, Davide; Vaienti, Sandro
2018-04-01
We show how to obtain theoretical and numerical estimates of the correlation dimension and phase space contraction by using extreme value theory. The maxima of suitable observables sampled along the trajectory of a chaotic dynamical system converge asymptotically to classical extreme value laws where: (i) the inverse of the scale parameter gives the correlation dimension and (ii) the extremal index is associated with the rate of phase space contraction for backward iteration, which, in dimensions 1 and 2, is closely related to the positive Lyapunov exponent, and in higher dimensions is related to the metric entropy. We call it the Dynamical Extremal Index. Numerical estimates are straightforward to obtain, as they imply just a simple fit to a univariate distribution. Numerical tests range from low-dimensional maps to generalized Hénon maps and climate data. The estimates of the indicators are particularly robust even with relatively short time series.
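Point (i), that the inverse of the scale parameter gives the dimension, can be illustrated numerically. The sketch below is a simplified illustration, not the authors' code: it uses a plain method-of-moments Gumbel fit to block maxima of the observable g = -log(distance), applied to uniform points on the unit square, whose local dimension is 2.

```python
import numpy as np

def evt_local_dimension(points, ref, block=500):
    """Estimate the local dimension at `ref` via extreme value theory:
    block maxima of g = -log(distance to ref) are asymptotically Gumbel
    with scale 1/d, so d is the inverse of the fitted scale parameter."""
    dist = np.linalg.norm(points - ref, axis=1)
    g = -np.log(dist[dist > 0])
    n = (len(g) // block) * block
    maxima = g[:n].reshape(-1, block).max(axis=1)
    # Method-of-moments Gumbel fit: std = pi * scale / sqrt(6)
    scale = maxima.std(ddof=1) * np.sqrt(6) / np.pi
    return 1.0 / scale

rng = np.random.default_rng(42)
pts = rng.uniform(0, 1, size=(50000, 2))   # uniform sample on the unit square
dim = evt_local_dimension(pts, np.array([0.5, 0.5]))
# dim should be close to the true dimension of the support, i.e. 2
```

For a chaotic attractor one would average such estimates over many reference points on the trajectory; the simple moment fit here stands in for the full GEV fit used in the paper.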
NASA Astrophysics Data System (ADS)
Torrungrueng, Danai; Johnson, Joel T.; Chou, Hsi-Tseng
2002-03-01
The novel spectral acceleration (NSA) algorithm has been shown to produce an O(Ntot)-efficient iterative method of moments for the computation of radiation/scattering from both one-dimensional (1-D) and two-dimensional large-scale quasi-planar structures, where Ntot is the total number of unknowns to be solved. This method accelerates the matrix-vector multiplication in an iterative method of moments solution and divides contributions between points into "strong" (exact matrix elements) and "weak" (NSA algorithm) regions. The NSA method is based on a spectral representation of the electromagnetic Green's function and appropriate contour deformation, resulting in a fast multipole-like formulation in which contributions from large numbers of points to a single point are evaluated simultaneously. In the standard NSA algorithm the NSA parameters are derived on the basis of the assumption that the outermost possible saddle point, φs,max, along the real axis in the complex angular domain is small. For given height variations of quasi-planar structures, this assumption can be satisfied by adjusting the size of the strong region Ls. However, for quasi-planar structures with large height variations, the adjusted size of the strong region is typically large, resulting in significant increases in computational time for the computation of the strong-region contribution and degrading the overall efficiency of the NSA algorithm. In addition, for the case of extremely large scale structures, studies based on the physical optics approximation and a flat-surface assumption show that the given NSA parameters in the standard NSA algorithm may yield inaccurate results. In this paper, analytical formulas associated with the NSA parameters for an arbitrary value of φs,max are presented, resulting in more flexibility in selecting Ls to compromise between the computation of the contributions of the strong and weak regions. In addition, a "multilevel" algorithm, decomposing 1-D extremely large scale quasi-planar structures into more than one weak region and appropriately choosing the NSA parameters for each weak region, is incorporated into the original NSA method to improve its accuracy.
Extreme events in total ozone: Spatio-temporal analysis from local to global scale
NASA Astrophysics Data System (ADS)
Rieder, Harald E.; Staehelin, Johannes; Maeder, Jörg A.; Ribatet, Mathieu; di Rocco, Stefania; Jancso, Leonhardt M.; Peter, Thomas; Davison, Anthony C.
2010-05-01
Recently, tools from extreme value theory (e.g. Coles, 2001; Ribatet, 2007) have been applied for the first time in the field of stratospheric ozone research, as statistical analysis showed that previously used concepts assuming a Gaussian distribution of total ozone data (e.g. fixed deviations from mean values) do not adequately address the internal data structure concerning extremes (Rieder et al., 2010a,b). A case study of the world's longest total ozone record (Arosa, Switzerland; for details see Staehelin et al., 1998a,b) illustrates that tools based on extreme value theory are appropriate to identify ozone extremes and to describe the tails of the total ozone record. Excursions in the frequency of extreme events reveal "fingerprints" of dynamical factors, such as ENSO or NAO, and chemical factors, such as cold Arctic vortex ozone losses, as well as major volcanic eruptions of the 20th century (e.g. Gunung Agung, El Chichón, Mt. Pinatubo). Furthermore, atmospheric loading of ozone-depleting substances led to a continuous modification of column ozone in the northern hemisphere, also with respect to extreme values (partly again in connection with polar vortex contributions). It is shown that application of extreme value theory allows the identification of many more such fingerprints than conventional time series analysis of annual and seasonal mean values. In particular, the extremal analysis shows the strong influence of dynamics, revealing that even moderate ENSO and NAO events have a discernible effect on total ozone (Rieder et al., 2010b). Overall, the extremes concept provides new information on time series properties, variability, trends, and the influence of dynamics and chemistry, complementing earlier analyses focusing only on monthly (or annual) mean values.
The findings described above were confirmed for the total ozone records of 5 other long-term series (Belsk, Hohenpeissenberg, Hradec Kralove, Potsdam, Uccle), showing that the strong influence of atmospheric dynamics (NAO, ENSO) on total ozone is a global feature in the northern mid-latitudes (Rieder et al., 2010c). In a next step, frequency distributions of extreme events are analyzed on a global scale (northern and southern mid-latitudes). A specific focus here is whether findings gained through analysis of long-term European ground-based stations can be clearly identified as a global phenomenon. By showing results from these three types of studies, an overview of extreme events in total ozone (and the dynamical and chemical features leading to them) will be presented from local to global scales. References: Coles, S.: An Introduction to Statistical Modeling of Extreme Values, Springer Series in Statistics, ISBN:1852334592, Springer, Berlin, 2001. Ribatet, M.: POT: Modelling peaks over a threshold, R News, 7, 34-36, 2007. Rieder, H.E., Staehelin, J., Maeder, J.A., Ribatet, M., Stübi, R., Weihs, P., Holawe, F., Peter, T., and Davison, A.C. (2010): Extreme events in total ozone over Arosa - Part I: Application of extreme value theory, to be submitted to ACPD. Rieder, H.E., Staehelin, J., Maeder, J.A., Ribatet, M., Stübi, R., Weihs, P., Holawe, F., Peter, T., and Davison, A.C. (2010): Extreme events in total ozone over Arosa - Part II: Fingerprints of atmospheric dynamics and chemistry and effects on mean values and long-term changes, to be submitted to ACPD. Rieder, H.E., Jancso, L., Staehelin, J., Maeder, J.A., Ribatet, M., Peter, T., and Davison, A.C. (2010): Extreme events in total ozone over the northern mid-latitudes: A case study based on long-term data sets from 5 ground-based stations, in preparation.
Staehelin, J., Renaud, A., Bader, J., McPeters, R., Viatte, P., Hoegger, B., Bugnion, V., Giroud, M., and Schill, H.: Total ozone series at Arosa (Switzerland): Homogenization and data comparison, J. Geophys. Res., 103(D5), 5827-5842, doi:10.1029/97JD02402, 1998a. Staehelin, J., Kegel, R., and Harris, N. R.: Trend analysis of the homogenized total ozone series of Arosa (Switzerland), 1929-1996, J. Geophys. Res., 103(D7), 8389-8400, doi:10.1029/97JD03650, 1998b.
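The POT package cited above (Ribatet, 2007) implements peaks-over-threshold modeling in R. An equivalent minimal sketch in Python, applied here to synthetic daily values and a hypothetical 95th-percentile threshold choice (not the Arosa data or the authors' settings), looks as follows.

```python
import numpy as np
from scipy import stats

def pot_return_level(data, threshold, return_period, obs_per_year):
    """Peaks-over-threshold: fit a Generalized Pareto distribution to the
    exceedances over `threshold` and compute the T-year return level."""
    exc = data[data > threshold] - threshold
    xi, _, sigma = stats.genpareto.fit(exc, floc=0)   # shape, loc, scale
    zeta = len(exc) / len(data)                       # exceedance probability
    m = return_period * obs_per_year                  # observations in T years
    if abs(xi) > 1e-6:
        return threshold + sigma / xi * ((m * zeta) ** xi - 1)
    return threshold + sigma * np.log(m * zeta)       # exponential-tail limit

rng = np.random.default_rng(7)
daily = rng.gumbel(300, 15, size=365 * 30)   # synthetic daily column values
rl20 = pot_return_level(daily, np.quantile(daily, 0.95), 20, 365)
```

The return-level formula is the standard one from Coles (2001); in practice threshold choice would be checked with mean residual life plots, as discussed in the comparative literature.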
Evaluating the Large-Scale Environment of Extreme Events Using Reanalyses
NASA Astrophysics Data System (ADS)
Bosilovich, M. G.; Schubert, S. D.; Koster, R. D.; da Silva, A. M., Jr.; Eichmann, A.
2014-12-01
Extreme conditions and events have been a long-standing concern in weather forecasting and national security. While some evidence indicates that extreme weather will increase under global change scenarios, extremes are often related to the large-scale atmospheric circulation and, by their nature, occur infrequently. Reanalyses assimilate substantial amounts of weather data, and a primary strength of reanalysis data is the representation of the large-scale atmospheric environment. In this effort, we link the occurrences of extreme events or climate indicators to the underlying regional and global weather patterns. Now, with more than 30 years of data, reanalyses can include multiple cases of extreme events, and thereby identify commonality among the weather patterns to better characterize the large-scale to global environment linked to the indicator or extreme event. Since these features are certainly regionally dependent, and since the indicators of climate are continually being developed, we outline various methods to analyze the reanalysis data and the development of tools to support regional evaluation of the data. Here, we provide some examples of both individual case studies and composite studies of similar events. For example, we compare the large-scale environment for Northeastern US extreme precipitation with that of the highest mean precipitation seasons. Likewise, southerly winds can be shown to be a major contributor to very warm days in the Northeast winter. While most of our development has involved NASA's MERRA reanalysis, we are also looking forward to MERRA-2, which includes several new features that greatly improve the representation of weather and climate, especially for the regions and sectors involved in the National Climate Assessment.
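A composite study of the kind mentioned, averaging a large-scale field over extreme-event days, can be sketched in a few lines. The example below uses random synthetic fields rather than actual reanalysis data, and a simple point-threshold event definition chosen for illustration.

```python
import numpy as np

def composite_anomaly(field, event_days):
    """Composite a large-scale field (time, lat, lon) over extreme-event
    days and subtract the all-day climatology to obtain the anomaly."""
    clim = field.mean(axis=0)
    return field[event_days].mean(axis=0) - clim

rng = np.random.default_rng(5)
field = rng.normal(0, 1, size=(1000, 10, 10))   # e.g. 500 hPa height anomalies
events = np.flatnonzero(field[:, 5, 5] > 2.0)   # "extreme" days at one point
comp = composite_anomaly(field, events)
```

With real data the composite would reveal the coherent circulation pattern surrounding the event location; here the signal is by construction confined to the selection point.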
NASA Astrophysics Data System (ADS)
Dugger, A. L.; Zhang, Y.; Gochis, D.; Yu, W.; McCreight, J. L.; Karsten, L.; Rafieeinasab, A.; Sampson, K. M.; Salas, F.; Read, L.; Pan, L.; Yates, D. N.; Cosgrove, B.; Clark, E. P.
2017-12-01
Streamflow extremes (lows and peaks) tend to have disproportionately higher impacts on human and natural systems compared to mean streamflow. Examining and understanding the spatiotemporal distributions of streamflow extremes is of significant interest to both the research community and water resources management. In this work, the output from the 24-year (1993 through 2016) retrospective runs of the National Water Model (NWM) version of WRF-Hydro is analyzed for streamflow extremes over the CONUS domain. The CONUS domain was configured at 1-km resolution for the land surface grid and 250-m resolution for terrain routing. The WRF-Hydro runs were forced by the regridded and downscaled NLDAS2 data. The analyses focus on daily mean streamflow values over the full water year and within the summer and winter seasons. Connections between NWM streamflow and other hydrologic variables (e.g. snowpack, soil moisture/saturation, and ET) and variations in large-scale climate phenomena, e.g., El Niño - Southern Oscillation (ENSO), the North Atlantic Oscillation (NAO), and the North American monsoon, are examined. The CONUS domain has a diverse environment and is characterized by complex terrain, heterogeneous land surfaces and ecosystems, and numerous hydrological basins. The potential dependence of streamflow extremes on regional terrain character, climatic conditions, and ecologic zones will also be investigated.
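Analyses of streamflow extremes over the full water year typically begin by extracting annual maxima per water year (October through September in the U.S. convention, labeled by the ending calendar year). A minimal sketch with synthetic daily flows, not the NWM processing chain itself:

```python
import numpy as np

def water_year_maxima(dates, flow):
    """Annual maxima of daily streamflow by U.S. water year
    (Oct 1 through Sep 30, labeled by the ending calendar year)."""
    dates = np.asarray(dates, dtype="datetime64[D]")
    years = dates.astype("datetime64[Y]").astype(int) + 1970
    months = dates.astype("datetime64[M]").astype(int) % 12 + 1
    wy = np.where(months >= 10, years + 1, years)   # Oct-Dec roll forward
    return {int(y): float(flow[wy == y].max()) for y in np.unique(wy)}

dates = np.arange("1993-10-01", "1996-10-01", dtype="datetime64[D]")
flow = np.sin(np.arange(len(dates)) / 58.0) + 2.0   # synthetic daily flow
amax = water_year_maxima(dates, flow)
```

The resulting annual maxima series is the usual input to the GEV-based frequency analyses discussed elsewhere in this collection; low-flow extremes would instead use annual minima of a suitably smoothed series.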
A nonstationary analysis for the Northern Adriatic extreme sea levels
NASA Astrophysics Data System (ADS)
Masina, Marinella; Lamberti, Alberto
2013-09-01
The historical data from the Trieste, Venice, Porto Corsini, and Rimini tide gauges have been used to investigate the spatial and temporal changes in extreme high water levels in the Northern Adriatic. A detailed analysis of annual mean sea level evolution at the three longest operating stations shows a coherent behavior on both regional and global scales. A slight increase in magnitude of extreme water elevations, after the removal of the regularized annual mean sea level necessary to eliminate the effect of local subsidence and sea level rise, is found at the Venice and Porto Corsini stations. It seems to be mainly associated with a wind regime change that occurred in the 1990s, due to an intensification of Bora wind events after their decrease in frequency and intensity during the second half of the 20th century. The extreme values, adjusted for the annual mean sea level trend, are modeled using a time-dependent GEV distribution. The inclusion of seasonality in the GEV parameters considerably improves the data fitting. The interannual fluctuations of the detrended monthly maxima exhibit a significant correlation with the variability of the large-scale atmospheric circulation represented by the North Atlantic Oscillation and Arctic Oscillation indices. The different coast exposure to the Bora and Sirocco winds and their seasonal character explain the various seasonal patterns of extreme sea levels observed at the tide gauges considered in the present analysis.
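A time-dependent extreme value model with seasonality in its parameters can be sketched as follows. For simplicity this illustration uses the Gumbel limit (GEV shape fixed at zero) with a sinusoidal annual cycle in the location parameter, fitted by maximum likelihood to synthetic monthly maxima; the paper's actual model is more general.

```python
import numpy as np
from scipy.optimize import minimize

def fit_seasonal_gumbel(months, maxima):
    """Fit a Gumbel model (GEV with shape 0, for simplicity) whose location
    varies seasonally: mu(m) = b0 + b1*sin(2*pi*m/12) + b2*cos(2*pi*m/12)."""
    ang = 2 * np.pi * months / 12.0
    X = np.column_stack([np.ones_like(ang), np.sin(ang), np.cos(ang)])

    def nll(theta):
        b, log_sigma = theta[:3], theta[3]
        sigma = np.exp(log_sigma)          # keep the scale positive
        z = (maxima - X @ b) / sigma
        return np.sum(z + np.exp(-z)) + len(maxima) * log_sigma

    # Start from a least-squares fit of the seasonal cycle.
    b0 = np.linalg.lstsq(X, maxima, rcond=None)[0]
    theta0 = np.concatenate([b0, [np.log(maxima.std())]])
    res = minimize(nll, theta0, method="Nelder-Mead",
                   options={"maxiter": 5000, "maxfev": 5000})
    return res.x[:3], np.exp(res.x[3])

# Synthetic monthly sea-level maxima (cm) with a winter (cosine) peak.
rng = np.random.default_rng(3)
months = np.tile(np.arange(1, 13), 40)
true_mu = 80 + 15 * np.cos(2 * np.pi * months / 12.0)
maxima = true_mu + rng.gumbel(0, 8, size=months.size)
beta, sigma = fit_seasonal_gumbel(months, maxima)
```

The fitted harmonic coefficients quantify the seasonal modulation of extremes; adding the same harmonics to the scale (and a nonzero GEV shape) would recover the fuller model described in the abstract.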
A comparative assessment of statistical methods for extreme weather analysis
NASA Astrophysics Data System (ADS)
Schlögl, Matthias; Laaha, Gregor
2017-04-01
Extreme weather exposure assessment is of major importance for scientists and practitioners alike. We compare different extreme value approaches and fitting methods with respect to their value for assessing extreme precipitation and temperature impacts. Based on an Austrian data set from 25 meteorological stations representing diverse meteorological conditions, we assess the added value of partial duration series over the standardly used annual maxima series in order to give recommendations for performing extreme value statistics on meteorological hazards. Results show the merits of the robust L-moment estimation, which yielded better results than maximum likelihood estimation in 62 % of all cases. At the same time, the results question the general assumption that the threshold excess approach (employing partial duration series, PDS) is superior to the block maxima approach (employing annual maxima series, AMS) due to information gain. For low return periods (non-extreme events) the PDS approach tends to overestimate return levels compared to the AMS approach, whereas the opposite behavior was found for high return levels (extreme events). In extreme cases, an inappropriate threshold was shown to lead to considerable biases that may far outweigh the possible gain of information from including additional extreme events. This effect was visible neither from the square-root criterion nor from the standardly used graphical diagnostics (mean residual life plot), but only from a direct comparison of AMS and PDS in synoptic quantile plots. We therefore recommend performing the AMS and PDS approaches simultaneously in order to select the best-suited approach. This will make the analyses more robust, in cases where threshold selection and dependency introduce biases into the PDS approach, but also in cases where the AMS contains non-extreme events that may introduce similar biases.
For assessing performance for extreme events, we recommend conditional performance measures that focus on rare events only, in addition to standardly used unconditional indicators. The findings of this study are of relevance for a broad range of environmental variables, including meteorological and hydrological quantities.
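The AMS/PDS comparison at the core of this study can be sketched in a few lines: fit a GEV to annual maxima (block maxima) and a generalized Pareto to excesses over a high threshold (partial duration series), then compare the return levels each implies. The synthetic gamma-distributed daily series, the 99th-percentile threshold, and the scipy maximum-likelihood fitting below are illustrative assumptions, not the authors' workflow.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Hypothetical 50 years x 365 days of daily precipitation (mm)
daily = rng.gamma(shape=0.4, scale=8.0, size=(50, 365))

# Block maxima (AMS): one value per year, fitted with a GEV
ams = daily.max(axis=1)
c, loc, scale = stats.genextreme.fit(ams)
rl100_ams = stats.genextreme.ppf(1 - 1 / 100, c, loc, scale)

# Partial duration series (PDS): excesses over a high threshold,
# fitted with a generalized Pareto distribution
u = float(np.quantile(daily, 0.99))
excess = daily[daily > u] - u
cg, _, sg = stats.genpareto.fit(excess, floc=0.0)
lam = excess.size / 50                    # mean exceedances per year
# T-year level: exceeded once per T years on average
rl100_pds = u + stats.genpareto.ppf(1 - 1 / (100 * lam), cg, 0.0, sg)
```

The L-moment estimation favored by the study is not in scipy; a third-party package or hand-rolled probability-weighted moments would replace the `fit` calls above.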
Gravitational waves from plunges into Gargantua
NASA Astrophysics Data System (ADS)
Compère, Geoffrey; Fransen, Kwinten; Hertog, Thomas; Long, Jiang
2018-05-01
We analytically compute time domain gravitational waveforms produced in the final stages of extreme mass ratio inspirals of non-spinning compact objects into supermassive nearly extremal Kerr black holes. Conformal symmetry relates all corotating equatorial orbits in the geodesic approximation to circular orbits through complex conformal transformations. We use this to obtain the time domain Teukolsky perturbations for generic equatorial corotating plunges in closed form. The resulting gravitational waveforms consist of an intermediate polynomial ringdown phase in which the decay rate depends on the impact parameters, followed by an exponential quasi-normal mode decay. The waveform amplitude exhibits critical behavior when the orbital angular momentum tends to a minimal value determined by the innermost stable circular orbit. We show that either near-critical or large angular momentum leads to a significant extension of the LISA observable volume of gravitational wave sources of this kind.
NASA Astrophysics Data System (ADS)
Loikith, P. C.; Broccoli, A. J.; Waliser, D. E.; Lintner, B. R.; Neelin, J. D.
2015-12-01
Anomalous large-scale circulation patterns often play a key role in the occurrence of temperature extremes. For example, large-scale circulation can drive horizontal temperature advection or influence local processes that lead to extreme temperatures, such as by inhibiting moderating sea breezes, promoting downslope adiabatic warming, and affecting the development of cloud cover. Additionally, large-scale circulation can influence the shape of temperature distribution tails, with important implications for the magnitude of future changes in extremes. As a result of the prominent role these patterns play in the occurrence and character of extremes, the way in which temperature extremes change in the future will be strongly influenced by whether and how these patterns change. It is therefore critical to identify and understand the key patterns associated with extremes at local to regional scales in the current climate and to use this foundation as a target for climate model validation. This presentation provides an overview of recent and ongoing work aimed at developing and applying novel approaches to identifying and describing the large-scale circulation patterns associated with temperature extremes in observations, and using this foundation to evaluate state-of-the-art global and regional climate models. Emphasis is given to anomalies in sea level pressure and 500 hPa geopotential height over North America, using several methods to identify circulation patterns, including self-organizing maps and composite analysis. Overall, evaluation results suggest that models are able to reproduce observed patterns associated with temperature extremes with reasonable fidelity in many cases. Model skill is often highest when and where synoptic-scale processes are the dominant mechanisms for extremes, and lower where sub-grid-scale processes (such as those related to topography) are important.
Where model skill in reproducing these patterns is high, it can be inferred that extremes are being simulated for plausible physical reasons, boosting confidence in future projections of temperature extremes. Conversely, where model skill is identified to be lower, caution should be exercised in interpreting future projections.
Quantifying the Extremity of Windstorms for Regions Featuring Infrequent Events
NASA Astrophysics Data System (ADS)
Walz, M. A.; Leckebusch, G. C.; Kruschke, T.; Rust, H.; Ulbrich, U.
2017-12-01
This paper introduces the Distribution-Independent Storm Severity Index (DI-SSI). The DI-SSI quantifies the severity of exceptional surface wind speeds of large-scale windstorms and is complementary to the Storm Severity Index (SSI) introduced by Leckebusch et al. (2008). While the SSI approaches the extremeness of a storm from a meteorological and potential-loss (impact) perspective, the DI-SSI defines severity from a more climatological perspective. The idea is to assign equal index values to wind speeds of the same rarity (e.g. the 99th percentile), taking into account the shape of the tail of the local wind speed climatology. Especially in regions at the edge of the classical storm track, the DI-SSI yields more equitable severity estimates, e.g. for the extra-tropical cyclone Klaus. Here we compare the integral severity indices for several prominent windstorms in the European domain and discuss the advantages and disadvantages of each index. To compare the indices, their relation to the North Atlantic Oscillation (NAO), one of the main large-scale drivers of the intensity of European windstorms, is studied. Additionally, we identify a significant relationship between the frequency and intensity of windstorms for large parts of the European domain.
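The loss-oriented SSI is commonly written as the cubed relative exceedance of the local 98th-percentile wind speed, summed over grid points; the sketch below uses that form, with the percentile choice and the omission of any area or population weighting as simplifying assumptions (the DI-SSI itself additionally requires the local tail climatology).

```python
import numpy as np

def ssi(wind, v98):
    """Storm severity index in the spirit of Leckebusch et al. (2008):
    cubed relative exceedance of the local 98th-percentile wind speed
    (v98), summed over grid points; weighting choices are omitted."""
    rel = np.clip(wind / v98 - 1.0, 0.0, None)   # zero below the local threshold
    return float(np.sum(rel ** 3))
```

A storm whose gusts stay below every local 98th percentile scores zero, so the index isolates the damaging part of the wind field.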
The tragic fire event of June 17, 2017 in Portugal: the meteorological perspective
NASA Astrophysics Data System (ADS)
DaCamara, C.; Trigo, R. M.; Pinto, M. M.; Nunes, S. A.; Trigo, I. F.
2017-12-01
Like the rest of Mediterranean Europe, Portugal is prone to the occurrence of large and destructive wildfires that have serious impacts at the socio-economic and ecological levels. A tragic example is the episode of June 17, 2017 at Pedrógão Grande-Góis, with an official death toll of 64 people, almost 500 buildings destroyed and a continuous patch of more than 42 thousand hectares burned in one week. Climate and meteorology play a determinant role in the onset and spreading of large wildfire events in the Mediterranean basin. Two main kinds of atmospheric mechanisms may be identified. At the regional and seasonal levels, a wetter-than-usual winter followed by a warmer- and drier-than-average spring makes the landscape prone to the occurrence of large fires. At the local and daily scales, extreme weather conditions favor the ignition and spread of wildfires. This dual role may be assessed by means of indices of meteorological fire danger such as the FWI and the DSR. We show that the severity of the 2017 fire season was correctly anticipated by means of a statistical model based on cumulated values of DSR starting on April 1. We then show that extreme fire danger on June 17 was correctly forecasted for the area of Pedrógão Grande-Góis, based on estimated probabilities of exceedance of daily energy released by active fires. These two statistical approaches form the basis of a website developed at Instituto Dom Luiz (IDL) at the Faculty of Sciences of the University of Lisbon. With more than 400 registered users, the website relies on products disseminated by the Land Surface Analysis Satellite Application Facility (LSA SAF), coordinated by IPMA, the Portuguese Weather Service.
NASA Astrophysics Data System (ADS)
Agel, Laurie; Barlow, Mathew; Feldstein, Steven B.; Gutowski, William J.
2018-03-01
Patterns of daily large-scale circulation associated with Northeast US extreme precipitation are identified using both k-means clustering (KMC) and self-organizing maps (SOM) applied to tropopause height. The tropopause height provides a compact representation of the upper-tropospheric potential vorticity, which is closely related to the overall evolution and intensity of weather systems. Extreme precipitation is defined as the top 1% of daily wet-day observations at 35 Northeast stations, 1979-2008. KMC is applied to extreme precipitation days only, while the SOM algorithm is applied to all days in order to place the extreme results into the overall context of patterns for all days. Six tropopause patterns are identified through KMC for extreme-day precipitation: a summertime tropopause ridge, a summertime shallow trough/ridge, a summertime shallow eastern US trough, a deeper wintertime eastern US trough, and two versions of a deep cold-weather trough located across the east-central US. Thirty SOM patterns for all days are identified. Results for all days show that 6 SOM patterns account for almost half of the extreme days, although extreme precipitation occurs in all SOM patterns. The same SOM patterns associated with extreme precipitation also routinely produce non-extreme precipitation; however, on extreme precipitation days the troughs, on average, are deeper and the downstream ridges more pronounced. Analysis of other fields associated with the large-scale patterns shows varying degrees of anomalously strong moisture transport preceding, and upward motion during, extreme precipitation events.
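The KMC step can be sketched with a minimal Lloyd's algorithm applied to flattened daily fields; the farthest-point initialization and the data shapes are assumptions for illustration, not the study's configuration.

```python
import numpy as np

def kmeans(X, k, iters=50):
    """Minimal Lloyd's algorithm with deterministic farthest-point
    initialization. Rows of X are flattened daily maps (e.g. tropopause
    height); the returned centers are k recurrent patterns and labels
    assign each day to one pattern."""
    # Farthest-point seeding: start from the first row, then repeatedly
    # take the point farthest from all chosen centers
    centers = [X[0]]
    for _ in range(1, k):
        d2 = np.min([((X - c) ** 2).sum(axis=1) for c in centers], axis=0)
        centers.append(X[int(np.argmax(d2))])
    centers = np.array(centers, dtype=float)
    for _ in range(iters):
        # Assign every day to its nearest center, then update centers
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers, labels
```

In practice a library implementation (e.g. scikit-learn's KMeans) with multiple random restarts would replace this sketch.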
Analysis of the dependence of extreme rainfalls
NASA Astrophysics Data System (ADS)
Padoan, Simone; Ancey, Christophe; Parlange, Marc
2010-05-01
The aim of spatial analysis is to quantitatively describe the behavior of environmental phenomena such as precipitation levels, wind speed or daily temperatures. A number of generic approaches to spatial modeling have been developed [1], but these are not necessarily ideal for handling extremal aspects given their focus on mean process levels. The areal modelling of the extremes of a natural process observed at points in space is important in environmental statistics; for example, understanding extremal spatial rainfall is crucial in flood protection. In light of recent concerns over climate change, the use of robust mathematical and statistical methods for such analyses has grown in importance. Multivariate extreme value models and the class of max-stable processes [2] have an asymptotic motivation similar to that of the univariate generalized extreme value (GEV) distribution, but provide a general approach to modeling extreme processes that incorporates temporal or spatial dependence. Statistical methods for max-stable processes and data analyses of practical problems are discussed in [3] and [4]. This work illustrates methods for the statistical modelling of spatial extremes and gives examples of their use by means of an extremal data analysis of real Switzerland precipitation levels. [1] Cressie, N. A. C. (1993). Statistics for Spatial Data. Wiley, New York. [2] de Haan, L. and Ferreira, A. (2006). Extreme Value Theory: An Introduction. Springer, USA. [3] Padoan, S. A., Ribatet, M. and Sisson, S. A. (2009). Likelihood-Based Inference for Max-Stable Processes. Journal of the American Statistical Association, Theory & Methods. In press. [4] Davison, A. C. and Gholamrezaee, M. (2009). Geostatistics of extremes. Journal of the Royal Statistical Society, Series B. To appear.
Extremes and bursts in complex multi-scale plasmas
NASA Astrophysics Data System (ADS)
Watkins, N. W.; Chapman, S. C.; Hnat, B.
2012-04-01
Quantifying the spectrum of sizes and durations of large and/or long-lived fluctuations in complex, multi-scale space plasmas is a topic of both theoretical and practical importance. The predictions of inherently multi-scale physical theories such as MHD turbulence have given one direct stimulus for its investigation. There are also space weather implications to an improved ability to assess the likelihood of an extreme fluctuation of a given size. Our intuition as scientists tends to be formed on the familiar Gaussian "normal" distribution, which has a very low likelihood of extreme fluctuations. Perhaps surprisingly, there is both theoretical and observational evidence that favours non-Gaussian, heavier-tailed probability distributions for some space physics datasets. Additionally there is evidence for the existence of long-ranged memory between the values of fluctuations. In this talk I will show how such properties can be captured in a preliminary way by a self-similar, fractal model. I will show how such a fractal model can be used to make predictions for experimentally accessible quantities like the size and duration of a burst (a sequence of values that exceed a given threshold), or the survival probability of a burst [cf. preliminary results in Watkins et al., PRE, 2009]. In real-world time series scaling behaviour need not be "mild" enough to be captured by a single self-similarity exponent H, but might instead require a "wild" multifractal spectrum of scaling exponents [e.g. Rypdal and Rypdal, JGR, 2011; Moloney and Davidsen, JGR, 2011] to give a complete description. I will discuss preliminary work on extending the burst approach into the multifractal domain [see also Watkins et al., chapter in press for the AGU Chapman Conference on Complexity and Extreme Events in the Geosciences, Hyderabad].
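The burst quantities referred to here (the duration and integrated size of a run of values exceeding a threshold) are straightforward to extract from a time series; a minimal numpy sketch:

```python
import numpy as np

def bursts(x, threshold):
    """Return (duration, size) for each burst, i.e. each maximal run
    of consecutive samples exceeding the threshold; size is the
    integrated excess over the threshold."""
    x = np.asarray(x, dtype=float)
    # Pad with zeros so bursts touching either end are closed properly
    a = np.r_[0, (x > threshold).astype(int), 0]
    d = np.diff(a)
    starts = np.flatnonzero(d == 1)       # index where a burst begins
    ends = np.flatnonzero(d == -1)        # index just past its last sample
    return [(int(e - s), float(np.sum(x[s:e] - threshold)))
            for s, e in zip(starts, ends)]
```

The empirical distributions of these duration/size pairs are what a fractal or multifractal model would then be asked to predict.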
Sacks, Laura A.; Lee, Terrie M.; Swancar, Amy
2013-01-01
Groundwater inflow to a subtropical seepage lake was estimated using a transient isotope-balance approach for a decade (2001–2011) with wet and dry climatic extremes. Lake water δ18O ranged from +0.80 to +3.48 ‰, reflecting the 4 m range in stage. The transient δ18O analysis discerned large differences in semiannual groundwater inflow, and the overall patterns of low and high groundwater inflow were consistent with an independent water budget. Despite simplifying assumptions that the isotopic compositions of precipitation (δP), groundwater inflow, and atmospheric moisture (δA) were constant, groundwater inflow was within the water-budget error for 12 of the 19 semiannual calculation periods. The magnitude of inflow was over- or underpredicted during periods of climatic extreme. During periods of high net precipitation from tropical cyclones and El Niño conditions, δP values were considerably more depleted in 18O than assumed. During an extreme dry period, δA values were likely more enriched in 18O than assumed, due to the influence of moisture evaporated from the lake itself. Isotope-balance results were most sensitive to uncertainties in relative humidity, evaporation, and δ18O of lake water, which can limit precise quantification of groundwater inflow. Nonetheless, the consistency between isotope-balance and water-budget results indicates that this is a viable approach for lakes in similar settings, allowing the magnitude of groundwater inflow to be estimated over less-than-annual time periods. Because lake-water δ18O is a good indicator of climatic conditions, these data could be useful in ground-truthing paleoclimatic reconstructions using isotopic data from lake cores in similar settings.
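The mechanics of an isotope-balance estimate can be illustrated with a steady-state simplification: combine a water balance with an isotope balance and solve for the unknown groundwater inflow. The flux names and the steady-state assumption below are illustrative only; the study uses a transient formulation.

```python
def groundwater_inflow(P, E, dP, dE, dL, dG):
    """Steady-state lake balance solved for groundwater inflow G.
    Fluxes (P precipitation, E evaporation) in consistent volume units;
    delta values in permil. Water balance: P + G = E + SO, with surface
    outflow SO and storage change neglected. Isotope balance:
    P*dP + G*dG = E*dE + SO*dL. Eliminating SO = P + G - E and solving
    for G gives the expression below."""
    return (E * dE + (P - E) * dL - P * dP) / (dG - dL)
```

In the transient version, storage and the time derivative of lake volume times lake-water δ18O enter the balance as well, which is what allows semiannual inflow estimates.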
Examining global extreme sea level variations on the coast from in-situ and remote observations
NASA Astrophysics Data System (ADS)
Menendez, Melisa; Benkler, Anna S.
2017-04-01
The estimation of extreme water level values on the coast is a requirement for a wide range of engineering and coastal management applications. In addition, climate variations of extreme sea levels in the coastal area result from a complex interaction of oceanic, atmospheric and terrestrial processes across a wide range of spatial and temporal scales. In this study, variations of extreme sea level return values are investigated from two available sources of information: in-situ tide-gauge records and satellite altimetry data. Long time series of sea level from tide-gauge records are the most valuable observations, since they directly measure water level at a specific coastal location. They have, however, a number of sources of inhomogeneity that may affect the climate description of extremes when this data source is used. Among others, the presence of gaps, historical time inhomogeneities and jumps in the mean sea level signal are factors that can introduce uncertainty into the characterization of extreme sea level behaviour. Moreover, long records from tide-gauges are sparse, and there are many coastal areas worldwide without available in-situ information. On the other hand, with the accumulating altimeter records of several satellite missions from the 1990s, approaching 25 recorded years at the time of writing, the analysis of extreme sea level events from this data source is becoming possible. Aside from the well-known issue of altimeter measurements very close to the coast (mainly due to corruption by land, wet-troposphere path-delay errors and local tide effects in the coastal area), other aspects have to be considered when sea surface height values estimated from satellite are to be used in a statistical extreme model, such as the use of a multi-mission product to obtain long observed periods and the selection of the maxima sample, since altimeter observations are not uniform in time and space.
Here, we have compared the extreme values of 'still water level' and 'non-tidal-residual' of in-situ records from the GESLA2 dataset (Woodworth et al. 2016) against the novel coastal altimetry datasets (Cipollini et al. 2016). Seasonal patterns, inter-annual variability and long-term trends are analyzed. Then, a time-dependent extreme model (Menendez et al. 2009) is applied to characterize extreme sea level return values and their variability on the coastal area around the world.
Time-dependent fiber bundles with local load sharing. II. General Weibull fibers
NASA Astrophysics Data System (ADS)
Phoenix, S. Leigh; Newman, William I.
2009-12-01
Fiber bundle models (FBMs) are useful tools in understanding failure processes in a variety of material systems. While the fibers and load sharing assumptions are easily described, FBM analysis is typically difficult. Monte Carlo methods are also hampered by the severe computational demands of large bundle sizes, which overwhelm just as behavior relevant to real materials starts to emerge. For large size scales, interest continues in idealized FBMs that assume either equal load sharing (ELS) or local load sharing (LLS) among fibers, rules that reflect features of real load redistribution in elastic lattices. The present work focuses on a one-dimensional bundle of N fibers under LLS where life consumption in a fiber follows a power law in its load, with exponent ρ, integrated over time. This life consumption function is further embodied in a functional form resulting in a Weibull distribution for lifetime under constant fiber stress, with Weibull exponent β. Thus the failure rate of a fiber depends on its past load history, except for β=1. We develop asymptotic results validated by Monte Carlo simulation using a computational algorithm developed in our previous work [Phys. Rev. E 63, 021507 (2001)] that greatly increases the size, N, of treatable bundles (e.g., 10^6 fibers in 10^3 realizations). In particular, our algorithm is O(N ln N), in contrast with former algorithms, which were O(N^2), making this investigation possible. Regimes are found for (β, ρ) pairs that yield contrasting behavior for large N. For ρ>1 and large N, brittle weakest-volume behavior emerges in terms of characteristic elements (groupings of fibers) derived from critical cluster formation, and the lifetime eventually goes to zero as N→∞, unlike ELS, which yields a finite limiting mean.
For 1/2≤ρ≤1, however, LLS has remarkably similar behavior to ELS (appearing to be virtually identical for ρ=1), with an asymptotic Gaussian lifetime distribution and a finite limiting mean for large N. The coefficient of variation follows a power law in increasing N but, except for ρ=1, the value of the negative exponent is clearly less than 1/2, unlike in ELS bundles where the exponent remains 1/2 for 1/2<ρ≤1. For sufficiently small values 0<ρ≪1, a transition occurs, depending on β, whereby LLS bundle lifetimes become dominated by a few long-lived fibers. Thus the bundle lifetime appears to approximately follow an extreme-value distribution for the longest lived of a parallel group of independent elements, which applies exactly to ρ=0. The lower the value of β, the higher the transition value of ρ below which such extreme-value behavior occurs. No evidence was found for limiting Gaussian behavior for ρ>1 but with 0<β(ρ+1)<1, as might be conjectured from quasistatic bundle models where β(ρ+1) mimics the Weibull exponent for fiber strength.
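For the memoryless case β = 1, the ELS benchmark against which the LLS results are compared can be simulated with a simple event-driven loop (a fiber under stress s fails at rate s^ρ). This is a sketch of the ELS reference behavior only, not the authors' O(N ln N) LLS algorithm, and the load normalization is an illustrative choice.

```python
import numpy as np

def els_bundle_lifetime(N, rho, rng, total_load=None):
    """Monte Carlo lifetime of an equal-load-sharing bundle for the
    memoryless case beta = 1: a fiber under stress s fails at rate
    s**rho, and surviving fibers share the total load equally."""
    if total_load is None:
        total_load = float(N)             # unit reference stress per fiber
    t, n = 0.0, N
    while n > 0:
        s = total_load / n                # stress on each survivor
        t += rng.exponential(1.0 / (n * s ** rho))  # time to next failure
        n -= 1
    return t
```

For ρ = 1 with total load N, the total failure rate stays constant at N throughout, so the mean bundle lifetime tends to 1 regardless of N, illustrating the finite limiting mean of ELS noted above.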
Chao, Jianbin; Wang, Huijuan; Zhang, Yongbin; Yin, Caixia; Huo, Fangjun; Song, Kailun; Li, Zhiqing; Zhang, Ting; Zhao, Yaqin
2017-11-01
A novel pH fluorescent probe, 1-(pyren-1-yl)-3-(6-methoxypridin-3-yl)-acrylketone (PMPA), which has a pyrene structure attached to methoxypyridine, was synthesized for monitoring extremely acidic and alkaline pH. The pH titrations indicated that PMPA displayed a remarkable emission enhancement with a pKa of 2.70 and responded linearly to minor pH fluctuations within the extremely acidic range of 1.26-3.97. Interestingly, PMPA also exhibited strong pH-dependent characteristics with a pKa of 9.32 and a linear response in the extremely alkaline range of 8.54-10.36. In addition, PMPA displayed good selectivity, excellent photostability and a large Stokes shift (167 nm). Furthermore, the probe PMPA had excellent cell membrane permeability and was applied successfully to rapidly detect pH in living cells. Because pH in living cells is closely related to many diseases, these findings suggest that the probe has potential application in pH detection for disease diagnosis.
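pKa values like those reported are typically extracted by fitting a sigmoidal (Henderson-Hasselbalch-type) curve to the fluorescence-vs-pH titration data, with the pKa at the midpoint. The model form, its direction (emission rising toward acidic pH), and the synthetic data below are illustrative assumptions, not the paper's actual fitting procedure.

```python
import numpy as np
from scipy.optimize import curve_fit

def titration(pH, F_min, F_max, pKa):
    """Henderson-Hasselbalch-type sigmoid: fluorescence rises toward
    F_max as the probe protonates at acidic pH (direction assumed)."""
    return F_min + (F_max - F_min) / (1 + 10 ** (pH - pKa))

pH = np.linspace(1.0, 5.0, 30)
F = titration(pH, 10.0, 200.0, 2.70)      # synthetic, noise-free data
popt, _ = curve_fit(titration, pH, F, p0=[5.0, 150.0, 3.0])
```

With real titration data, replicate measurements and a noise model would accompany the fit; here the noise-free data simply demonstrate parameter recovery.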
Regional frequency analysis of extreme rainfalls using partial L moments method
NASA Astrophysics Data System (ADS)
Zakaria, Zahrahtul Amani; Shabri, Ani
2013-07-01
An approach based on regional frequency analysis using L moments and LH moments is revisited in this study. Subsequently, an alternative regional frequency analysis using the partial L moments (PL moments) method is employed, and a new relationship for homogeneity analysis is developed. The results were then compared with those obtained using the methods of L moments and LH moments of order two. The Selangor catchment, consisting of 37 sites and located on the west coast of Peninsular Malaysia, is chosen as a case study. PL moments for the generalized extreme value (GEV), generalized logistic (GLO), and generalized Pareto distributions were derived and used to develop the regional frequency analysis procedure. The PL moment ratio diagram and the Z test were employed in determining the best-fit distribution. Comparison between the three approaches showed that the GLO and GEV distributions were identified as suitable for representing the statistical properties of extreme rainfall in Selangor. Monte Carlo simulation used for performance evaluation shows that the method of PL moments outperforms the L moments and LH moments methods for the estimation of large-return-period events.
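Ordinary sample L-moments, the starting point that PL moments generalize (by censoring the smallest observations), can be computed from probability-weighted moments in a few lines; this sketch covers the plain L-moment case only.

```python
import numpy as np

def sample_l_moments(x):
    """First three sample L-moments via unbiased probability-weighted
    moments b0, b1, b2; l2 is the L-scale and l3/l2 the L-skewness
    used in L-moment ratio diagrams."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    i = np.arange(n)                       # 0-based order-statistic index
    b0 = x.mean()
    b1 = np.sum(i * x) / (n * (n - 1))
    b2 = np.sum(i * (i - 1) * x) / (n * (n - 1) * (n - 2))
    l1 = b0
    l2 = 2 * b1 - b0
    l3 = 6 * b2 - 6 * b1 + b0
    return l1, l2, l3
```

For a uniform(0, 1) sample the population values are l1 = 1/2, l2 = 1/6 and l3 = 0, which makes a convenient sanity check for the estimators.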
This is a presentation titled "Estimating the Effect of Climate Change on Crop Yields and Farmland Values: The Importance of Extreme Temperatures" that was given for the National Center for Environmental Economics.
Drifter observations of submesoscale flow kinematics in the coastal ocean
NASA Astrophysics Data System (ADS)
Ohlmann, J. C.; Molemaker, M. J.; Baschek, B.; Holt, B.; Marmorino, G.; Smith, G.
2017-01-01
Fronts and eddies identified with aerial guidance are seeded with drifters to quantify submesoscale flow kinematics. The Lagrangian observations show mean divergence and vorticity values that can exceed 5 times the Coriolis frequency. Values are the largest observed in the field to date and represent an extreme departure from geostrophic dynamics. The study also quantifies errors and biases associated with Lagrangian observations of the underlying velocity strain tensor. The greatest error results from undersampling, even with a large number of drifters. A significant bias comes from inhomogeneous sampling of convergent regions that accumulate drifters within a few hours of deployment. The study demonstrates a Lagrangian sampling paradigm for targeted submesoscale structures over a broad range of scales and presents flow kinematic values associated with vertical velocities O(10) m h-1 that can have profound implications for ocean biogeochemistry.
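Divergence and vorticity from a drifter cluster are conventionally estimated by a least-squares fit of a linear velocity field to the drifter positions and velocities; the sketch below shows that standard estimator (the study's actual processing is not reproduced here), with synthetic solid-body rotation used as a check.

```python
import numpy as np

def flow_kinematics(x, y, u, v):
    """Least-squares fit of a linear velocity field u = u0 + ux*x + uy*y
    (and likewise for v) to a drifter cluster; returns horizontal
    divergence ux + vy and vertical vorticity vx - uy (units 1/s for
    positions in m and velocities in m/s)."""
    A = np.column_stack([np.ones_like(x), x, y])
    (u0, ux, uy), *_ = np.linalg.lstsq(A, u, rcond=None)
    (v0, vx, vy), *_ = np.linalg.lstsq(A, v, rcond=None)
    return ux + vy, vx - uy
```

The undersampling and clustering biases discussed in the abstract show up here as ill-conditioning of the matrix A when drifters collapse onto a line or a point.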
An Automatic Orthonormalization Method for Solving Stiff Boundary-Value Problems
NASA Astrophysics Data System (ADS)
Davey, A.
1983-08-01
A new initial-value method is described, based on a remark by Drury, for solving stiff linear differential two-point eigenvalue and boundary-value problems. The method is extremely reliable, it is especially suitable for high-order differential systems, and it is capable of accommodating realms of stiffness which other methods cannot reach. The key idea behind the method is to decompose the stiff differential operator into two non-stiff operators, one of which is nonlinear. The nonlinear one is specially chosen so that it advances an orthonormal frame; indeed, the method is essentially a kind of automatic orthonormalization. The second is auxiliary, but it is needed to determine the required function. The usefulness of the method is demonstrated by calculating some eigenfunctions for an Orr-Sommerfeld problem when the Reynolds number is as large as 10^6.
Large-scale Meteorological Patterns Associated with Extreme Precipitation Events over Portland, OR
NASA Astrophysics Data System (ADS)
Aragon, C.; Loikith, P. C.; Lintner, B. R.; Pike, M.
2017-12-01
Extreme precipitation events can have profound impacts on human life and infrastructure, with broad implications across a range of stakeholders. Changes to extreme precipitation events are a projected outcome of climate change that warrants further study, especially at regional- to local-scales. While global climate models are generally capable of simulating mean climate at global-to-regional scales with reasonable skill, resiliency and adaptation decisions are made at local-scales where most state-of-the-art climate models are limited by coarse resolution. Characterization of large-scale meteorological patterns associated with extreme precipitation events at local-scales can provide climatic information without this scale limitation, thus facilitating stakeholder decision-making. This research will use synoptic climatology as a tool by which to characterize the key large-scale meteorological patterns associated with extreme precipitation events in the Portland, Oregon metro region. Composite analysis of meteorological patterns associated with extreme precipitation days, and associated watershed-specific flooding, is employed to enhance understanding of the climatic drivers behind such events. The self-organizing maps approach is then used to characterize the within-composite variability of the large-scale meteorological patterns associated with extreme precipitation events, allowing us to better understand the different types of meteorological conditions that lead to high-impact precipitation events and associated hydrologic impacts. A more comprehensive understanding of the meteorological drivers of extremes will aid in evaluation of the ability of climate models to capture key patterns associated with extreme precipitation over Portland and to better interpret projections of future climate at impact-relevant scales.
NASA Astrophysics Data System (ADS)
Black, R. X.
2017-12-01
We summarize results from a project focusing on regional temperature and precipitation extremes over the continental United States. Our project introduces a new framework for evaluating these extremes emphasizing their (a) large-scale organization, (b) underlying physical sources (including remote-excitation and scale-interaction) and (c) representation in climate models. Results to be reported include the synoptic-dynamic behavior, seasonality and secular variability of cold waves, dry spells and heavy rainfall events in the observational record. We also study how the characteristics of such extremes are systematically related to Northern Hemisphere planetary wave structures and thus planetary- and hemispheric-scale forcing (e.g., those associated with major El Nino events and Arctic sea ice change). The underlying physics of event onset are diagnostically quantified for different categories of events. Finally, the representation of these extremes in historical coupled climate model simulations is studied and the origins of model biases are traced using new metrics designed to assess the large-scale atmospheric forcing of local extremes.
Extreme Statistics of Storm Surges in the Baltic Sea
NASA Astrophysics Data System (ADS)
Kulikov, E. A.; Medvedev, I. P.
2017-11-01
Statistical analysis of the extreme values of the Baltic Sea level has been performed for series of observations spanning 15-125 years at 13 tide gauge stations. It is shown that the empirical relation between the value of an extreme sea level rise or ebb (caused by storm events) and its return period in the Baltic Sea can be well approximated by the Gumbel probability distribution. The maximum values of extreme floods/ebbs of the 100-year recurrence were observed in the Gulf of Finland and the Gulf of Riga. The two longest data series, observed in Stockholm and Vyborg over 125 years, show a significant deviation from the Gumbel distribution for the rarest events. Statistical analysis of the hourly sea level data series reveals some asymmetry in the variability of the Baltic Sea level: the probability of rises proved higher than that of ebbs. As for the magnitude of the 100-year recurrence surge, it considerably exceeded the magnitude of ebbs almost everywhere. This asymmetry effect can be attributed to the influence of low atmospheric pressure during storms. A statistical study of extreme values has also been applied to sea level series for Narva over the period 1994-2000, which were simulated by the ROMS numerical model. Comparisons of the "simulated" and "observed" extreme sea level distributions show that the model reproduces extreme floods of "moderate" magnitude quite satisfactorily; however, it underestimates sea level changes for the most powerful storm surges.
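Fitting a Gumbel distribution to annual maxima and reading off a return level takes only a few lines; the synthetic series and parameter values below stand in for a real tide-gauge record.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Stand-in for 125 years of annual maximum sea level at one station (cm)
ann_max = stats.gumbel_r.rvs(loc=80.0, scale=25.0, size=125, random_state=rng)

loc, scale = stats.gumbel_r.fit(ann_max)  # maximum-likelihood Gumbel fit
T = 100                                   # return period in years
# Level exceeded with annual probability 1/T
rl = stats.gumbel_r.ppf(1 - 1 / T, loc, scale)
# Equivalent closed form: loc - scale * ln(-ln(1 - 1/T))
```

Plotting observed maxima against the return periods implied by the fit is how deviations such as those found at Stockholm and Vyborg for the rarest events become visible.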
Future changes of precipitation characteristics in China
NASA Astrophysics Data System (ADS)
Wu, S.; Wu, Y.; Wen, J.
2017-12-01
Global warming has the potential to alter the hydrological cycle, with significant impacts on human society, the environment and ecosystems. This study provides a detailed assessment of potential changes in precipitation characteristics in China using a suite of 12 high-resolution CMIP5 climate models under a medium and a high Representative Concentration Pathway (RCP4.5 and RCP8.5, respectively). We examine future changes over the entire distribution of precipitation and identify any shift in the shape and/or scale of the distribution. In addition, we use extreme-value theory to evaluate the change in probability and magnitude of extreme precipitation events. Overall, China is projected to experience an increase in total precipitation (8% under RCP4.5 and 12% under RCP8.5). This increase is spatially uneven, with larger increases in the west and smaller increases in the east. Precipitation frequency is projected to increase in the west and decrease in the east. Under RCP4.5, the overall precipitation frequency for China as a whole remains largely unchanged (0.08%). However, RCP8.5 projects a more pronounced decrease in frequency over large parts of China, resulting in an overall decrease of 2.08%. Precipitation intensity is likely to increase more uniformly, with an overall increase of 11% under RCP4.5 and 19% under RCP8.5. Precipitation increases across the entire distribution, but the increase is larger for higher quantiles, i.e. strong events. The relative contribution of small quantiles is likely to decrease, whereas the contribution from heavy events is likely to increase. Extreme precipitation increases at much higher rates than average precipitation, and higher rates of increase are expected for more extreme events: 1-year events are likely to increase by 15% and 20-year events by 21% under RCP4.5, and by 26% and 40%, respectively, under RCP8.5. The increase in extreme events is likely to be more spatially uniform.
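The pattern reported above, with rare-event quantiles growing faster than the mean, can be illustrated with a GEV return-level comparison. All parameter values below are invented for illustration; note that scipy's `genextreme` shape parameter `c` has the opposite sign to the ξ convention common in the climate literature:

```python
from scipy import stats

# Illustrative GEV parameters for annual-maximum daily precipitation (mm).
# scipy convention: c = -xi, so c = -0.1 means a heavy (Frechet-type) tail.
ref = dict(c=-0.1, loc=40.0, scale=10.0)   # reference climate
fut = dict(c=-0.1, loc=44.0, scale=12.0)   # future climate: wetter, more variable

def return_level(T, c, loc, scale):
    """T-year return level: the (1 - 1/T) quantile of the annual-maximum GEV."""
    return stats.genextreme.ppf(1.0 - 1.0 / T, c, loc=loc, scale=scale)

for T in (1.1, 20):
    r0, r1 = return_level(T, **ref), return_level(T, **fut)
    print(f"{T:>5}-yr event: {r0:6.1f} -> {r1:6.1f} mm  (+{100 * (r1 / r0 - 1):.0f}%)")
```

Because the scale parameter grows proportionally more than the location, the relative increase is larger for the rarer event, mirroring the 1-year versus 20-year contrast in the abstract.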
NASA Astrophysics Data System (ADS)
Rieder, Harald E.; Jancso, Leonhardt M.; Staehelin, Johannes; Maeder, Jörg A.; Ribatet, Mathieu; Peter, Thomas; Davison, Anthony C.
2010-05-01
In this study we analyze the frequency distribution of extreme events in low and high total ozone (termed ELOs and EHOs) for five long-term stations in the northern mid-latitudes of Europe (Belsk, Poland; Hradec Kralove, Czech Republic; Hohenpeissenberg and Potsdam, Germany; and Uccle, Belgium). Further, the influence of these extreme events on annual and seasonal mean values and trends is analysed. The applied method follows the new "ozone extreme concept" recently developed by Rieder et al. [2010a, b], which is based on tools from extreme value theory [Coles, 2001; Ribatet, 2007]. Mathematically, the decisive feature of the extreme concept is the Generalized Pareto Distribution (GPD). In this analysis the long-term trends had to be removed first, unlike in the treatment of Rieder et al. [2010a, b], in which the Arosa time series was analysed, covering many decades of measurements in the anthropogenically undisturbed stratosphere. In contrast to previous studies, which focused only on so-called ozone mini-holes and mini-highs, the "ozone extreme concept" provides a statistical description of the tails of total ozone distributions (i.e. extreme low and high values). It is shown that this concept is not only an appropriate method for describing the frequency and distribution of extreme events; it also provides new information on time-series properties and internal variability. Furthermore, it allows detection of the fingerprints of physical (e.g. El Niño, NAO) and chemical (e.g. polar vortex ozone loss) features of the Earth's atmosphere, as well as of major volcanic eruptions (e.g. El Chichón, Mt. Pinatubo). It is shown that mean values and trends in total ozone are strongly influenced by extreme events. Trend calculations (for the period 1970-1990) are performed for the entire as well as the extremes-removed time series.
The results after excluding extremes show that annual trends are reduced most at Hradec Kralove (by about a factor of 3), followed by Potsdam (a factor of 2.5) and Hohenpeissenberg and Belsk (both about a factor of 2). In general, the reduction in trend is strongest during winter and spring. At all stations the influence of ELOs on observed trends is larger than that of EHOs. From the 1990s onward especially, ELOs dominate the picture, as only a relatively small fraction of EHOs is observed in the records (owing to the strong influence of the Mt. Pinatubo eruption and polar vortex ozone loss contributions). Additionally, it is shown that the number of observed mini-holes can be estimated highly accurately with the GPD model. Overall, the results show that extreme events play a major role in total ozone, and the "ozone extremes concept" provides deeper insight into the influence of chemical and physical features on column ozone. References: Coles, S.: An Introduction to Statistical Modeling of Extreme Values, Springer Series in Statistics, ISBN 1852334592, Springer, Berlin, 2001. Ribatet, M.: POT: Modelling peaks over a threshold, R News, 7, 34-36, 2007. Rieder, H.E., Staehelin, J., Maeder, J.A., Ribatet, M., Stübi, R., Weihs, P., Holawe, F., Peter, T., and Davison, A.C. (2010): Extreme events in total ozone over Arosa - Part I: Application of extreme value theory, to be submitted to ACPD. Rieder, H.E., Staehelin, J., Maeder, J.A., Ribatet, M., Stübi, R., Weihs, P., Holawe, F., Peter, T., and Davison, A.C. (2010): Extreme events in total ozone over Arosa - Part II: Fingerprints of atmospheric dynamics and chemistry and effects on mean values and long-term changes, to be submitted to ACPD.
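The GPD-based peaks-over-threshold approach at the heart of the "ozone extreme concept" can be sketched as follows. The data are synthetic detrended anomalies and the threshold choice is an illustrative assumption, not the study's actual setup:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic detrended daily total-ozone anomalies (DU); stand-in for station data.
ozone = rng.normal(0.0, 15.0, size=10_000)

# Peaks-over-threshold for extreme lows (ELOs): negate the series so that
# extremely low ozone becomes upper-tail exceedances, then pick a high threshold.
neg = -ozone
u = np.quantile(neg, 0.95)          # 95th-percentile threshold
exceedances = neg[neg > u] - u

# Fit the Generalized Pareto Distribution to the exceedances (location fixed at 0).
shape, _, scale = stats.genpareto.fit(exceedances, floc=0.0)

# Modelled probability of a day falling below an even lower ozone level v.
v = u + 10.0
p_exceed = 0.05 * stats.genpareto.sf(v - u, shape, loc=0.0, scale=scale)
print(f"modelled fraction of days below {-v:.1f} DU: {p_exceed:.4%}")
```

Multiplying the modelled tail probability by the number of days in a record gives the kind of mini-hole count estimate that the abstract reports the GPD model reproduces accurately.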
Climate Impacts on Extreme Energy Consumption of Different Types of Buildings
Li, Mingcai; Shi, Jun; Guo, Jun; Cao, Jingfu; Niu, Jide; Xiong, Mingming
2015-01-01
Exploring changes in building energy consumption and its relationships with climate can provide a basis for energy saving and carbon emission reduction. Heating and cooling energy consumption of different types of buildings in Tianjin city during 1981-2010 was simulated using TRNSYS software. Daily or hourly extreme energy consumption was determined by percentile methods, and the climate impact on extreme energy consumption was analyzed. The results showed that the number of days of extreme heating consumption decreased markedly over the recent 30 years for the residential and large-venue buildings, whereas days of extreme cooling consumption increased for the large-venue building. No significant variations were found in the days of extreme energy consumption for the commercial building, although a decreasing trend in extreme heating energy consumption was observed. Daily extreme energy consumption for the large-venue building had no relationship with climate parameters, whereas extreme energy consumption for the commercial and residential buildings was related to various climate parameters. Further multiple regression analysis suggested that heating energy consumption for the commercial building was affected by maximum temperature, dry-bulb temperature, solar radiation and minimum temperature, which together explain 71.5% of the variation in daily extreme heating energy consumption. The daily extreme cooling energy consumption for the commercial building was related only to the wet-bulb temperature (R² = 0.382). The daily extreme heating energy consumption for the residential building was affected by four climate parameters, but dry-bulb temperature had the main impact. The impacts of climate on hourly extreme heating energy consumption have a 1-3 hour delay in all three types of buildings, but no delay was found in the impacts of climate on hourly extreme cooling energy consumption for the selected buildings. PMID:25923205
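The percentile method for flagging extreme-consumption days, followed by a multiple regression on climate parameters, can be sketched as below. The series are synthetic stand-ins for the TRNSYS output, and the predictors and coefficients are illustrative, not the study's values:

```python
import numpy as np

rng = np.random.default_rng(1)
n_days = 30 * 365

# Synthetic daily weather and heating-energy series (illustrative only).
t_max = rng.normal(12.0, 8.0, n_days)    # daily maximum temperature, degC
solar = rng.normal(14.0, 4.0, n_days)    # daily solar radiation, MJ/m2
energy = 50.0 - 2.0 * t_max - 0.5 * solar + rng.normal(0.0, 3.0, n_days)

# Percentile method: days above the 95th percentile count as extreme heating days.
threshold = np.percentile(energy, 95)
extreme = energy > threshold
print(f"extreme heating days: {extreme.sum()} of {n_days}")

# Multiple linear regression of extreme-day consumption on climate parameters.
X = np.column_stack([np.ones(extreme.sum()), t_max[extreme], solar[extreme]])
coef, *_ = np.linalg.lstsq(X, energy[extreme], rcond=None)
resid = energy[extreme] - X @ coef
r2 = 1.0 - resid.var() / energy[extreme].var()
print(f"R^2 on extreme days: {r2:.3f}")
```

An R² computed this way on the extreme-day subset is the analogue of the explained-variance figures (71.5%, 0.382) quoted in the abstract.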
NASA Astrophysics Data System (ADS)
Foster, I. S.; Agranier, A.; Heubeck, C. E.; Köhler, I.; Homann, M.; Tripati, A. K.; Nonnotte, P.; Ponzevera, E.; Lalonde, S.
2017-12-01
The emergence of continental crust above sea level in the early Precambrian would have created the first terrestrial habitats and initiated atmosphere-driven weathering of the continents, yet the history of continental emergence is largely unknown [1]. Precambrian chemical sediments, specifically Banded Iron Formation (BIF), appear to have sampled the Hf-Nd isotope composition of ancient seawater and may preserve a historical record of the emergence of the continental landmass [2] via Lu/Hf fractionation induced by subaerial differential weathering [3,4]. However, paired Hf-Nd isotope data are available for only one BIF to date, indicating appreciable emerged continental landmass ca. 2.7 Ga [2]. Our work extends this record back into the Eo- and Mesoarchean using samples of 3.8 Ga BIF from Isua, Greenland, and 3.2 Ga BIF from the Moodies Group, South Africa. Isua samples appear to have been altered by amphibolite-grade metamorphism, whereas Moodies Group samples, having experienced significantly lower metamorphic grades, appear primary. Moodies samples appear to retain their primary seawater signatures; however, their range of εHf(i) values, from -54.6 to +40.7, is among the most extreme ever reported. Such extreme values may be indicative of one of several possibilities: unusual and intense sedimentary Lu/Hf fractionation during the Mesoarchean relative to today, sampling of a continuum of compositions from two sources with distinct Hf compositions, or early diagenetic processes occurring soon after the deposition of the Moodies Group BIF. These results suggest that interpretation of εHf and εNd data from BIF is not as straightforward as previously suggested [2], and positive εHf values are not necessarily indicative of emerged continental crust. [1] Flament et al. (2013), Precambrian Research, 229, 177-188. [2] Viehmann et al. (2014), Geology, 42, 115-118. [3] Bayon et al. (2006), Geology, 34, 433-436. [4] Vervoort et al. (2011), Geochimica et Cosmochimica Acta, 75, 5903-5926.
Climate forecasting services: coming down from the ivory tower
NASA Astrophysics Data System (ADS)
Doblas-Reyes, F. J.; Caron, L. P.; Cortesi, N.; Soret, A.; Torralba, V.; Turco, M.; González Reviriego, N.; Jiménez, I.; Terrado, M.
2016-12-01
Subseasonal-to-seasonal (S2S) climate forecasts are increasingly used across a range of application areas (energy, water management, agriculture, health, insurance) through tailored services following the climate services paradigm. In this contribution we show the value of climate forecasting services through several examples of their application in the energy, reinsurance and agriculture sectors. Climate services aim at making climate information action-oriented. In a climate forecasting context, the task starts with the identification of climate variables, thresholds and events relevant to the users. These elements are then analysed to determine whether they can be reliably and skilfully predicted at appropriate time scales. In this contribution we assess climate predictions of precipitation, temperature and wind indices from state-of-the-art operational multi-model forecast systems, and whether they meet the expectations and requests of a range of users. This requires going beyond the more traditional assessment of monthly mean values to include assessments of the global forecast quality of the frequency of warm, cold, windy and wet extremes (e.g. [1], [2]), as well as of tools like the Euro-Atlantic weather regimes [3]. The forecast quality of extremes is generally similar to or slightly lower than that of monthly or seasonal averages, but offers information closer to what some users require. In addition to considering local climate variables, we also explore the use of large-scale climate indices, such as ENSO and NAO, that are associated with large regional synchronous variations of wind or tropical storm frequency. These indices help illustrate the relative merits of climate forecast information to users and are the cornerstone of the climate stories that engage them in the co-production of climate information. [1] Doblas-Reyes et al., WIREs, 2013. [2] Pepler et al., Weather and Climate Extremes, 2015. [3] Pavan and Doblas-Reyes, Clim Dyn, 2013.
NASA Technical Reports Server (NTRS)
Rubin, C.; Xu, G.; Judex, S.
2001-01-01
It is generally believed that mechanical signals must be large in order to be anabolic to bone tissue. Recent evidence indicates, however, that extremely low-magnitude (<10 microstrain) mechanical signals readily stimulate bone formation if induced at a high frequency. We examined the ability of extremely low-magnitude, high-frequency mechanical signals to restore anabolic bone cell activity inhibited by disuse. Adult female rats were randomly assigned to six groups: baseline control, age-matched control, mechanically stimulated for 10 min/day, disuse (hind limb suspension), disuse interrupted by 10 min/day of weight bearing, and disuse interrupted by 10 min/day of mechanical stimulation. After a 28 day protocol, bone formation rates (BFR) in the proximal tibia of mechanically stimulated rats increased compared with age-matched control (+97%). Disuse alone reduced BFR (-92%), a suppression only slightly curbed when disuse was interrupted by 10 min of weight bearing (-61%). In contrast, disuse interrupted by 10 min per day of low-level mechanical intervention normalized BFR to values seen in age-matched controls. This work indicates that this noninvasive, extremely low-level stimulus may provide an effective biomechanical intervention for the bone loss that plagues long-term space flight, bed rest, or immobilization caused by paralysis.
Stress transfer mechanisms at the submicron level for graphene/polymer systems.
Anagnostopoulos, George; Androulidakis, Charalampos; Koukaras, Emmanuel N; Tsoukleri, Georgia; Polyzos, Ioannis; Parthenios, John; Papagelis, Konstantinos; Galiotis, Costas
2015-02-25
The stress transfer mechanism from a polymer substrate to a nanoinclusion, such as a graphene flake, is of extreme interest for the production of effective nanocomposites. Previous work conducted mainly at the micron scale has shown that the intrinsic mechanism of stress transfer is shear at the interface. However, since the interfacial shear takes its maximum value at the very edge of the nanoinclusion, it is important to assess the effect of edge integrity upon axial stress transfer at the submicron scale. Here, we conduct a detailed Raman line mapping near the edges of a monolayer graphene flake simply supported on an epoxy-based photoresist (SU8)/poly(methyl methacrylate) matrix, at steps as small as 100 nm. We show for the first time that the distribution of axial strain (stress) along the flake deviates somewhat from the classical shear-lag prediction within a region of ∼2 μm from the edge. This behavior is mainly attributed to the presence of residual stresses, unintentional doping, and/or edge effects (deviation from the equilibrium values of bond lengths and angles, as well as different edge chiralities). By considering a simple balance of shear-to-normal stresses at the interface, we are able to directly convert the strain (stress) gradient into values of interfacial shear stress for all the applied tensile levels without assuming classical shear-lag behavior. For large flakes a maximum interfacial shear stress of 0.4 MPa is obtained prior to flake slipping.
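The conversion from a measured strain gradient to interfacial shear stress via a shear-to-normal stress balance can be sketched numerically. For a thin flake the balance gives τ(x) = t·E·dε/dx; the strain profile, transfer length, and saturation strain below are invented for illustration and are not the mapped Raman data:

```python
import numpy as np

# Force balance at the interface of a thin flake of thickness t and modulus E:
#   tau(x) = t * d(sigma)/dx = t * E * d(epsilon)/dx
t = 0.335e-9   # graphene monolayer thickness, m
E = 1.0e12     # Young's modulus of graphene, Pa (~1 TPa)

# Synthetic axial strain profile near a flake edge, sampled at 100 nm steps,
# saturating over a ~1 um transfer length (shear-lag-like shape).
x = np.arange(0.0, 5.0e-6, 100e-9)          # position from the edge, m
eps_max, L = 1.2e-3, 1.0e-6                 # assumed saturation strain and length
strain = eps_max * (1.0 - np.exp(-x / L))   # axial strain (dimensionless)

# Convert the strain gradient to interfacial shear stress by finite differences.
tau = t * E * np.gradient(strain, x)        # Pa
print(f"max interfacial shear stress: {tau.max() / 1e6:.2f} MPa")
```

With these assumed numbers the peak shear stress lands near the 0.4 MPa scale quoted in the abstract, but that agreement is by construction of the illustrative profile.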
A regional strategy for ecological sustainability: A case study in Southwest China.
Wu, Xue; Liu, Shiliang; Cheng, Fangyan; Hou, Xiaoyun; Zhang, Yueqiu; Dong, Shikui; Liu, Guohua
2018-03-01
Partitioning, a method considering environmental protection and development potential, is an effective way to provide regional management strategies to maintain ecological sustainability. In this study, we provide a large-scale regional division approach and present a strategy for Southwest China, which also has extremely high development potential because of the "Western development" policy. Based on the superposition of 15 factors, including species diversity, pattern restriction, agricultural potential, accessibility, urbanization potential, and topographical limitations, the environmental value and development benefit in the region were quantified spatially by weighting the sum of indicators within environmental and development categories. By comparing the scores with their respective median values, the study area was divided into four different strategy zones: Conserve zones (34.94%), Construction zones (32.95%), Conflict zones (16.96%), and Low-tension zones (15.16%). The Conflict zones in which environmental value and development benefit were both higher than the respective medians were separated further into the following 5 levels: Extreme conflict (36.20%), Serious conflict (28.07%), Moderate conflict (12.28%), Minor conflict (6.55%), and Slight conflict (16.91%). We found that 9.04% of nature reserves were in Conflict zones, and thus should be given more attention. This study provides a simple and feasible method for regional partitioning, as well as comprehensive support that weighs both the environmental value and development benefit for China's current Ecological Red Line and space planning and for regional management in similar situations. Copyright © 2017 Elsevier B.V. All rights reserved.
BAL QSOs AND EXTREME UFOs: THE EDDINGTON CONNECTION
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zubovas, Kastytis; King, Andrew, E-mail: kastytis.zubovas@ftmc.lt
We suggest a common physical origin connecting the fast, highly ionized winds (UFOs) seen in nearby active galactic nuclei (AGNs), and the slower and less ionized winds of broad absorption line (BAL) QSOs. The primary difference is the mass-loss rate in the wind, which is ultimately determined by the rate at which mass is fed toward the central supermassive black hole (SMBH) on large scales. This is below the Eddington accretion rate in most UFOs, and slightly super-Eddington in extreme UFOs such as PG1211+143, but ranges up to ~10-50 times this in BAL QSOs. For UFOs this implies black hole accretion rates and wind mass-loss rates which are at most comparable to Eddington, giving fast, highly ionized winds. In contrast, BAL QSO black holes have mildly super-Eddington accretion rates, and drive winds whose mass-loss rates are significantly super-Eddington, and so are slower and less ionized. This picture correctly predicts the velocities and ionization states of the observed winds, including the recently discovered one in SDSS J1106+1939. We suggest that luminous AGNs may evolve through a sequence from BAL QSO through LoBAL to UFO-producing Seyfert or quasar as their Eddington factors drop during the decay of a bright accretion event. LoBALs correspond to a short-lived stage in which the AGN radiation pressure largely evacuates the ionization cone, but before the large-scale accretion rate has dropped to the Eddington value. We show that sub-Eddington wind rates would produce an M-σ relation lying above that observed. We conclude that significant SMBH mass growth must occur in super-Eddington phases, either as BAL QSOs, extreme UFOs, or obscured from direct observation.
Approaching the exa-scale: a real-world evaluation of rendering extremely large data sets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Patchett, John M; Ahrens, James P; Lo, Li - Ta
2010-10-15
Extremely large scale analysis is becoming increasingly important as supercomputers and their simulations move from petascale to exascale. The lack of dedicated hardware acceleration for rendering on today's supercomputing platforms motivates our detailed evaluation of the possibility of interactive rendering on the supercomputer. In order to facilitate our understanding of rendering on the supercomputing platform, we focus on the scalability of rendering algorithms and architectures envisioned for exascale datasets. To understand the tradeoffs of dealing with extremely large datasets, we compare three different rendering algorithms for large polygonal data: software-based ray tracing, software-based rasterization and hardware-accelerated rasterization. We present a case study of strong and weak scaling of rendering extremely large data on both GPU- and CPU-based parallel supercomputers using ParaView, a parallel visualization tool. We use three different data sets: two synthetic and one from a scientific application. At extreme scale, algorithmic rendering choices make a difference and should be considered when approaching exascale computing, visualization, and analysis. We find that software-based ray tracing offers a viable approach for scalable rendering of the projected future massive data sizes.
Piras, Monica; Mascaro, Giuseppe; Deidda, Roberto; Vivoni, Enrique R
2016-02-01
The Mediterranean region is characterized by high precipitation variability, often enhanced by orography, with strong seasonality and large inter-annual fluctuations, and by high heterogeneity of terrain and land surface properties. As a consequence, catchments in this area are often prone to hydrometeorological extremes, including storms, floods and flash floods. A number of climate studies focused on the Mediterranean region predict that extreme events will occur with higher intensity and frequency, thus requiring further analyses to assess their effect at the land surface, particularly in small and medium-sized watersheds. In this study, climate and hydrologic simulations produced within the Climate Induced Changes on the Hydrology of Mediterranean Basins (CLIMB) EU FP7 research project were used to analyze how precipitation extremes propagate into discharge extremes in the Rio Mannu basin (472.5 km²), located in Sardinia, Italy. The basin's hydrologic response to climate forcings in a reference (1971-2000) and a future (2041-2070) period was simulated through the combined use of a set of global and regional climate models, statistical downscaling techniques, and a process-based distributed hydrologic model. We analyzed and compared the distributions of annual maxima extracted from hourly and daily precipitation and peak discharge time series simulated by the hydrologic model under climate forcing. For this aim, yearly maxima were fit with the Generalized Extreme Value (GEV) distribution using a regional approach. Next, we discuss common and contrasting behaviors of precipitation and discharge maxima distributions to better understand how hydrological transformations affect the propagation of extremes. Finally, we show how rainfall statistical downscaling algorithms produce more reliable forcings for hydrological models than coarse climate model outputs. Copyright © 2015 Elsevier B.V. All rights reserved.
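The block-maxima GEV fit used above can be sketched as follows. The hourly series is a synthetic stand-in for the downscaled climate forcing (a gamma rainfall generator is an assumption), and no regional pooling across sites is shown:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Synthetic 30-year hourly precipitation series (mm/h); stand-in for model output.
hours_per_year = 24 * 365
years = 30
series = rng.gamma(shape=0.05, scale=2.0, size=(years, hours_per_year))

# Block-maxima approach: take annual maxima of hourly intensity, then fit a GEV.
annual_max = series.max(axis=1)
c, loc, scale = stats.genextreme.fit(annual_max)

# Return levels for a few return periods from the fitted distribution.
for T in (5, 20, 100):
    rl = stats.genextreme.ppf(1.0 - 1.0 / T, c, loc=loc, scale=scale)
    print(f"{T:>3}-yr hourly intensity: {rl:.1f} mm/h")
```

In the regional approach of the study, the fit would be pooled across gauges rather than performed on a single 30-year sample as here.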
NASA Astrophysics Data System (ADS)
Sippel, S.; Otto, F. E. L.; Forkel, M.; Allen, M. R.; Guillod, B. P.; Heimann, M.; Reichstein, M.; Seneviratne, S. I.; Kirsten, T.; Mahecha, M. D.
2015-12-01
Understanding, quantifying and attributing the impacts of climatic extreme events and variability is crucial for societal adaptation in a changing climate. However, climate model simulations generated for this purpose typically exhibit pronounced biases in their output that hinder any straightforward assessment of impacts. To overcome this issue, various bias correction strategies are routinely used to alleviate climate model deficiencies; most of these have been criticized for physical inconsistency and for not preserving the multivariate correlation structure. We assess how biases and their correction affect the quantification and attribution of simulated extremes and variability in (i) climatological variables and (ii) impacts on ecosystem functioning as simulated by a terrestrial biosphere model. Our study demonstrates that assessments of simulated climatic extreme events and impacts in the terrestrial biosphere are highly sensitive to bias correction schemes, with major implications for the detection and attribution of these events. We introduce a novel ensemble-based resampling scheme, built on a large regional climate model ensemble generated by the distributed weather@home setup [1], which fully preserves the physical consistency and multivariate correlation structure of the model output. We use extreme value statistics to show that this procedure considerably improves the representation of climatic extremes and variability. Subsequently, biosphere-atmosphere carbon fluxes are simulated using a terrestrial ecosystem model (LPJ-GSI) to further demonstrate the sensitivity of ecosystem impacts to the methodology of bias-correcting climate model output. We find that uncertainties arising from bias correction schemes are comparable in magnitude to model structural and parameter uncertainties. The present study is a first attempt to alleviate climate model biases in a physically consistent way and demonstrates that this yields improved simulations of climate extremes and associated impacts. [1] http://www.climateprediction.net/weatherathome/
Large, nonsaturating thermopower in a quantizing magnetic field
Fu, Liang
2018-01-01
The thermoelectric effect is the generation of an electrical voltage from a temperature gradient in a solid material due to the diffusion of free charge carriers from hot to cold. Identifying materials with a large thermoelectric response is crucial for the development of novel electric generators and coolers. We theoretically consider the thermopower of Dirac/Weyl semimetals subjected to a quantizing magnetic field. We contrast their thermoelectric properties with those of traditional heavily doped semiconductors and show that, under a sufficiently large magnetic field, the thermopower of Dirac/Weyl semimetals grows linearly with the field without saturation and can reach extremely high values. Our results suggest an immediate pathway for achieving record-high thermopower and thermoelectric figure of merit, and they compare well with a recent experiment on Pb1–xSnxSe. PMID:29806031
Regional estimation of extreme suspended sediment concentrations using watershed characteristics
NASA Astrophysics Data System (ADS)
Tramblay, Yves; Ouarda, Taha B. M. J.; St-Hilaire, André; Poulin, Jimmy
2010-01-01
The number of stations monitoring daily suspended sediment concentration (SSC) has been decreasing since the 1980s in North America, while suspended sediment is considered a key variable for water quality. The objective of this study is to test the feasibility of regionalising extreme SSC, i.e. estimating extreme SSC values for ungauged basins. Annual maximum SSC for 72 rivers in Canada and the USA were modelled with probability distributions in order to estimate quantiles corresponding to different return periods. Regionalisation techniques, originally developed for flood prediction in ungauged basins, were tested using the climatic, topographic, land cover and soil attributes of the watersheds. Two approaches were compared, using either physiographic characteristics or the seasonality of extreme SSC to delineate the regions. Multiple regression models estimating SSC quantiles as a function of watershed characteristics were built in each region and compared to a global model including all sites. Regional estimates of SSC quantiles were compared with the local values. Results show that regional estimation of extreme SSC is more efficient than a global regression model including all sites. Groups/regions of stations were identified using either the watershed characteristics or the seasonality of occurrence of extreme SSC values, providing a method to better describe extreme SSC events. The most important variables for predicting extreme SSC are the percentage of clay in the soils, precipitation intensity and forest cover.
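The regional regression of SSC quantiles on watershed attributes can be sketched as below. The attributes are synthetic and the log-linear relation and its coefficients are hypothetical; only the choice of predictors (clay fraction, precipitation intensity, forest cover) echoes the variables the study found important:

```python
import numpy as np

rng = np.random.default_rng(3)
n_sites = 72

# Synthetic watershed attributes for gauged sites (illustrative, not the study data).
clay = rng.uniform(5, 40, n_sites)     # % clay in soils
p_int = rng.uniform(2, 15, n_sites)    # precipitation intensity, mm/h
forest = rng.uniform(10, 90, n_sites)  # % forest cover

# Assumed log-linear relation between the 10-yr SSC quantile and the attributes.
log_q10 = 2.0 + 0.03 * clay + 0.05 * p_int - 0.01 * forest + rng.normal(0, 0.1, n_sites)

# Fit the regional regression by ordinary least squares.
X = np.column_stack([np.ones(n_sites), clay, p_int, forest])
beta, *_ = np.linalg.lstsq(X, log_q10, rcond=None)

# Estimate the quantile at a hypothetical "ungauged" site from its attributes alone.
x_new = np.array([1.0, 25.0, 8.0, 40.0])
q10_est = np.exp(x_new @ beta)
print(f"estimated 10-yr SSC quantile at the ungauged site: {q10_est:.1f}")
```

In the study this regression is fitted separately within each delineated region, which is what makes the regional estimates outperform the single global model.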
The effects of temperature and salinity on phosphate levels in two euryhaline crustacean species
NASA Astrophysics Data System (ADS)
Spaargaren, D. H.; Richard, P.; Ceccaldi, H. J.
Total phosphate, inorganic phosphate and organic (phospholipid) phosphate concentrations were determined in the blood of Carcinus maenas and in whole-animal homogenates of Penaeus japonicus acclimatized to various salinities and a high or a low temperature. In the blood of Carcinus, total and inorganic P concentrations range between 1.0 and 4.5 mmol · l-1; the amount of phospholipids is negligible. The higher values were found at the more extreme salinities. Low temperature is associated with low phosphate concentrations, particularly at intermediate salinities. Total P concentrations in Penaeus homogenates range between 10 and 60 mmol · l-1; phospholipid concentrations range between zero and 50 mmol · l-1. The higher values are again found at the extreme salinities. Inorganic P concentrations are almost constant, at ca. 10 mmol · l-1. No apparent effect of temperature on phosphate concentrations was observed. The results show clearly that osmotic stress severely influences the phosphate metabolism of the two species studied. Both species are able to accumulate phosphate at all experimental temperature/salinity combinations used, even when deprived of food. At extreme salinities, large quantities of phosphate are accumulated and converted to organic P compounds, most likely phospholipids associated with the cell membranes. These effects of osmotic conditions on phosphate metabolism may offer an explanation for the effect of Ca++ on membrane permeability, as the regulation of both ions may be strongly interrelated, often under hormonal control.
Computational data sciences for assessment and prediction of climate extremes
NASA Astrophysics Data System (ADS)
Ganguly, A. R.
2011-12-01
Climate extremes may be defined inclusively as severe weather events or large shifts in global or regional weather patterns, which may be caused or exacerbated by natural climate variability or climate change. This area of research arguably represents one of the largest knowledge gaps in climate science relevant for informing resource managers and policy makers. While physics-based climate models are essential in view of non-stationary and nonlinear dynamical processes, their current pace of uncertainty reduction may not be adequate for urgent stakeholder needs. The structure of the models may in some cases preclude reduction of uncertainty for critical processes at the scales or for the extremes of interest. On the other hand, methods based on complex networks, extreme value statistics, machine learning, and space-time data mining have demonstrated significant promise to improve scientific understanding and generate enhanced predictions. When combined with conceptual process understanding at multiple spatiotemporal scales and designed to handle massive data, interdisciplinary data science methods and algorithms may complement or supplement physics-based models. Specific examples from the prior literature and our ongoing work suggest how data-guided improvements may be possible, for example, in the context of ocean meteorology, climate oscillators, teleconnections, and atmospheric process understanding, which in turn can improve projections of regional climate, precipitation extremes and tropical cyclones in a useful and interpretable fashion. A community-wide effort is motivated to develop and adapt computational data science tools for translating climate model simulations into information relevant for adaptation and policy, as well as for improving our scientific understanding of climate extremes from both observed and model-simulated data.
NASA Astrophysics Data System (ADS)
von Trentini, F.; Willkofer, F.; Wood, R. R.; Schmid, F. J.; Ludwig, R.
2017-12-01
The ClimEx project (Climate change and hydrological extreme events - risks and perspectives for water management in Bavaria and Québec) focuses on the effects of climate change on hydro-meteorological extreme events and their implications for water management in Bavaria and Québec. To this end, a hydro-meteorological model chain is applied. It employs the high-performance computing capacity of the Leibniz Supercomputing Centre facility SuperMUC to dynamically downscale 50 members of the global climate model CanESM2 over European and eastern North American domains using the Canadian Regional Climate Model (RCM) CRCM5. Over Europe, this unique single-model ensemble is conjointly analyzed with the latest information provided through the CORDEX initiative to better assess the influence of natural climate variability and climatic change on the dynamics of extreme events. Furthermore, these 50 members of a single RCM will enhance extreme value statistics (extreme return periods) by exploiting the available 1500 model years for the reference period 1981-2010. The RCM output is then used to drive the process-based, fully distributed, deterministic hydrological model WaSiM at high temporal (3 h) and spatial (500 m) resolution. WaSiM and the large ensemble are further used to derive a variety of hydro-meteorological patterns leading to severe flood events. A tool for virtual perfect prediction shall provide a combination of optimal lead time and management strategy to mitigate certain flood events following these patterns.
Stationary and non-stationary extreme value modeling of extreme temperature in Malaysia
NASA Astrophysics Data System (ADS)
Hasan, Husna; Salleh, Nur Hanim Mohd; Kassim, Suraiya
2014-09-01
Extreme annual temperature of eighteen stations in Malaysia is fitted to the Generalized Extreme Value distribution. Stationary and non-stationary models with trend are considered for each station and the Likelihood Ratio test is used to determine the best-fitting model. Results show that three out of eighteen stations i.e. Bayan Lepas, Labuan and Subang favor a model which is linear in the location parameter. A hierarchical cluster analysis is employed to investigate the existence of similar behavior among the stations. Three distinct clusters are found in which one of them consists of the stations that favor the non-stationary model. T-year estimated return levels of the extreme temperature are provided based on the chosen models.
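The stationary-versus-trend comparison described above can be sketched with a likelihood-ratio test between a GEV with constant location and one whose location is linear in time. The synthetic annual maxima below are an assumption for illustration; only numpy and scipy are used, and the stations' actual fits are not reproduced.

```python
# Sketch: likelihood-ratio test of a stationary GEV against one with a linear
# trend in the location parameter (synthetic annual temperature maxima).
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(0)
years = np.arange(50)
# Synthetic annual maxima with a weak upward trend (illustrative)
maxima = 35 + 0.03 * years + rng.gumbel(0, 0.8, size=years.size)

def gev_nll(params, x, t, trend):
    """Negative log-likelihood of a GEV, optionally with linear trend in location."""
    if trend:
        mu0, mu1, sigma, xi = params
        mu = mu0 + mu1 * t
    else:
        mu0, sigma, xi = params
        mu = mu0
    if sigma <= 0:
        return np.inf
    # scipy's genextreme uses shape c = -xi relative to the usual convention
    return -stats.genextreme.logpdf(x, -xi, loc=mu, scale=sigma).sum()

stat = optimize.minimize(gev_nll, [maxima.mean(), maxima.std(), 0.1],
                         args=(maxima, years, False), method="Nelder-Mead")
nonstat = optimize.minimize(gev_nll, [maxima.mean(), 0.0, maxima.std(), 0.1],
                            args=(maxima, years, True), method="Nelder-Mead")

# LR statistic: 2 * (nll_stationary - nll_nonstationary) ~ chi^2 with 1 dof
lr = 2 * (stat.fun - nonstat.fun)
p_value = stats.chi2.sf(lr, df=1)
print(f"LR = {lr:.2f}, p = {p_value:.4f}")
```

A small p-value favors the model that is linear in the location parameter, as found for three of the eighteen stations.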
NASA Astrophysics Data System (ADS)
Parey, S.
2014-12-01
F. J. Acero (Dpto. Física, Universidad de Extremadura, Avda. de Elvas s/n, 06006 Badajoz), S. Parey, T. T. H. Hoang (EDF/R&D, 6 quai Watier, 78401 Chatou Cedex, France), D. Dacunha-Castelle (Laboratoire de Mathématiques, Université Paris 11, Orsay, France). Trends can already be detected in daily rainfall amounts in the Iberian Peninsula (IP), and this will have an impact on the extreme levels. In this study, we compare different ways to estimate future return levels for heavy rainfall, based on statistical extreme value theory. Both Peaks over Threshold (POT) and block maxima with the Generalized Extreme Value (GEV) distribution will be used and their results compared when linear trends are assumed in the parameters: threshold and scale parameter for POT, and location and scale parameter for GEV. But rainfall over the IP is a special variable in that a large number of the values are 0, so the impact of taking this into account is discussed too. Another approach is then tested, based on the evolutions of the mean and variance obtained from the time series of rainy days only, and of the number of rainy days. A statistical test, similar to that designed for temperature in Parey et al. (2013), is used to assess whether the trends in extremes can be considered as mostly due to these evolutions when considering only rainy days. The results show that this is mainly the case: the extremes of the residuals, after removing the trends in mean and standard deviation, cannot be differentiated from those of a stationary process. Thus, future return levels can be estimated from the stationary return level of these residuals and an estimation of the future mean and standard deviation. Moreover, an estimation of the future number of rainy days is used to retrieve the return levels for all days.
All of these comparisons are made for an ensemble of high-quality rainfall time series observed in the Iberian Peninsula over the period 1961-2010, from which we want to estimate the 20-year return level expected in 2020. The evolutions and the impact of the different approaches will be discussed for three seasons: fall, spring and winter. Parey, S., Hoang, T. T. H., Dacunha-Castelle, D.: The importance of mean and variance in predicting changes in temperature extremes, Journal of Geophysical Research: Atmospheres, Vol. 118, 1-12, 2013.
Ionita-Laza, Iuliana; Ottman, Ruth
2011-11-01
The recent progress in sequencing technologies makes possible large-scale medical sequencing efforts to assess the importance of rare variants in complex diseases. The results of such efforts depend heavily on the use of efficient study designs and analytical methods. We introduce here a unified framework for association testing of rare variants in family-based designs or designs based on unselected affected individuals. This framework allows us to quantify the enrichment in rare disease variants in families containing multiple affected individuals and to investigate the optimal design of studies aiming to identify rare disease variants in complex traits. We show that for many complex diseases with small values for the overall sibling recurrence risk ratio, such as Alzheimer's disease and most cancers, sequencing affected individuals with a positive family history of the disease can be extremely advantageous for identifying rare disease variants. In contrast, for complex diseases with large values of the sibling recurrence risk ratio, sequencing unselected affected individuals may be preferable.
Design of pre-optics for laser guide star wavefront sensor for the ELT
NASA Astrophysics Data System (ADS)
Muslimov, Eduard; Dohlen, Kjetil; Neichel, Benoit; Hugot, Emmanuel
2017-12-01
In the present paper, we consider the optical design of a zoom system for active refocusing in laser guide star wavefront sensors. The system is designed according to specifications from the Extremely Large Telescope (ELT)-HARMONI instrument, the first-light integral field spectrograph for the European ELT. The system must provide refocusing of the laser guide star as a function of telescope pointing, under large decentring of the incoming beam. It comprises four moving lens groups, each of them a doublet with one aspherical surface. The advantages and shortcomings of this solution in terms of component displacements and surface complexity are described in detail. It is shown that the system can provide a median residual wavefront error of 13.8-94.3 nm and a maximum value <206 nm, while the exit pupil distortion is 0.26-0.36% for each of the telescope pointing directions.
Excellent magnetocaloric properties of melt-extracted Gd-based amorphous microwires
NASA Astrophysics Data System (ADS)
Bingham, N. S.; Wang, H.; Qin, F.; Peng, H. X.; Sun, J. F.; Franco, V.; Srikanth, H.; Phan, M. H.
2012-09-01
We report upon the excellent magnetocaloric properties of Gd53Al24Co20Zr3 amorphous microwires. In addition to obtaining the large magnetic entropy change (-ΔSM ˜ 10.3 J/kg K at TC ˜ 95 K), an extremely large value of refrigerant capacity (RC ˜ 733.4 J/kg) has been achieved for a field change of 5 T in an array of forty microwires arranged in parallel. This value of RC is about 79% and 103% larger than those of Gd (˜410 J/kg) and Gd5Si2Ge1.9Fe0.1 (˜360 J/kg) regardless of their magnetic ordering temperatures. The design and fabrication of a magnetic bed made of these parallel-arranged microwires would thus be a very promising approach for active magnetic refrigeration for nitrogen liquefaction. Since these microwires can easily be assembled as laminate structures, they have potential applications as a cooling device for micro electro mechanical systems and nano electro mechanical systems.
Estimation of breeding values using selected pedigree records.
Morton, Richard; Howarth, Jordan M
2005-06-01
Fish bred in tanks or ponds cannot easily be tagged individually. The parentage of any individual may be determined by DNA fingerprinting, but this is sufficiently expensive that large numbers cannot be fingerprinted. The measurement of the objective trait can be made on a much larger sample relatively cheaply. This article deals with experimental designs for selecting the individuals to be fingerprinted and for the estimation of individual and family breeding values. The general setup provides estimates both for genetic effects regarded as fixed or random and for fixed effects due to known regressors. The family effects can be well estimated even when very small numbers are fingerprinted, provided that they are the individuals with the most extreme phenotypes.
On the identification of Dragon Kings among extreme-valued outliers
NASA Astrophysics Data System (ADS)
Riva, M.; Neuman, S. P.; Guadagnini, A.
2013-07-01
Extreme values of earth, environmental, ecological, physical, biological, financial and other variables often form outliers to heavy tails of empirical frequency distributions. Quite commonly such tails are approximated by stretched exponential, log-normal or power functions. Recently there has been an interest in distinguishing between extreme-valued outliers that belong to the parent population of most data in a sample and those that do not. The first type, called Gray Swans by Nassim Nicholas Taleb (often confused in the literature with Taleb's totally unknowable Black Swans), is drawn from a known distribution of the tails which can thus be extrapolated beyond the range of sampled values. However, the magnitudes and/or space-time locations of unsampled Gray Swans cannot be foretold. The second type of extreme-valued outliers, termed Dragon Kings by Didier Sornette, may in his view be sometimes predicted based on how other data in the sample behave. This intriguing prospect has recently motivated some authors to propose statistical tests capable of identifying Dragon Kings in a given random sample. Here we apply three such tests to log air permeability data measured on the faces of a Berea sandstone block and to synthetic data generated in a manner statistically consistent with these measurements. We interpret the measurements to be, and generate synthetic data that are, samples from α-stable sub-Gaussian random fields subordinated to truncated fractional Gaussian noise (tfGn). All these data have frequency distributions characterized by power-law tails with extreme-valued outliers about the tail edges.
Modelling hydrological extremes under non-stationary conditions using climate covariates
NASA Astrophysics Data System (ADS)
Vasiliades, Lampros; Galiatsatou, Panagiota; Loukas, Athanasios
2013-04-01
Extreme value theory is a probabilistic theory that can be used to estimate the probabilities of occurrence of future extreme events (e.g. extreme precipitation and streamflow) from past observed records. Traditionally, extreme value theory requires the assumption of temporal stationarity. This assumption implies that the historical patterns of recurrence of extreme events are static over time. However, the hydroclimatic system is nonstationary on time scales that are relevant to extreme value analysis, due to human-mediated and natural environmental change. In this study the generalized extreme value (GEV) distribution is used to assess nonstationarity in annual maximum daily rainfall and streamflow time series at selected meteorological and hydrometric stations in Greece and Cyprus. The GEV distribution parameters (location, scale, and shape) are specified as functions of time-varying covariates and estimated using the conditional density network (CDN) as proposed by Cannon (2010). The CDN is a probabilistic extension of the multilayer perceptron neural network. Model parameters are estimated via the generalized maximum likelihood (GML) approach using the quasi-Newton BFGS optimization algorithm, and the appropriate GEV-CDN model architecture for the selected meteorological and hydrometric stations is selected by fitting increasingly complicated models and choosing the one that minimizes the Akaike information criterion with small-sample-size correction. For all case studies in Greece and Cyprus, different formulations are tested with combinations of stationary and nonstationary parameters of the GEV distribution, linear and non-linear architectures of the CDN, and combinations of the input climatic covariates.
Climatic indices such as the Southern Oscillation Index (SOI), which describes the atmospheric circulation in the eastern tropical Pacific related to the El Niño-Southern Oscillation (ENSO), the Pacific Decadal Oscillation (PDO) index, which varies on an interdecadal rather than interannual time scale, and the atmospheric circulation patterns expressed by the North Atlantic Oscillation (NAO) index are used to express the GEV parameters as functions of the covariates. Results show that the nonstationary GEV model can be an efficient tool to take into account the dependence of extreme value random variables on the temporal evolution of the climate.
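The model-selection step described above, minimizing the small-sample-corrected Akaike information criterion over increasingly complicated fits, can be sketched as follows. The candidate architectures and likelihood values below are hypothetical, purely to show the criterion at work.

```python
# Minimal sketch of selection by corrected AIC (AICc) among candidate models
# of increasing complexity (hypothetical negative log-likelihoods).
def aicc(nll, k, n):
    """Small-sample corrected Akaike information criterion.

    nll: minimized negative log-likelihood, k: number of parameters,
    n: number of block maxima used in the fit.
    """
    aic = 2 * k + 2 * nll
    return aic + (2 * k * (k + 1)) / (n - k - 1)

# Candidate GEV-CDN architectures: (description, nll, parameter count)
candidates = [("stationary GEV", 120.4, 3),
              ("linear location in SOI", 116.9, 4),
              ("nonlinear location+scale", 115.2, 7)]
n = 45  # years of annual maxima (illustrative)
best = min(candidates, key=lambda m: aicc(m[1], m[2], n))
print("selected:", best[0])
```

Note how the correction term penalizes the 7-parameter model heavily at this sample size: despite its better likelihood, the simpler covariate model wins.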
NASA Astrophysics Data System (ADS)
Rieder, Harald E.; Jancso, Leonhardt M.; Rocco, Stefania Di; Staehelin, Johannes; Maeder, Joerg A.; Peter, Thomas; Ribatet, Mathieu; Davison, Anthony C.; de Backer, Hugo; Koehler, Ulf; Krzyścin, Janusz; Vaníček, Karel
2011-11-01
We apply methods from extreme value theory to identify extreme events in high (termed EHOs) and low (termed ELOs) total ozone and to describe the distribution tails (i.e. very high and very low values) of five long-term European ground-based total ozone time series. The influence of these extreme events on observed mean values, long-term trends and changes is analysed. The results show a decrease in EHOs and an increase in ELOs during the last decades, and establish that the observed downward trend in column ozone during the 1970-1990s is strongly dominated by changes in the frequency of extreme events. Furthermore, it is shown that clear ‘fingerprints’ of atmospheric dynamics (NAO, ENSO) and chemistry [ozone depleting substances (ODSs), polar vortex ozone loss] can be found in the frequency distribution of ozone extremes, even if no attribution is possible from standard metrics (e.g. annual mean values). The analysis complements earlier analysis for the world's longest total ozone record at Arosa, Switzerland, confirming and revealing the strong influence of atmospheric dynamics on observed ozone changes. The results provide clear evidence that in addition to ODS, volcanic eruptions and strong/moderate ENSO and NAO events had significant influence on column ozone in the European sector.
Gerdes, N; Farin, E
2016-10-01
Objective: Taking fibromyalgia syndrome (FMS) as an example, the article illustrates a problem that, to our knowledge, has not been addressed in rehabilitation research so far: according to our large dataset, a sizeable proportion of patients had to be sent home with extremely severe burdens (<2nd percentile of the normal population) at discharge, in spite of good improvements during their stay. Data and methods: Since 2009, patients in the RehaKlinikum Bad Säckingen, an in-patient rehab center for orthopedic-rheumatic diseases, have answered the questionnaire "Indicators of Rehabilitation Status" (IRES) at the beginning and the end of their stay. We analysed IRES data of 1803 patients with FMS (94% women). In addition to analyses of change, we determined the degrees of severity at admission and discharge by comparison with the normative sample of the IRES. In order to predict membership of the high-risk group of patients with still "extremely severe" values at discharge, we performed binary logistic regression analyses. Results: At admission, about 90% of the patients showed either "extreme" (65%, <2nd percentile) or "severe" (27%, 2nd-10th percentile) values on the IRES summary score as well as on the scores for "psychological status", "pain", "symptoms of orthopedic and cardiovascular diseases", and "functioning in everyday life". FMS patients thus come to rehabilitation with multiple burdens of a severe to extreme degree. At discharge, the mean summary score had improved with a "strong" effect size of SRM=1.07. In spite of these good overall improvements, however, 37.4% of the patients went home with "extreme" burdens remaining, even though almost 60% of them had experienced "strong" (28%) or "relevant" (31%) improvements. The most important predictor of affiliation to this "high-risk group" was, as expected, the IRES summary score at admission.
But some characteristics of social status, such as lower household income and lower degree of education, were also unexpectedly influential. Conclusion: In rehabilitation research, analyses of change between pre- and post-measurement values should be accompanied by assessments of the severity of rehabilitation status at discharge, because even good improvements do not necessarily mean that a patient has been rehabilitated successfully. © Georg Thieme Verlag KG Stuttgart · New York.
2007-01-01
...cost of defense and the value of insurance against threats purchased by those costs. But a large fraction of the public debate at this time was...happen. One extreme we could ask of people is basically to self-insure. Let them bear the cost of things that happen. That's a pretty powerful incentive...instruments that can be used to shift risk burdens, and the use of insurance and pricing of... [Keynote Address, 2006 Defense Economics Conference]
Kalpathy-Cramer, Jayashree; de Herrera, Alba García Seco; Demner-Fushman, Dina; Antani, Sameer; Bedrick, Steven; Müller, Henning
2014-01-01
Medical image retrieval and classification have been extremely active research topics over the past 15 years. With the ImageCLEF benchmark in medical image retrieval and classification, a standard test bed was created that allows researchers to compare their approaches and ideas on increasingly large and varied data sets including generated ground truth. This article describes the lessons learned in ten evaluation campaigns. A detailed analysis of the data also highlights the value of the resources created. PMID:24746250
Neuromuscular Consequences of an Extreme Mountain Ultra-Marathon
Millet, Guillaume Y.; Tomazin, Katja; Verges, Samuel; Vincent, Christopher; Bonnefoy, Régis; Boisson, Renée-Claude; Gergelé, Laurent; Féasson, Léonard; Martin, Vincent
2011-01-01
We investigated the physiological consequences of one of the most extreme exercises realized by humans in race conditions: a 166-km mountain ultra-marathon (MUM) with 9500 m of positive and negative elevation change. For this purpose, (i) the fatigue induced by the MUM and (ii) the recovery processes over two weeks were assessed. Evaluation of neuromuscular function (NMF) and blood markers of muscle damage and inflammation were performed before and immediately following (n = 22), and 2, 5, 9 and 16 days after the MUM (n = 11) in experienced ultra-marathon runners. Large maximal voluntary contraction decreases occurred after the MUM (−35% [95% CI: −28 to −42%] and −39% [95% CI: −32 to −46%] for the knee extensors (KE) and plantar flexors (PF), respectively), with alteration of maximal voluntary activation, mainly for KE (−19% [95% CI: −7 to −32%]). Significant modifications in markers of muscle damage and inflammation were observed after the MUM, as suggested by the large changes in creatine kinase (from 144±94 to 13,633±12,626 UI L−1), myoglobin (from 32±22 to 1,432±1,209 µg L−1), and C-reactive protein (from <2.0 to 37.7±26.5 mg L−1). Moderate to large reductions in maximal compound muscle action potential amplitude, high-frequency doublet force, and low-frequency fatigue (an index of excitation-contraction coupling alteration) were also observed for both muscle groups. Sixteen days after the MUM, NMF had returned to initial values, with most of the recovery process occurring within 9 days of the race. These findings suggest that the large alterations in NMF after an ultra-marathon race are multi-factorial, including failure of excitation-contraction coupling, which has never been described after prolonged running. It is also concluded that as early as two weeks after such an extreme running exercise, maximal force capacities have returned to baseline. PMID:21364944
NASA Astrophysics Data System (ADS)
Schlögl, Matthias; Laaha, Gregor
2017-04-01
The assessment of road infrastructure exposure to extreme weather events is of major importance for scientists and practitioners alike. In this study, we compare different extreme value approaches and fitting methods with respect to their value for assessing the exposure of transport networks to extreme precipitation and temperature impacts. Based on an Austrian data set from 25 meteorological stations representing diverse meteorological conditions, we assess the added value of partial duration series (PDS) over the standardly used annual maxima series (AMS) in order to give recommendations for performing extreme value statistics of meteorological hazards. Results show the merits of the robust L-moment estimation, which yielded better results than maximum likelihood estimation in 62% of all cases. At the same time, the results question the general assumption that the threshold excess approach (employing PDS) is superior to the block maxima approach (employing AMS) due to information gain. For low return periods (non-extreme events) the PDS approach tends to overestimate return levels as compared to the AMS approach, whereas the opposite behavior was found for high return levels (extreme events). In extreme cases, an inappropriate threshold was shown to lead to considerable biases that may by far outweigh the possible gain of information from including additional extreme events. This effect was visible neither from the square-root criterion nor from standardly used graphical diagnostics (mean residual life plot), but only from a direct comparison of AMS and PDS in combined quantile plots. We therefore recommend performing the AMS and PDS approaches simultaneously in order to select the best-suited approach. This will make the analyses more robust, not only in cases where threshold selection and dependency introduce biases to the PDS approach but also in cases where the AMS contains non-extreme events that may introduce similar biases.
For assessing the performance of extreme events we recommend the use of conditional performance measures that focus on rare events only in addition to standardly used unconditional indicators. The findings of the study directly address road and traffic management but can be transferred to a range of other environmental variables including meteorological and hydrological quantities.
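Running the two approaches side by side, as the authors recommend, can be sketched as follows with scipy. The synthetic daily series, threshold choice and return period are assumptions for illustration, not the Austrian station data.

```python
# Illustrative comparison of AMS (block maxima + GEV) and PDS (threshold
# excesses + GPD) return levels on synthetic daily precipitation.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_years, days = 50, 365
daily = rng.gamma(shape=0.6, scale=8.0, size=(n_years, days))  # mm/day

# AMS: one maximum per year, fitted with the GEV
ams = daily.max(axis=1)
c, loc, scale = stats.genextreme.fit(ams)
rl_ams = stats.genextreme.ppf(1 - 1 / 100, c, loc, scale)  # 100-yr level

# PDS: excesses over a high threshold, fitted with the GPD
u = np.quantile(daily, 0.995)
exc = daily[daily > u] - u
xi, _, beta = stats.genpareto.fit(exc, floc=0)
rate = exc.size / n_years                      # mean exceedances per year
rl_pds = u + stats.genpareto.ppf(1 - 1 / (100 * rate), xi, 0, beta)

print(f"100-year level: AMS {rl_ams:.1f} mm, PDS {rl_pds:.1f} mm")
```

Comparing the two estimates (and their quantile plots) across a range of thresholds is exactly the kind of diagnostic the study found more informative than the square-root criterion or the mean residual life plot alone.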
Kim, Chaeeun; Park, Jun-Cheol; Choi, Sun Young; Kim, Yonghun; Seo, Seung-Young; Park, Tae-Eon; Kwon, Se-Hun; Cho, Byungjin; Ahn, Ji-Hoon
2018-04-01
2D layered materials with sensitive surfaces are promising materials for chemical sensing devices, owing to their extremely large surface-to-volume ratios. However, most chemical sensors based on 2D materials are used in the form of laterally defined active channels, in which the active area is limited to the actual device dimensions. Therefore, a novel approach for fabricating self-formed active-channel devices is proposed based on 2D semiconductor materials with very large surface areas, and their potential gas sensing ability is examined. First, the vertical growth of SnS2 nanocrystals with large surface area via metal-assisted growth on prepatterned metal electrodes is investigated, and self-formed active-channel devices are then demonstrated, without additional patterning, through the selective synthesis of SnS2 nanosheets on the prepatterned electrodes. The self-formed active-channel device exhibits extremely high response values (>2000% at 10 ppm) for NO2 along with excellent NO2 selectivity. Moreover, the NO2 gas response of the gas sensing device with vertically self-formed SnS2 nanosheets is more than two orders of magnitude higher than that of a similar exfoliated SnS2-based device. These results indicate that this facile device fabrication method should be applicable to various systems in which surface area plays an important role. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
On the use of star-shaped genealogies in inference of coalescence times.
Rosenberg, Noah A; Hirsh, Aaron E
2003-01-01
Genealogies from rapidly growing populations have approximate "star" shapes. We study the degree to which this approximation holds in the context of estimating the time to the most recent common ancestor (T_MRCA) of a set of lineages. In an exponential growth scenario, we find that unless the product of population size (N) and growth rate (r) is at least approximately 10^5, the "pairwise comparison estimator" of T_MRCA that derives from the star genealogy assumption has a bias of 10-50%. Thus, the estimator is appropriate only for large populations that have grown very rapidly. The "tree-length estimator" of T_MRCA is more biased than the pairwise comparison estimator, having low bias only for extremely large values of Nr. PMID:12930771
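Under the star-genealogy assumption each pair of lineages has diverged for 2 × T_MRCA, so the pairwise comparison estimator divides the mean pairwise difference by twice the mutation rate. A minimal sketch, with made-up sequences and mutation rate:

```python
# Sketch of the pairwise comparison estimator of T_MRCA under a star
# genealogy (illustrative sequences; mu is a hypothetical per-sequence
# mutation rate per generation).
from itertools import combinations

def pairwise_tmrca(seqs, mu):
    """Estimate T_MRCA assuming a star genealogy."""
    diffs = [sum(a != b for a, b in zip(s, t))
             for s, t in combinations(seqs, 2)]
    return sum(diffs) / len(diffs) / (2 * mu)

seqs = ["ACGTACGTAA", "ACGTACCTAA", "ACCTACGTAA", "ACGTTCGTAA"]
print(f"estimated T_MRCA: {pairwise_tmrca(seqs, mu=1e-3):.0f} generations")
# -> 750 generations (mean pairwise difference 1.5, divided by 2*mu)
```

The paper's point is that this estimator is badly biased unless Nr is very large, i.e. unless the genealogy really is close to star-shaped.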
NASA Astrophysics Data System (ADS)
Wang, Cailin; Ren, Xuehui; Li, Ying
2017-04-01
We defined the threshold of extreme precipitation using detrended fluctuation analysis of daily precipitation during 1955-2013 in Kuandian County, Liaoning Province. Three-dimensional copulas were introduced to analyze the characteristics of four extreme precipitation factors: the annual number of extreme precipitation days, the extreme precipitation amount, the annual average extreme precipitation intensity, and the contribution rate of extreme precipitation. The results show that (1) the threshold is 95.0 mm; extreme precipitation events generally occur 1-2 times a year; the average extreme precipitation intensity is 100-150 mm; and the extreme precipitation amount is 100-270 mm, accounting for 10-37% of annual precipitation. (2) The generalized extreme value distribution, the extreme value distribution, and the generalized Pareto distribution are suitable for fitting the distribution function of each extreme precipitation factor, and the Ali-Mikhail-Haq (AMH) copula function reflects the joint characteristics of the extreme precipitation factors. (3) The return periods of the three types show significant synchronicity, and the joint return period and co-occurrence return period lag considerably when the single-factor return period is long, reflecting the inseparability of the extreme precipitation factors. The co-occurrence return period is longer than both the single-factor and the joint return period. (4) Single-factor fitting reflects only the information of each extreme precipitation factor in isolation and ignores the relationships between factors, whereas three-dimensional copulas capture the joint structure of the extreme precipitation factors and are closer to reality. The copula function is thus potentially widely applicable to the multiple factors of extreme precipitation.
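The two return-period notions contrasted above ("joint", either variable exceeds; "co-occurrence", both exceed) can be sketched with a bivariate AMH copula. The study uses three-dimensional copulas; this two-variable version, with illustrative probabilities and dependence parameter, only shows the mechanics.

```python
# Bivariate sketch of copula-based return periods with the Ali-Mikhail-Haq
# (AMH) copula (illustrative marginal probabilities and theta).
def amh_copula(u, v, theta):
    """AMH copula CDF C(u, v); theta in [-1, 1)."""
    return (u * v) / (1 - theta * (1 - u) * (1 - v))

def return_periods(u, v, theta, mu=1.0):
    """'OR' (either exceeds) and 'AND' (both exceed) return periods.

    u, v: marginal non-exceedance probabilities; mu: mean interarrival
    time of events in years.
    """
    c = amh_copula(u, v, theta)
    t_or = mu / (1 - c)                # joint return period
    t_and = mu / (1 - u - v + c)       # co-occurrence return period
    return t_or, t_and

t_or, t_and = return_periods(0.99, 0.99, 0.5)
print(f"OR: {t_or:.0f} yr, AND: {t_and:.0f} yr")  # co-occurrence is rarer
```

As in the study's result (3), the co-occurrence return period is always the longest, since both extremes must happen together.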
Nilsson, Dan-E; Warrant, Eric J; Johnsen, Sönke; Hanlon, Roger T; Shashar, Nadav
2013-09-08
We recently reported (Curr Biol 22:683-688, 2012) that the eyes of giant and colossal squid can grow to three times the diameter of the eyes of any other animal, including large fishes and whales. As an explanation for this extreme absolute eye size, we developed a theory of visual performance in aquatic habitats, leading to the conclusion that the huge eyes of giant and colossal squid are uniquely suited for the detection of sperm whales, which are important squid predators in the depths where these squid live. A paper in this journal by Schmitz et al. (BMC Evol Biol 13:45, 2013) refutes our conclusions on the basis of two claims: (1) using allometric data they argue that the eyes of giant and colossal squid are not unexpectedly large for the size of the squid, and (2) a revision of the values used for modelling indicates that large eyes are no better for detecting approaching sperm whales than for any other task. We agree with Schmitz et al. that their revised values for the intensity and abundance of planktonic bioluminescence may be more realistic, or at least more appropriately conservative, but argue that their conclusions are incorrect because they have not considered some of the main arguments put forward in our paper. We also present new modelling to demonstrate that our conclusions remain robust, even with the revised input values suggested by Schmitz et al.
Extreme Value Theory and the New Sunspot Number Series
NASA Astrophysics Data System (ADS)
Acero, F. J.; Carrasco, V. M. S.; Gallego, M. C.; García, J. A.; Vaquero, J. M.
2017-04-01
Extreme value theory was employed to study solar activity using the new sunspot number index. The block maxima approach was used at yearly (1700-2015), monthly (1749-2016), and daily (1818-2016) scales, selecting the maximum sunspot number value for each solar cycle, and the peaks-over-threshold (POT) technique was used after a declustering process only for the daily data. Both techniques led to negative values for the shape parameters. This implies that the extreme sunspot number value distribution has an upper bound. The return level (RL) values obtained from the POT approach were greater than when using the block maxima technique. Regarding the POT approach, the 110 year (550 and 1100 year) RLs were lower (higher) than the daily maximum observed sunspot number value of 528. Furthermore, according to the block maxima approach, the 10-cycle RL lay within the block maxima daily sunspot number range, as expected, but it was striking that the 50- and 100-cycle RLs were also within that range. Thus, it would seem that the RL is reaching a plateau, and, although one must be cautious, it would be difficult to attain sunspot number values greater than 550. The extreme value trends from the four series (yearly, monthly, and daily maxima per solar cycle, and POT after declustering the daily data) were analyzed with the Mann-Kendall test and Sen’s method. Only the negative trend of the daily data with the POT technique was statistically significant.
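The block maxima workflow and the sign convention behind "negative shape parameter implies an upper bound" can be sketched in Python with SciPy. The data here are synthetic stand-ins for per-cycle maximum sunspot numbers; all parameter values are illustrative:

```python
import numpy as np
from scipy.stats import genextreme

# Synthetic stand-in for block maxima, drawn from a bounded
# (reverse-Weibull-type) GEV distribution.
rng = np.random.default_rng(42)
data = genextreme.rvs(c=0.3, loc=150, scale=60, size=300, random_state=rng)

c_hat, loc_hat, scale_hat = genextreme.fit(data)
# SciPy's shape c has the opposite sign of the usual GEV xi (xi = -c), so
# c_hat > 0 here corresponds to a *negative* shape parameter in the paper's
# convention, i.e. a distribution with a finite upper bound:
upper_bound = loc_hat + scale_hat / c_hat

# 100-block return level = the (1 - 1/100) quantile of the fitted GEV
rl_100 = genextreme.ppf(1 - 1 / 100, c_hat, loc=loc_hat, scale=scale_hat)
print(c_hat > 0, rl_100 < upper_bound)
```

Any finite-return-period level stays below the fitted upper endpoint, which is the behaviour the abstract describes as the return level "reaching a plateau".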
Minimizing metastatic risk in radiotherapy fractionation schedules
NASA Astrophysics Data System (ADS)
Badri, Hamidreza; Ramakrishnan, Jagdish; Leder, Kevin
2015-11-01
Metastasis is the process by which cells from a primary tumor disperse and form new tumors at distant anatomical locations. The treatment and prevention of metastatic cancer remains an extremely challenging problem. This work introduces a novel biologically motivated objective function to the radiation optimization community that takes into account metastatic risk instead of the status of the primary tumor. In this work, we consider the problem of developing fractionated irradiation schedules that minimize production of metastatic cancer cells while keeping normal tissue damage below an acceptable level. A dynamic programming framework is utilized to determine the optimal fractionation scheme. We evaluated our approach on a breast cancer case using the heart and the lung as organs-at-risk (OAR). For small tumor α/β values, hypo-fractionated schedules were optimal, which is consistent with standard models. However, for relatively larger α/β values, we found the type of schedule depended on various parameters such as the time when metastatic risk was evaluated, the α/β values of the OARs, and the normal tissue sparing factors. Interestingly, in contrast to standard models, hypo-fractionated and semi-hypo-fractionated schedules (large initial doses with doses tapering off with time) were suggested even with large tumor α/β values. Numerical results indicate the potential for significant reduction in metastatic risk.
Multiscale numerical simulations of magnetoconvection at low magnetic Prandtl and Rossby numbers.
NASA Astrophysics Data System (ADS)
Maffei, S.; Calkins, M. A.; Julien, K. A.; Marti, P.
2017-12-01
The dynamics of the Earth's outer core is characterized by low values of the Rossby (Ro), Ekman and magnetic Prandtl numbers. These values indicate the large range of temporal and spatial scales that need to be accounted for in realistic numerical simulations of the system. Current direct numerical simulations are not capable of reaching this extreme regime, suggesting that a new class of models is required to account for the rich dynamics expected in the natural system. Here we present results from a quasi-geostrophic, multiscale model based on the scale separation implied by the low Ro typical of rapidly rotating systems. We investigate a plane layer geometry where convection is driven by an imposed temperature gradient and the hydrodynamic equations are modified by a large-scale magnetic field. Analytical investigation shows that at values of the thermal and magnetic Prandtl numbers relevant for liquid metals, the energetic requirements for the onset of convection are not significantly altered even in the presence of strong magnetic fields. Results from strongly forced nonlinear numerical simulations show the presence of an inverse cascade, typical of 2-D turbulence, when no or only a weak magnetic field is applied. For higher values of the magnetic field the inverse cascade is quenched.
A dynamical systems approach to studying midlatitude weather extremes
NASA Astrophysics Data System (ADS)
Messori, Gabriele; Caballero, Rodrigo; Faranda, Davide
2017-04-01
Extreme weather occurrences carry enormous social and economic costs and routinely garner widespread scientific and media coverage. The ability to predict these events is therefore a topic of crucial importance. Here we propose a novel predictability pathway for extreme events, by building upon recent advances in dynamical systems theory. We show that simple dynamical systems metrics can be used to identify sets of large-scale atmospheric flow patterns with similar spatial structure and temporal evolution on time scales of several days to a week. In regions where these patterns favor extreme weather, they afford a particularly good predictability of the extremes. We specifically test this technique on the atmospheric circulation in the North Atlantic region, where it provides predictability of large-scale wintertime surface temperature extremes in Europe up to 1 week in advance.
Tanisawa, Kumpei; Arai, Yasumichi; Hirose, Nobuyoshi; Shimokata, Hiroshi; Yamada, Yoshiji; Kawai, Hisashi; Kojima, Motonaga; Obuchi, Shuichi; Hirano, Hirohiko; Yoshida, Hideyo; Suzuki, Hiroyuki; Fujiwara, Yoshinori; Ihara, Kazushige; Sugaya, Maki; Arai, Tomio; Mori, Seijiro; Sawabe, Motoji; Sato, Noriko; Muramatsu, Masaaki; Higuchi, Mitsuru; Liu, Yao-Wen; Kong, Qing-Peng
2017-01-01
Life span is a complex trait regulated by multiple genetic and environmental factors; however, the genetic determinants of extreme longevity have been largely unknown. To identify the functional coding variants associated with extreme longevity, we performed an exome-wide association study (EWAS) on a Japanese population by using an Illumina HumanExome Beadchip and a focused replication study on a Chinese population. The EWAS on two independent Japanese cohorts consisting of 530 nonagenarians/centenarians demonstrated that the G allele of the CLEC3B missense variant p.S106G was associated with extreme longevity at the exome-wide level of significance (p = 2.33×10⁻⁷, odds ratio [OR] = 1.50). The CLEC3B gene encodes tetranectin, a protein implicated in the mineralization process in osteogenesis as well as in the prognosis and metastasis of cancer. The replication study consisting of 448 Chinese nonagenarians/centenarians showed that the G allele of CLEC3B p.S106G was also associated with extreme longevity (p = .027, OR = 1.51), and the p value of this variant reached 1.87×10⁻⁸ in the meta-analysis of the Japanese and Chinese populations. In conclusion, the present study identified CLEC3B p.S106G as a novel longevity-associated variant, raising the hypothesis that tetranectin, encoded by CLEC3B, plays a role in human longevity and aging. PMID:27154906
The Pace of Perceivable Extreme Climate Change
NASA Astrophysics Data System (ADS)
Tan, X.; Gan, T. Y.
2015-12-01
When the signal of changes in extreme climate will emerge from background climate variability (the Time of Emergence, ToE) is a key question for planning and implementing measures to mitigate the potential impact of climate change on natural and human systems, which are generally adapted to current variability. We estimated ToEs for the magnitude, duration and frequency of global extreme climate represented by 24 extreme climate indices (16 for temperature and 8 for precipitation) with different thresholds of the signal-to-noise (S/N) ratio, based on projections of CMIP5 global climate models under RCP8.5 and RCP4.5 for the 21st century. The uncertainty of ToE is assessed by using 3 different methods to calculate S/N for each extreme index. Results show that ToEs of the projected extreme climate indices under the RCP4.5 scenario generally occur about 20 years later than under RCP8.5. Under RCP8.5, the projected magnitude, duration and frequency of extreme temperature on Earth will all exceed 2 standard deviations by 2100, and the empirical 50th percentiles of the global ToE for the frequency and magnitude of hot (cold) extremes are about 2040 and 2054 (2064 and 2054) for S/N > 2, respectively. The 50th percentile of global ToE for the intensity of extreme precipitation is about 2030 and 2058 for S/N > 0.5 and S/N > 1, respectively. We further evaluated the exposure of ecosystems and human societies to the pace of extreme climate change by determining the year of ToE for various extreme climate indices projected to occur over terrestrial biomes, marine realms and major urban areas with large populations. This was done by overlaying terrestrial ecoregion and population maps with the derived ToE maps to extract ToEs for these regions. Possible relationships between GDP per person and ToE are also investigated by relating the mean ToE for each country to its average GDP per person.
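A minimal sketch of a ToE calculation of the kind described, applied to a synthetic index (linear trend plus interannual noise); the baseline window, smoothing length and S/N threshold are illustrative assumptions, not the paper's exact method:

```python
import numpy as np

def time_of_emergence(years, index, base_slice, threshold=2.0, window=10):
    """First year when the running-mean anomaly of a climate index,
    in units of baseline standard deviation (S/N), exceeds `threshold`.
    Returns None if the signal never emerges."""
    base = index[base_slice]
    noise = base.std(ddof=1)
    signal = np.convolve(index - base.mean(), np.ones(window) / window,
                         mode="valid")
    sn = signal / noise
    emerged = np.nonzero(sn > threshold)[0]
    # years[window-1:] aligns each running mean with the end of its window
    return None if emerged.size == 0 else int(years[window - 1:][emerged[0]])

# Hypothetical warming index: 0.03 units/yr trend plus noise of sd 0.5
rng = np.random.default_rng(0)
years = np.arange(1950, 2101)
index = 0.03 * (years - 1950) + rng.normal(0, 0.5, years.size)
toe = time_of_emergence(years, index, base_slice=slice(0, 30))
print(toe)
```

With these assumed trend and noise levels, emergence at S/N > 2 falls around the turn of the century, illustrating why stronger forcing (larger trend) pulls the ToE earlier.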
NASA Astrophysics Data System (ADS)
Matin, M.; Mondal, Rajib; Barman, N.; Thamizhavel, A.; Dhar, S. K.
2018-05-01
Here, we report an extremely large positive magnetoresistance (XMR) in a single-crystal sample of MoSi2, approaching almost 10⁷% at 2 K in a 14-T magnetic field without appreciable saturation. Hall resistivity data reveal an uncompensated nature of MoSi2, with a departure from electron-hole compensation large enough that strong saturation of the magnetoresistance would be expected in the high-field regime. Magnetotransport and complementary de Haas-van Alphen (dHvA) oscillation results, however, suggest that a strong Zeeman effect causes a magnetic field-induced modulation of the Fermi pockets and drives the system towards the perfect electron-hole compensation condition in the high-field regime. Thus, the nonsaturating XMR of this semimetal arises under the unconventional situation of Zeeman effect-driven electron-hole compensation, whereas its huge magnitude is decided solely by the ultralarge value of the carrier mobility. Intrinsic ultralarge carrier mobility, strong suppression of backward scattering of the charge carriers, and a nontrivial Berry phase in the dHvA oscillations attest to the topological character of MoSi2. Therefore, this semimetal represents another material hosting a combination of topological and conventional electronic phases.
Retainment of r-process material in dwarf galaxies
NASA Astrophysics Data System (ADS)
Beniamini, Paz; Dvorkin, Irina; Silk, Joe
2018-04-01
The synthesis of r-process elements is known to involve extremely energetic explosions. At the same time, recent observations find significant r-process enrichment even in extremely small ultra-faint dwarf (UFD) galaxies. This raises the question of the retainment of those elements within their hosts. We estimate the retainment fraction and find that it is large, ≈0.9, unless the r-process event is very energetic (≳10⁵² erg) and/or the host has lost a large fraction of its gas prior to the event. We estimate the r-process mass per event and rate as implied by abundances in UFDs, taking into account imperfect retainment and different models of UFD evolution. The results are consistent with previous estimates (Beniamini et al. 2016b) and with the constraints from the recently detected macronova accompanying a neutron star merger (GW170817). We also estimate the distribution of abundances predicted by these models. We find that ≈0.07 of UFDs should have r-process enrichment. The results are consistent with both the mean values and the fluctuations of [Eu/Fe] in galactic metal-poor stars, supporting the possibility that UFDs are the main 'building blocks' of the galactic halo population.
NASA Astrophysics Data System (ADS)
Veneziano, D.; Langousis, A.; Lepore, C.
2009-12-01
The annual maximum of the average rainfall intensity in a period of duration d, Iyear(d), is typically assumed to have a generalized extreme value (GEV) distribution. The shape parameter k of that distribution is especially difficult to estimate from either at-site or regional data, making it important to constrain k using theoretical arguments. In the context of multifractal representations of rainfall, we observe that standard theoretical estimates of k from extreme value (EV) and extreme excess (EE) theories do not apply, while estimates from large deviation (LD) theory hold only for very small d. We then propose a new theoretical estimator based on fitting GEV models to the numerically calculated distribution of Iyear(d). A standard result from EV and EE theories is that k depends on the tail behavior of the average rainfall in d, I(d). This result holds if Iyear(d) is the maximum of a sufficiently large number n of variables, all distributed like I(d); therefore its applicability hinges on whether n = 1yr/d is large enough and the tail of I(d) is sufficiently well known. One typically assumes that at least for small d the former condition is met, but poor knowledge of the upper tail of I(d) remains an obstacle for all d. In fact, in the case of multifractal rainfall, even the former condition is not met because, irrespective of d, 1yr/d is too small (Veneziano et al., 2009, WRR, in press). Applying large deviation (LD) theory to this multifractal case, we find that, as d → 0, Iyear(d) approaches a GEV distribution whose shape parameter kLD depends on a region of the distribution of I(d) well below the upper tail, is always positive (in the EV2 range), is much larger than the value predicted by EV and EE theories, and can be readily found from the scaling properties of I(d). The scaling properties of rainfall can be inferred also from short records, but the limitation remains that the result holds only as d → 0, not for finite d.
Therefore, for different reasons, none of the above asymptotic theories applies to Iyear(d). In practice, one is interested in the distribution of Iyear(d) over a finite range of averaging durations d and return periods T. Using multifractal representations of rainfall, we have numerically calculated the distribution of Iyear(d) and found that, although not GEV, the distribution can be accurately approximated by a GEV model. The best-fitting parameter k depends on d, but is insensitive to the scaling properties of rainfall and the range of return periods T used for fitting. We have obtained a default expression for k(d) and compared it with estimates from historical rainfall records. The theoretical function tracks well the empirical dependence on d, although it generally overestimates the empirical k values, possibly due to deviations of rainfall from perfect scaling. This issue is under investigation.
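The finite-n effect discussed above, where the GEV shape fitted to annual maxima differs from the asymptotic EV/EE prediction, can be illustrated numerically. A lognormal intensity lies in the Gumbel domain of attraction (asymptotic k = 0), yet fitting a GEV to maxima over a finite n yields a clearly positive k. This is a generic penultimate-approximation demonstration, not the paper's multifractal model:

```python
import numpy as np
from scipy.stats import genextreme, lognorm

# Maxima of n lognormal draws per "year": asymptotically Gumbel (k = 0),
# but at finite n the best-fitting GEV shape is positive.
rng = np.random.default_rng(1)
n = 365  # hypothetical n = 1 yr / d for daily averaging
annual_max = lognorm.rvs(s=1.0, size=(5000, n), random_state=rng).max(axis=1)

c_hat, loc_hat, scale_hat = genextreme.fit(annual_max)
k_hat = -c_hat  # convert SciPy's shape to the usual GEV sign convention
print(k_hat > 0)
```

The fitted k is insensitive to the asymptotic tail class, echoing the paper's point that practical k values must be obtained from the finite-duration distribution rather than from limit theorems.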
Nonparametric Regression Subject to a Given Number of Local Extreme Value
2001-07-01
Majidi, Ali; Davies, Laurie
[Only OCR fragments of this abstract survive. The paper, part of DTIC compilation report ADP013708 through ADP013761, treats nonparametric regression constrained to a prescribed number of local extremes: the locations of the local extremes are supplied to a smoothing algorithm, and the smoothing problem is made precise as a quadratic program (QP3).]
Persistence Mapping Using EUV Solar Imager Data
NASA Technical Reports Server (NTRS)
Thompson, B. J.; Young, C. A.
2016-01-01
We describe a simple image processing technique that is useful for the visualization and depiction of gradually evolving or intermittent structures in solar physics extreme-ultraviolet imagery. The technique is an application of image segmentation, which we call "Persistence Mapping," to isolate extreme values in a data set, and is particularly useful for the problem of capturing phenomena that are evolving in both space and time. While integration or "time-lapse" imaging uses the full sample (of size N), Persistence Mapping rejects (N - 1)/N of the data set and identifies the most relevant 1/N values using the following rule: if a pixel reaches an extreme value, it retains that value until that value is exceeded. The simplest examples isolate minima and maxima, but any quantile or statistic can be used. This paper demonstrates how the technique has been used to extract the dynamics in long-term evolution of comet tails, erupting material, and EUV dimming regions.
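The retention rule above, where each pixel keeps its most extreme value until that value is exceeded, is a running extremum along the time axis. A minimal NumPy sketch on a toy image stack:

```python
import numpy as np

def persistence_map(frames, extreme="max"):
    """Persistence Mapping: each pixel retains the most extreme value seen
    so far across a time-ordered image stack `frames` of shape (T, H, W)."""
    op = np.maximum.accumulate if extreme == "max" else np.minimum.accumulate
    return op(frames, axis=0)

# Toy 1x1-pixel "movie": a transient brightening at t = 2 persists afterwards
frames = np.array([[[1.0]], [[3.0]], [[7.0]], [[2.0]], [[5.0]]])
pm = persistence_map(frames)
print(pm.ravel().tolist())  # -> [1.0, 3.0, 7.0, 7.0, 7.0]
```

The transient peak (7.0) is retained in every later frame, which is what makes intermittent brightenings visible in the final map.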
[The linear dimensions of human body measurements of Chinese male pilots in standing posture].
Guo, Xiao-chao; Liu, Bao-shan; Xiao, Hui; Wang, Zhi-jie; Li, Rong; Guo, Hui
2003-02-01
To provide the latest anthropometric data of Chinese male pilots on a large scale, 94 linear dimensions of human body measurements were defined, comprising 42 fundamental items and 52 recommended items. Computer databanks were programmed, with subprograms preset for data checking such as extreme value examination, logical judgement of data relationships, and measuring-remeasuring difference tests. All workers were well trained before the measurements began. 1739 male pilots from the Chinese Air Force were measured for the 42 fundamental items, and 904 of these pilots were also measured for the 52 recommended items. The mean, standard deviation, maximum value, minimum value, and the 5th, 50th and 95th percentile data of all 94 items are given. The quality of the data was stable and reliable. All data of the 94 linear dimensions of human body measurements were valid and reliable with high precision.
Research in Stochastic Processes.
1982-12-01
[Only OCR fragments of this report survive. The recoverable content is a list of technical reports from the Center for Stochastic Processes, including Jürg Hüsler's "Extreme values of non-stationary sequences and the extremal index" (Oct. 1982), a report by A. Weron (Oct. 1982), "A finitely additive white noise", a string model by Y. Miyahara (Carleton University and Nagoya University), and further work by J. Hüsler on extreme values of non-stationary sequences.]
More tornadoes in the most extreme U.S. tornado outbreaks
NASA Astrophysics Data System (ADS)
Tippett, Michael K.; Lepore, Chiara; Cohen, Joel E.
2016-12-01
Tornadoes and severe thunderstorms kill people and damage property every year. Estimated U.S. insured losses due to severe thunderstorms in the first half of 2016 were $8.5 billion (US). The largest U.S. effects of tornadoes result from tornado outbreaks, which are sequences of tornadoes that occur in close succession. Here, using extreme value analysis, we find that the frequency of U.S. outbreaks with many tornadoes is increasing and that it is increasing faster for more extreme outbreaks. We model this behavior by extreme value distributions with parameters that are linear functions of time or of some indicators of multidecadal climatic variability. Extreme meteorological environments associated with severe thunderstorms show consistent upward trends, but the trends do not resemble those currently expected to result from global warming.
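Extreme value distributions whose parameters are linear functions of time, as used here, can be fitted by maximum likelihood. A minimal sketch with a nonstationary Gumbel model (a GEV with zero shape) whose location parameter is linear in a time covariate; the data and trend are synthetic, and the paper's actual covariates and distribution family may differ:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import gumbel_r

def neg_log_lik(params, t, x):
    """Negative log-likelihood of a Gumbel model with mu(t) = mu0 + mu1*t."""
    mu0, mu1, log_scale = params
    return -gumbel_r.logpdf(x, loc=mu0 + mu1 * t,
                            scale=np.exp(log_scale)).sum()

# Synthetic "annual most extreme outbreak size" with an upward trend
rng = np.random.default_rng(7)
t = np.arange(60, dtype=float)  # hypothetical years 0..59
x = gumbel_r.rvs(loc=10 + 0.2 * t, scale=3, size=t.size, random_state=rng)

res = minimize(neg_log_lik, x0=[x.mean(), 0.0, np.log(x.std())],
               args=(t, x), method="Nelder-Mead")
mu0_hat, mu1_hat, log_scale_hat = res.x
print(round(mu1_hat, 2))
```

A positive fitted slope `mu1_hat` corresponds to the kind of upward drift in the extreme outbreak distribution that the abstract reports; log-parameterising the scale keeps it positive during optimisation.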
The neutral Higgs self-couplings in the (h)MSSM
NASA Astrophysics Data System (ADS)
Chalons, G.; Djouadi, A.; Quevillon, J.
2018-05-01
We consider the Minimal Supersymmetric extension of the Standard Model in the regime where the supersymmetry-breaking scale is extremely large. In this MSSM, not only the Higgs masses but also the various self-couplings among the Higgs states are affected by large radiative corrections, the dominant part of which is provided by the third-generation quark/squark sector. In this note, assuming that squarks are extremely heavy, we evaluate the next-to-leading order radiative corrections to the two neutral CP-even Higgs self-couplings λHhh and λhhh and to the partial decay width Γ(H → hh) that are most relevant at the LHC. The calculation is performed using an effective field theory approach that resums the large logarithmic squark contributions and allows one to keep the perturbative expansion under control. Since the direct loop vertex corrections are generally missing in this effective approach, we have properly renormalised the effective theory to take them into account. Finally, we compare the results in this effective MSSM with those obtained in a much simpler way in the so-called hMSSM approach, in which the mass value of the lightest Higgs boson, Mh = 125 GeV, is used as an input. We show that the hMSSM provides a reasonably good approximation of the corrected self-couplings and of the H → hh decay rate and, hence, can also be used in these cases.
Changes in extremes due to half a degree warming in observations and models
NASA Astrophysics Data System (ADS)
Fischer, E. M.; Schleussner, C. F.; Pfleiderer, P.
2017-12-01
Assessing the climate impacts of half-a-degree warming increments is high on the post-Paris science agenda. Discriminating those effects is particularly challenging for climate extremes such as heavy precipitation and heat extremes, for which model uncertainties are generally large and for which internal variability is so important that it can easily offset or strongly amplify the forced local changes induced by half a degree of warming. Despite these challenges, we provide evidence for large-scale changes in the intensity and frequency of climate extremes due to half a degree of warming. We first assess the difference in extreme climate indicators in observational data between the 1960s and 1970s and the recent past, two periods that differ by half a degree. We identify distinct differences in the global and continental-scale occurrence of heat and heavy precipitation extremes. We show that those observed changes in heavy precipitation and heat extremes broadly agree with simulated historical differences and are informative for the projected differences between 1.5 and 2°C of warming, despite different radiative forcings. We therefore argue that evidence from the observational record can inform the debate about discernible climate impacts in the light of model uncertainty by providing a conservative estimate of the implications of 0.5°C of warming. A limitation of using the observational record arises from potential non-linearities in the response of climate extremes to a given level of warming. We test for potential non-linearities in the response of heat and heavy precipitation extremes in a large ensemble of transient climate simulations. We further quantify differences between a time-window approach in a coupled-model large ensemble and prescribed-SST time-slice experiments performed in the context of the HAPPI-MIP project.
Thereby we provide different lines of evidence that half a degree warming leads to substantial changes in the expected occurrence of heat and heavy precipitation extremes.
Role of absorbing aerosols on hot extremes in India in a GCM
NASA Astrophysics Data System (ADS)
Mondal, A.; Sah, N.; Venkataraman, C.; Patil, N.
2017-12-01
Temperature extremes and heat waves in North-Central India during the summer months of March through June are known for causing significant impacts on human health, productivity and mortality. While greenhouse gas-induced global warming is generally believed to intensify the magnitude and frequency of such extremes, aerosols are usually associated with an overall cooling, by virtue of their dominant radiation-scattering nature, in most world regions. Recently, large-scale atmospheric conditions leading to heat waves and extreme temperature conditions have been analysed for the North-Central Indian region. However, the role of absorbing aerosols, including black carbon and dust, in mediating hot extremes in the region is still not well understood. In this study, we use 30-year simulations from a chemistry-coupled atmosphere-only General Circulation Model (GCM), ECHAM6-HAM2, forced with evolving aerosol emissions in an interactive aerosol module, along with observed sea surface temperatures, to examine large-scale and mesoscale conditions during hot extremes in India. The model is first validated with observed gridded temperature and reanalysis data, and is found to represent realistically the observed variations in temperature in the North-Central region and the concurrent large-scale atmospheric conditions during high temperature extremes. During these extreme events, changes in near-surface properties include a reduction in single scattering albedo and an enhancement in the short-wave solar heating rate, compared to climatological conditions. This is accompanied by positive anomalies of black carbon and dust aerosol optical depths. We conclude that the large-scale atmospheric conditions such as the presence of anticyclones and clear skies, conducive to heat waves and high temperature extremes, are exacerbated by absorbing aerosols in North-Central India. Future air quality regulations are expected to reduce sulfate particles and their masking of GHG warming.
It is concurrently important to mitigate emissions of warming black carbon particles, to manage future climate change-induced hot extremes.
Stable Isotope Systematics of Martian Perchlorate
NASA Astrophysics Data System (ADS)
Martin, P.; Farley, K. A.; Archer, D., Jr.; Atreya, S. K.; Conrad, P. G.; Eigenbrode, J. L.; Fairen, A.; Franz, H. B.; Freissinet, C.; Glavin, D. P.; Mahaffy, P. R.; Malespin, C.; Ming, D. W.; Navarro-Gonzalez, R.; Sutter, B.
2015-12-01
Chlorine isotopic compositions in HCl released during evolved gas analysis (EGA) runs have been detected by the Sample Analysis at Mars (SAM) instrument on the Curiosity rover, ranging from approximately -9‰ to -50‰ δ37Cl, with two spatially and isotopically separated groups of samples averaging -15‰ and -45‰. These extremely low values are the first such detection in any known natural material; common terrestrial values very rarely exceed ±5‰, and the most extreme isotopic signatures yet detected elsewhere in the solar system are values of around +24‰ on the Moon. The only other known location in the solar system with large negative chlorine isotope anomalies is the Atacama Desert, where perchlorate with -14‰ δ37Cl has been detected. The Atacama perchlorate has unusual Δ17O signatures associated with it, indicating a formation mechanism involving O3, which suggests an atmospheric origin of the perchlorate and the associated large isotopic anomalies. Identification of non-zero positive Δ17O signatures in the O2 released during EGA runs would provide definitive evidence for a similar process having occurred on Mars. Perchlorate is thought to be the most likely source of HCl in EGA runs because of the simultaneous onset of O2 release. If perchlorate is indeed the HCl source, atmospheric chemistry could be responsible for the observed isotopic anomalies, with variable extents of perchlorate production producing the isotopic variability. However, chloride salts have also been observed to release HCl upon heating; if the timing of the O2 release is merely coincidental, the observed HCl could be coming from chlorides. At thermodynamic equilibrium, the fractionation factor of perchlorate reduction is 0.93, meaning that differing amounts of post-deposition reduction of isotopically normal perchlorate to chloride could account for the highly variable Cl isotopes.
Additionally, post-deposition reduction could account for the difference between the two Cl isotopic groups if perchlorate is the HCl source, as the residual perchlorate after reduction will be isotopically heavy. Therefore, conclusive determination of the origin of HCl released during EGA is vital to understanding the origin of this large δ37Cl anomaly.
Eccentricity growth and orbit flip in near-coplanar hierarchical three-body systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Gongjie; Naoz, Smadar; Kocsis, Bence
2014-04-20
The secular dynamical evolution of a hierarchical three-body system in which a distant third object orbits around a binary has been studied extensively, demonstrating that the inner orbit can undergo large eccentricity and inclination oscillations. It was previously shown that, starting with a circular inner orbit, large mutual inclination (40°-140°) can produce long-timescale modulations that drive the eccentricity to extremely large values and can flip the orbit. Here, we demonstrate that starting with an almost coplanar configuration, for eccentric inner and outer orbits, the eccentricity of the inner orbit can still be excited to high values, and the orbit can flip by ∼180°, rolling over its major axis. The ∼180° flip criterion and the flip timescale are described by simple analytic expressions that depend on the initial orbital parameters. With tidal dissipation, this mechanism can produce counter-orbiting exoplanetary systems. In addition, we also show that this mechanism has the potential to enhance the tidal disruption or collision rates for different systems. Furthermore, we explore the entire e₁ and i₀ parameter space that can produce flips.
Extreme value modelling of Ghana stock exchange index.
Nortey, Ezekiel N N; Asare, Kwabena; Mettle, Felix Okoe
2015-01-01
Modelling of extreme events has always been of interest in fields such as hydrology and meteorology. However, after the recent global financial crises, appropriate models for modelling of such rare events leading to these crises have become quite essential in the finance and risk management fields. This paper models the extreme values of the Ghana stock exchange all-shares index (2000-2010) by applying the extreme value theory (EVT) to fit a model to the tails of the daily stock returns data. A conditional approach of the EVT was preferred and hence an ARMA-GARCH model was fitted to the data to correct for the effects of autocorrelation and conditional heteroscedastic terms present in the returns series, before the EVT method was applied. The Peak Over Threshold approach of the EVT, which fits a Generalized Pareto Distribution (GPD) model to excesses above a certain selected threshold, was employed. Maximum likelihood estimates of the model parameters were obtained and the model's goodness of fit was assessed graphically using Q-Q, P-P and density plots. The findings indicate that the GPD provides an adequate fit to the data of excesses. The sizes of extreme daily Ghanaian stock market movements were then computed using the value at risk and expected shortfall risk measures at some high quantiles, based on the fitted GPD model.
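The POT step, fitting a GPD to excesses over a threshold and reading off value at risk (VaR) and expected shortfall (ES), can be sketched as follows. Heavy-tailed synthetic data stand in for the ARMA-GARCH-filtered residuals, and the threshold choice and quantiles are illustrative:

```python
import numpy as np
from scipy.stats import genpareto

# Synthetic heavy-tailed "daily losses" (the ARMA-GARCH filtering step of
# the conditional EVT approach is assumed to have been done already).
rng = np.random.default_rng(3)
losses = rng.standard_t(df=4, size=5000)

u = np.quantile(losses, 0.95)                  # threshold: 95th percentile
excesses = losses[losses > u] - u
xi, _, beta = genpareto.fit(excesses, floc=0)  # GPD fit, location fixed at 0

# VaR and ES at quantile q from the fitted GPD tail (McNeil-Frey formulas)
q = 0.99
n, n_u = losses.size, excesses.size
var_q = u + (beta / xi) * (((n / n_u) * (1 - q)) ** (-xi) - 1)
es_q = (var_q + beta - xi * u) / (1 - xi)
print(var_q < es_q)
```

For a heavy tail (fitted xi > 0), ES strictly exceeds VaR at the same quantile, which is why both measures are reported in the paper.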
Symbolic Analysis of Concurrent Programs with Polymorphism
NASA Technical Reports Server (NTRS)
Rungta, Neha Shyam
2010-01-01
The current trend of multi-core and multi-processor computing is causing a paradigm shift from inherently sequential to highly concurrent and parallel applications. Certain thread interleavings, data input values, or combinations of both often cause errors in the system. Systematic verification techniques such as explicit state model checking and symbolic execution are extensively used to detect errors in such systems [7, 9]. Explicit state model checking enumerates possible thread schedules and input data values of a program in order to check for errors [3, 9]. To partially mitigate the state space explosion from data input values, symbolic execution techniques substitute data input values with symbolic values [5, 7, 6]. Explicit state model checking and symbolic execution techniques used in conjunction with exhaustive search techniques such as depth-first search are unable to detect errors in medium to large-sized concurrent programs because the number of behaviors caused by data and thread non-determinism is extremely large. We present an overview of abstraction-guided symbolic execution for concurrent programs that detects errors manifested by a combination of thread schedules and data values [8]. The technique generates a set of key program locations relevant in testing the reachability of the target locations. The symbolic execution is then guided along these locations in an attempt to generate a feasible execution path to the error state. This allows the execution to focus in parts of the behavior space more likely to contain an error.
NASA Astrophysics Data System (ADS)
Woo, Hye-Jin; Park, Kyung-Ae
2017-09-01
Significant wave height (SWH) data from nine satellite altimeters were validated against in-situ SWH measurements from buoy stations in the East/Japan Sea (EJS) and the Northwest Pacific Ocean. The spatial and temporal variability of extreme SWHs was investigated by defining the 90th, 95th, and 99th percentiles based on percentile analysis. The annual mean of extreme SWHs reached 3.45 m in the EJS, significantly higher than the overall mean of about 1.44 m. The spatial distributions of SWHs showed significantly higher values in the eastern region of the EJS than in the western part. A characteristic seasonality was found in the time series, with high SWHs (>2.5 m) in winter but low values (<1 m) in summer. The trends of the normal and extreme (99th percentile) SWHs in the EJS were positive, at 0.0056 m year-1 and 0.0125 m year-1, respectively. The long-term trend demonstrated that high SWH values became more extreme over the past decades. Predominant spatial distinctions between the coastal regions in the marginal seas of the Northwest Pacific Ocean and open ocean regions are presented. In spring, both normal and extreme SWHs showed substantially increasing trends in the EJS. Finally, we present, for the first time, the impact of the long-term trend of extreme SWHs on the marine ecosystem through vertical mixing enhancement in the upper ocean of the EJS.
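The percentile-based definition of extreme SWH and the linear trend estimation can be illustrated as follows. The wave heights here are synthetic stand-ins for the altimeter record, and the injected trend is an assumption, not the 0.0056/0.0125 m per year values reported above.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic daily significant wave heights (m) over 20 years: gamma-distributed
# noise (mean ~1.4 m) plus a small assumed upward trend.
years = np.repeat(np.arange(1998, 2018), 365)
swh = rng.gamma(shape=2.0, scale=0.7, size=years.size) + 0.02 * (years - 1998)

# Per-year "normal" mean and "extreme" (99th percentile) SWH.
yrs = np.unique(years)
mean_swh = np.array([swh[years == y].mean() for y in yrs])
p99_swh = np.array([np.percentile(swh[years == y], 99) for y in yrs])

# Linear trends (m/year) by least squares, as in the reported trend analysis.
trend_mean = np.polyfit(yrs, mean_swh, 1)[0]
trend_p99 = np.polyfit(yrs, p99_swh, 1)[0]
```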
Lenzuni, Paolo
2015-07-01
The purpose of this article is to develop a method for the statistical inference of the maximum peak sound pressure level and of the associated uncertainty. Both quantities are required by EU directive 2003/10/EC for a complete and solid assessment of noise exposure at the workplace. Based on the characteristics of the sound pressure waveform, it is hypothesized that the distribution of the measured peak sound pressure levels follows the extreme value distribution. The maximum peak level is estimated as the largest member of a finite population following this probability distribution. The associated uncertainty is also discussed, taking into account not only the contribution due to incomplete sampling but also the contribution due to the finite precision of the instrumentation. The largest of the set of measured peak levels underestimates the maximum peak sound pressure level. The underestimate can be as large as 4 dB if the number of measurements is limited to 3-4, which is common practice in occupational noise assessment. The expanded uncertainty is also quite large (~2.5 dB), with a weak dependence on the sampling details. Following the procedure outlined in this article, a reliable comparison between the peak sound pressure levels measured in a workplace and the EU directive action limits is possible. Non-compliance can occur even when the largest of the set of measured peak levels is several dB below such limits. © The Author 2015. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.
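The inference step can be sketched as follows. All numbers (population size, Gumbel parameters, a sample of four measured peaks) are invented for illustration; the point is only that the largest measured peak sits below the estimated population maximum.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Hypothetical population of N peak sound pressure levels (dB), assumed to
# follow a Gumbel (extreme value type I) distribution.
mu, beta, N = 135.0, 1.5, 250
population = stats.gumbel_r.rvs(mu, beta, size=N, random_state=rng)

# In practice only a handful of peaks are measured (3-4 is common practice).
sample = rng.choice(population, size=4, replace=False)

# Fit a Gumbel distribution to the measured peaks...
mu_hat, beta_hat = stats.gumbel_r.fit(sample)

# ...then estimate the population maximum: the max of N iid Gumbel(mu, beta)
# variables is Gumbel(mu + beta*ln(N), beta), with expected value
# mu + beta*(ln(N) + Euler-Mascheroni constant).
euler_gamma = 0.5772156649
max_hat = mu_hat + beta_hat * (np.log(N) + euler_gamma)
```

With so few measurements the fitted scale is very uncertain, which is why the article pairs the point estimate with an uncertainty analysis.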
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Ki-Myeong; Weinberg, Erick J.; Physics Department, Columbia University, New York, New York 10027
2009-01-15
We explore the characteristics of spherical bags made of large numbers of BPS magnetic monopoles. There are two extreme limits. In the Abelian bag, N zeros of the Higgs field are arranged in a quasiregular lattice on a sphere of radius R_cr ≈ N/v, where v is the Higgs vacuum expectation value. The massive gauge fields of the theory are largely confined to a thin shell at this radius that separates an interior with almost vanishing magnetic and Higgs fields from an exterior region with long-range Coulomb magnetic and Higgs fields. In the other limiting case, which we term a non-Abelian bag, the N zeros of the Higgs field are all at the origin, but there is again a thin shell of radius R_cr. In this case the region enclosed by this shell can be viewed as a large monopole core, with small Higgs field but nontrivial massive and massless gauge fields.
ERIC Educational Resources Information Center
Kinnier, Richard T.
1984-01-01
Examined the resolution of value conflicts in 60 adults who wrote a solution to their conflicts. Compared extreme resolutions with those representing compromise. Compromisers and extremists did not differ in how rationally resolved they were about their solutions but compromisers felt better about their solutions. (JAC)
Lux in obscuro II: photon orbits of extremal AdS black holes revisited
NASA Astrophysics Data System (ADS)
Tang, Zi-Yu; Ong, Yen Chin; Wang, Bin
2017-12-01
A large class of spherically symmetric static extremal black hole spacetimes possesses a stable null photon sphere on their horizons. For the extremal Kerr-Newman family, the photon sphere only really coincides with the horizon in the sense clarified by Doran. The condition under which a photon orbit is stable on an asymptotically flat extremal Kerr-Newman black hole horizon has recently been clarified; it is found that a sufficiently large angular momentum destabilizes the photon orbit, whereas an electrical charge tends to stabilize it. We investigate the effect of a negative cosmological constant on this observation and find the same behavior in the case of extremal asymptotically AdS Kerr-Newman black holes in (3+1) dimensions. In (2+1) dimensions, in the presence of an electrical charge, the angular momentum never becomes large enough to destabilize the photon orbit. We comment on the instabilities of black hole spacetimes with a stable photon orbit.
Calvente, Irene; Dávila-Arias, Cristina; Ocón-Hernández, Olga; Pérez-Lobato, Rocío; Ramos, Rosa; Artacho-Cordón, Francisco; Olea, Nicolás; Núñez, María Isabel; Fernández, Mariana F.
2014-01-01
Objective To characterize the exposure to electric and magnetic fields of non-ionizing radiation in the electromagnetic spectrum (15 Hz to 100 kHz) in the dwellings of children from the Spanish Environment and Childhood ("INMA") population-based birth cohort. Methodology The study sample was drawn from the INMA-Granada cohort. Out of 300 boys participating in the 9-10 year follow-up, 123 families agreed to the exposure assessment at home and completed a specific ad hoc questionnaire gathering information on sources of non-ionizing radiation electric and magnetic fields inside the homes and on patterns of use. Long-term indoor measurements were carried out in the living room and bedroom. Results Survey data showed a low exposure in the children's homes according to the reference levels of the International Commission on Non-Ionizing Radiation Protection, but with large differences among homes in mean and maximum values. Daytime electrostatic and magnetic fields were below the quantification limit in 78.6% (92 dwellings) and 92.3% (108 dwellings) of houses, with arithmetic mean values (± standard deviation) of 7.31±9.32 V/m and 162.30±91.16 nT, respectively. Mean magnetic field values were a factor of 1.6 lower during the night than during the day. Nocturnal electrostatic values were not measured. Exposure levels were influenced by the area of residence (higher values in urban/semi-urban versus rural areas), type of dwelling, age of dwelling, floor of the dwelling, and season. Conclusion Given the greater sensitivity of children to extremely low-frequency electromagnetic fields and following the precautionary principle, preventive measures are warranted to reduce their exposure. PMID:25192253
NASA Technical Reports Server (NTRS)
Grotjahn, Richard; Black, Robert; Leung, Ruby; Wehner, Michael F.; Barlow, Mathew; Bosilovich, Michael G.; Gershunov, Alexander; Gutowski, William J., Jr.; Gyakum, John R.; Katz, Richard W.;
2015-01-01
The objective of this paper is to review statistical methods, dynamics, modeling efforts, and trends related to temperature extremes, with a focus upon extreme events of short duration that affect parts of North America. These events are associated with large scale meteorological patterns (LSMPs). The statistics, dynamics, and modeling sections of this paper are written to be autonomous and so can be read separately. Methods to define extreme event statistics and to identify and connect LSMPs to extreme temperature events are presented. Recent advances in statistical techniques connect LSMPs to extreme temperatures through appropriately defined covariates that supplement more straightforward analyses. Various LSMPs, ranging from synoptic to planetary scale structures, are associated with extreme temperature events. Current knowledge about the synoptics and the dynamical mechanisms leading to the associated LSMPs is incomplete. Systematic studies of the physics of LSMP life cycles, comprehensive model assessment of LSMP-extreme temperature event linkages, and LSMP properties are needed. Generally, climate models capture observed properties of heat waves and cold air outbreaks with some fidelity. However, they overestimate warm wave frequency, underestimate cold air outbreak frequency, and underestimate the collective influence of low-frequency modes on temperature extremes. Modeling studies have identified the impact of large-scale circulation anomalies and land-atmosphere interactions on changes in extreme temperatures. However, few studies have examined changes in LSMPs to more specifically understand the role of LSMPs in past and future extreme temperature changes. Even though LSMPs are resolvable by global and regional climate models, they are not necessarily well simulated. The paper concludes with unresolved issues and research questions.
NASA Astrophysics Data System (ADS)
Welch, R. M.; Sengupta, S. K.; Kuo, K. S.
1988-04-01
Statistical measures of the spatial distributions of gray levels (cloud reflectivities) are determined for LANDSAT Multispectral Scanner digital data. Textural properties for twelve stratocumulus cloud fields, seven cumulus fields, and two cirrus fields are examined using the Spatial Gray Level Co-Occurrence Matrix method. The co-occurrence statistics are computed for pixel separations ranging from 57 m to 29 km and at angles of 0°, 45°, 90° and 135°. Nine different textural measures are used to define the cloud field spatial relationships; however, the measures of contrast and correlation appear to be most useful in distinguishing cloud structure. Cloud field macrotexture describes general cloud field characteristics at distances greater than the size of typical cloud elements. It is determined from the spatial asymptotic values of the texture measures. The slope of the texture curves at small distances provides a measure of the microtexture of individual cloud cells. Cloud fields composed primarily of small cells have very steep slopes and reach their asymptotic values at short distances from the origin. As the cells composing the cloud field grow larger, the slope becomes more gradual and the asymptotic distance increases accordingly. Low asymptotic values of correlation show that stratocumulus cloud fields have no large-scale organized structure. Besides the ability to distinguish cloud field structure, texture appears to be a potentially valuable tool in cloud classification. Stratocumulus clouds are characterized by low values of angular second moment and large values of entropy. Cirrus clouds appear to have extremely low values of contrast, low values of entropy, and very large values of correlation. Finally, we propose that sampled high spatial resolution satellite data be used in conjunction with coarser resolution operational satellite data to detect and identify cloud field structure and directionality and to locate regions of subresolution scale cloud contamination.
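The contrast and correlation measures derived from the co-occurrence matrix can be computed directly. The example below uses a small synthetic 8-level gradient image rather than LANDSAT data, so the specific values are purely illustrative.

```python
import numpy as np

def glcm(img, dx, dy, levels):
    """Normalized gray-level co-occurrence matrix for pixel offset (dx, dy)."""
    m = np.zeros((levels, levels))
    h, w = img.shape
    for y in range(max(0, -dy), min(h, h - dy)):
        for x in range(max(0, -dx), min(w, w - dx)):
            m[img[y, x], img[y + dy, x + dx]] += 1
    return m / m.sum()

def contrast(p):
    i, j = np.indices(p.shape)
    return ((i - j) ** 2 * p).sum()

def correlation(p):
    i, j = np.indices(p.shape)
    mi, mj = (i * p).sum(), (j * p).sum()
    si = np.sqrt(((i - mi) ** 2 * p).sum())
    sj = np.sqrt(((j - mj) ** 2 * p).sum())
    return ((i - mi) * (j - mj) * p).sum() / (si * sj)

# Toy "cloud field": a smooth gradient quantized to 8 gray levels.
img = np.linspace(0, 7.999, 64).astype(int)[None, :] * np.ones((64, 1), int)
p = glcm(img, dx=1, dy=0, levels=8)
```

For this smooth gradient, nearby pixels are similar, so contrast is low and correlation is close to one at a one-pixel separation; a field of small cells would instead show contrast rising steeply with pixel separation.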
Modeling extreme PM10 concentration in Malaysia using generalized extreme value distribution
NASA Astrophysics Data System (ADS)
Hasan, Husna; Mansor, Nadiah; Salleh, Nur Hanim Mohd
2015-05-01
Extreme PM10 concentrations from the Air Pollutant Index (API) at thirteen monitoring stations in Malaysia are modeled using the Generalized Extreme Value (GEV) distribution. The data are blocked into monthly periods. The Mann-Kendall (MK) test suggests non-stationarity, so two models are considered for the stations with a trend. The likelihood ratio test is used to determine the best-fitting model; the result shows that only two stations favor the non-stationary model (Model 2), while the other eleven stations favor the stationary model (Model 1). The return level of PM10 concentration expected to be exceeded once within a selected period is obtained.
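A stationary GEV fit of the kind described as Model 1 can be sketched with SciPy. The PM10 series below is simulated, and the block size and return period are assumptions; note SciPy's sign convention for the shape parameter.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# Simulated daily PM10-like concentrations over ~13 years, blocked monthly.
daily = rng.lognormal(mean=3.5, sigma=0.4, size=30 * 12 * 13)
monthly_max = daily.reshape(-1, 30).max(axis=1)

# Stationary GEV fit to the block maxima. SciPy's genextreme shape c equals
# minus the shape parameter xi in the usual GEV convention.
c, loc, scale = stats.genextreme.fit(monthly_max)

# Return level exceeded on average once every m blocks (here ~10 years).
m = 120
return_level = stats.genextreme.ppf(1 - 1 / m, c, loc, scale)
```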
NASA Astrophysics Data System (ADS)
Bates, P. D.; Quinn, N.; Sampson, C. C.; Smith, A.; Wing, O.; Neal, J. C.
2017-12-01
Remotely sensed data have transformed the field of large scale hydraulic modelling. New digital elevation, hydrography and river width data have allowed such models to be created for the first time, and remotely sensed observations of water height, slope and water extent have allowed them to be calibrated and tested. As a result, we are now able to conduct flood risk analyses at national, continental or even global scales. However, continental scale analyses have significant additional complexity compared to typical flood risk modelling approaches. Traditional flood risk assessment uses frequency curves to define the magnitude of extreme flows at gauging stations. The flow values for given design events, such as the 1 in 100 year return period flow, are then used to drive hydraulic models in order to produce maps of flood hazard. Such an approach works well for single gauge locations and local models because over relatively short river reaches (say 10-60 km) one can assume that the return period of an event does not vary. At regional to national scales and across multiple river catchments this assumption breaks down, and for a given flood event the return period will be different at different gauging stations, a pattern known as the event `footprint'. Despite this, many national scale risk analyses still use `constant in space' return period hazard layers (e.g. the FEMA Special Flood Hazard Areas) in their calculations. Such an approach can estimate potential exposure, but will over-estimate risk and cannot determine likely flood losses over a whole region or country. We address this problem by using a stochastic model to simulate many realistic extreme event footprints based on observed gauged flows and the statistics of gauge-to-gauge correlations.
We take the entire USGS gauge data catalogue for sites with > 45 years of record and use a conditional approach for multivariate extreme values to generate sets of flood events with realistic return period variation in space. We undertake a number of quality checks of the stochastic model and compare real and simulated footprints to show that the method is able to re-create realistic patterns even at continental scales where there is large variation in flood generating mechanisms. We then show how these patterns can be used to drive a large scale 2D hydraulic model to predict regional scale flooding.
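The footprint idea, that a single event carries different return periods at different gauges, can be illustrated with a toy simulation. A Gaussian copula with an invented correlation stands in for the conditional multivariate extremes model actually used, and the gauge count is arbitrary.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n_gauges, n_events, rho = 5, 10_000, 0.6

# Correlated annual-maximum ranks across gauges via a Gaussian copula.
cov = rho * np.ones((n_gauges, n_gauges)) + (1 - rho) * np.eye(n_gauges)
z = rng.multivariate_normal(np.zeros(n_gauges), cov, size=n_events)
u = stats.norm.cdf(z)          # correlated uniforms, one column per gauge
rp = 1.0 / (1.0 - u)           # per-gauge return period of each event

# Within one event the return period varies across gauges: a 1-in-100-year
# flow at one gauge is usually rarer or more common at its neighbours.
footprint_spread = rp.max(axis=1) / rp.min(axis=1)
```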
Quantifying the consequences of changing hydroclimatic extremes on protection levels for the Rhine
NASA Astrophysics Data System (ADS)
Sperna Weiland, Frederiek; Hegnauer, Mark; Buiteveld, Hendrik; Lammersen, Rita; van den Boogaard, Henk; Beersma, Jules
2017-04-01
The Dutch method for quantifying the magnitude and frequency of occurrence of discharge extremes in the Rhine basin, and the potential influence of climate change hereon, is presented. In the Netherlands, flood protection design requires estimates of discharge extremes for return periods of 1,000 up to 100,000 years. Observed discharge records are too short to derive such extreme return discharges; therefore extreme value assessment is based on very long synthetic discharge time-series generated with the Generator of Rainfall And Discharge Extremes (GRADE). The GRADE instrument consists of (1) a stochastic weather generator based on time-series resampling of historical rainfall and temperature, (2) a hydrological model optimized following the GLUE methodology, and (3) a hydrodynamic model to simulate the propagation of flood waves based on the generated hydrological time-series. To assess the potential influence of climate change, the four KNMI'14 climate scenarios are applied. These four scenarios represent a large part of the uncertainty spanned by the GCMs used for the IPCC 5th assessment report (the CMIP5 GCM simulations under different climate forcings) and are for this purpose tailored to the Rhine and Meuse river basins. To derive the probability distributions of extreme discharges under climate change, the historical synthetic rainfall and temperature series simulated with the weather generator are transformed to the future following the KNMI'14 scenarios. For this transformation the Advanced Delta Change method, which allows the changes in the extremes to differ from those in the means, is used. Subsequently the hydrological model is forced with the historical and future (i.e. transformed) synthetic time-series, after which the propagation of the flood waves is simulated with the hydrodynamic model to obtain the extreme discharge statistics for both current and future climate conditions.
The study shows that both for 2050 and 2085 increases in discharge extremes for the river Rhine at Lobith are projected by all four KNMI'14 climate scenarios. This poses increased requirements for flood protection design in order to prepare for changing climate conditions.
Polygenic determinants in extremes of high-density lipoprotein cholesterol
Dron, Jacqueline S.; Wang, Jian; Low-Kam, Cécile; Khetarpal, Sumeet A.; Robinson, John F.; McIntyre, Adam D.; Ban, Matthew R.; Cao, Henian; Rhainds, David; Dubé, Marie-Pierre; Rader, Daniel J.; Lettre, Guillaume; Tardif, Jean-Claude
2017-01-01
HDL cholesterol (HDL-C) remains a superior biochemical predictor of CVD risk, but its genetic basis is incompletely defined. In patients with extreme HDL-C concentrations, we concurrently evaluated the contributions of multiple large- and small-effect genetic variants. In a discovery cohort of 255 unrelated lipid clinic patients with extreme HDL-C levels, we used a targeted next-generation sequencing panel to evaluate rare variants in known HDL metabolism genes, simultaneously with common variants bundled into a polygenic trait score. Two additional cohorts were used for validation and included 1,746 individuals from the Montréal Heart Institute Biobank and 1,048 individuals from the University of Pennsylvania. Findings were consistent between cohorts: we found rare heterozygous large-effect variants in 18.7% and 10.9% of low- and high-HDL-C patients, respectively. We also found common variant accumulation, indicated by extreme polygenic trait scores, in an additional 12.8% and 19.3% of overall cases of low- and high-HDL-C extremes, respectively. Thus, the genetic basis of extreme HDL-C concentrations encountered clinically is frequently polygenic, with contributions from both rare large-effect and common small-effect variants. Multiple types of genetic variants should be considered as contributing factors in patients with extreme dyslipidemia. PMID:28870971
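The notion of a polygenic trait score, a weighted sum of common-variant allele counts whose tails define "extreme" scores, can be sketched as follows. Genotypes, effect weights, SNP count and the 10th/90th-percentile cutoffs are all simulated assumptions, not the study's panel.

```python
import numpy as np

rng = np.random.default_rng(5)
n_people, n_snps = 1000, 30

# 0/1/2 copies of the trait-raising allele at each common SNP, plus
# per-SNP effect sizes (both simulated).
genotypes = rng.integers(0, 3, size=(n_people, n_snps))
weights = rng.normal(0.0, 0.1, size=n_snps)

# Polygenic trait score = weighted allele count; "extreme" = distribution tails.
score = genotypes @ weights
lo, hi = np.percentile(score, [10, 90])
extreme = (score < lo) | (score > hi)
```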
Stress Transfer Mechanisms at the Submicron Level for Graphene/Polymer Systems
2015-01-01
The stress transfer mechanism from a polymer substrate to a nanoinclusion, such as a graphene flake, is of extreme interest for the production of effective nanocomposites. Previous work conducted mainly at the micron scale has shown that the intrinsic mechanism of stress transfer is shear at the interface. However, since the interfacial shear takes its maximum value at the very edge of the nanoinclusion it is of extreme interest to assess the effect of edge integrity upon axial stress transfer at the submicron scale. Here, we conduct a detailed Raman line mapping near the edges of a monolayer graphene flake that is simply supported onto an epoxy-based photoresist (SU8)/poly(methyl methacrylate) matrix at steps as small as 100 nm. We show for the first time that the distribution of axial strain (stress) along the flake deviates somewhat from the classical shear-lag prediction for a region of ∼2 μm from the edge. This behavior is mainly attributed to the presence of residual stresses, unintentional doping, and/or edge effects (deviation from the equilibrium values of bond lengths and angles, as well as different edge chiralities). By considering a simple balance of shear-to-normal stresses at the interface we are able to directly convert the strain (stress) gradient to values of interfacial shear stress for all the applied tensile levels without assuming classical shear-lag behavior. For large flakes a maximum value of interfacial shear stress of 0.4 MPa is obtained prior to flake slipping. PMID:25644121
More tornadoes in the most extreme U.S. tornado outbreaks.
Tippett, Michael K; Lepore, Chiara; Cohen, Joel E
2016-12-16
Tornadoes and severe thunderstorms kill people and damage property every year. Estimated U.S. insured losses due to severe thunderstorms in the first half of 2016 were $8.5 billion (US). The largest U.S. effects of tornadoes result from tornado outbreaks, which are sequences of tornadoes that occur in close succession. Here, using extreme value analysis, we find that the frequency of U.S. outbreaks with many tornadoes is increasing and that it is increasing faster for more extreme outbreaks. We model this behavior by extreme value distributions with parameters that are linear functions of time or of some indicators of multidecadal climatic variability. Extreme meteorological environments associated with severe thunderstorms show consistent upward trends, but the trends do not resemble those currently expected to result from global warming. Copyright © 2016, American Association for the Advancement of Science.
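Fitting an extreme value distribution whose parameters are linear functions of time, as described, can be sketched by direct likelihood maximization. A Gumbel law with a time-varying location is used here for simplicity, on simulated yearly maxima rather than the tornado outbreak record.

```python
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(11)
# Simulated yearly maxima with a linearly trending location parameter.
t = np.arange(40.0)
y = stats.gumbel_r.rvs(loc=10 + 0.2 * t, scale=3.0, size=40, random_state=rng)

def nll(params):
    """Negative log-likelihood of a Gumbel with location a + b*t."""
    a, b, log_s = params
    return -stats.gumbel_r.logpdf(y, loc=a + b * t, scale=np.exp(log_s)).sum()

res = optimize.minimize(nll, x0=[y.mean(), 0.0, np.log(y.std())])
a_hat, b_hat, s_hat = res.x[0], res.x[1], np.exp(res.x[2])
```

A likelihood-ratio test against the stationary fit (b fixed at 0) then indicates whether the trend term is warranted, which is the kind of comparison the abstract describes.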
Scale dependency of regional climate modeling of current and future climate extremes in Germany
NASA Astrophysics Data System (ADS)
Tölle, Merja H.; Schefczyk, Lukas; Gutjahr, Oliver
2017-11-01
A warmer climate is projected for mid-Europe, with less precipitation in summer but with intensified extremes of precipitation and near-surface temperature. However, the extent and magnitude of such changes are associated with considerable uncertainty because of the limitations of model resolution and parameterizations. Here, we present the results of convection-permitting regional climate model simulations for Germany with the COSMO-CLM using a horizontal grid spacing of 1.3 km, and additional 4.5- and 7-km simulations with convection parameterized. Of particular interest is how the temperature and precipitation fields and their extremes depend on the horizontal resolution for current and future climate conditions. The spatial variability of precipitation increases with resolution because of more realistic orography and physical parameterizations, but values are overestimated in summer and over mountain ridges in all simulations compared to observations. The spatial variability of temperature is improved at a resolution of 1.3 km, but the results are cold-biased, especially in summer. The increase in resolution from 7/4.5 km to 1.3 km is accompanied by about 1 °C less future warming in summer. Modeled future precipitation extremes will be more severe, and temperature extremes will not exclusively increase with higher resolution. Although the differences between the resolutions considered (7/4.5 km and 1.3 km) are small, we find that the differences in the changes in extremes are large. High-resolution simulations require further studies, with effective parameterizations and tunings for different topographic regions. Impact models and assessment studies may benefit from such high-resolution model results, but should account for the impact of model resolution on model processes and climate change.
Sea Extremes: Integrated impact assessment in coastal climate adaptation
NASA Astrophysics Data System (ADS)
Sorensen, Carlo; Knudsen, Per; Broge, Niels; Molgaard, Mads; Andersen, Ole
2016-04-01
We investigate effects of sea level rise and a change in precipitation pattern on coastal flooding hazards. Historic and present in situ and satellite data of water and groundwater levels, precipitation, vertical ground motion, geology, and geotechnical soil properties are combined with flood protection measures, topography, and infrastructure to provide a more complete picture of the water-related impact of climate change at an exposed coastal location. Results show that future sea extremes evaluated from extreme value statistics may indeed have a large impact. However, the integrated effects of future storm surges and other geo- and hydro-parameters need to be considered in order to provide for the best protection and mitigation efforts. Based on the results we present and discuss a simple conceptual model setup that can be used, for example, for 'translation' of regional sea level rise evidence and projections into concrete impact measures. This may be used by potentially affected stakeholders, who often work in different sectors and across levels of governance, in a common appraisal of the challenges ahead. The model may also enter dynamic tools to evaluate local impact as sea level research advances and projections for the future are updated.
Ramkumar, Prem N; Muschler, George F; Spindler, Kurt P; Harris, Joshua D; McCulloch, Patrick C; Mont, Michael A
2017-04-01
The recent private-public partnership to unlock and utilize all available health data has large-scale implications for public health and personalized medicine, especially within orthopedics. Today, consumer-based technologies such as smartphones and "wearables" store tremendous amounts of personal health data (known as "mHealth") that, when processed and contextualized, have the potential to open new windows of insight for orthopedic surgeons about their patients. In the present report, the landscape, role, and future technical considerations of mHealth and open architecture are defined, with particular examples in lower extremity arthroplasty. A limitation of the current mHealth landscape is the fragmentation and lack of interconnectivity among the myriad of available apps. The importance of the currently lacking open mHealth architecture is underscored by the promise of improved research, increased workflow efficiency, and value capture for the orthopedic surgeon. There exists an opportunity to leverage existing mobile health data for orthopedic surgeons, particularly those specializing in lower extremity arthroplasty, by transforming patient small data into insightful big data through the implementation of "open" architecture that affords universal data standards and a global interconnected network. Copyright © 2016 Elsevier Inc. All rights reserved.
Grotjahn, Richard; Black, Robert; Leung, Ruby; ...
2015-05-22
This paper reviews research approaches and open questions regarding data, statistical analyses, dynamics, modeling efforts, and trends in relation to temperature extremes. Our specific focus is upon extreme events of short duration (roughly less than 5 days) that affect parts of North America. These events are associated with large scale meteorological patterns (LSMPs). Methods used to define extreme events statistics and to identify and connect LSMPs to extreme temperatures are presented. Recent advances in statistical techniques can connect LSMPs to extreme temperatures through appropriately defined covariates that supplements more straightforward analyses. A wide array of LSMPs, ranging from synoptic tomore » planetary scale phenomena, have been implicated as contributors to extreme temperature events. Current knowledge about the physical nature of these contributions and the dynamical mechanisms leading to the implicated LSMPs is incomplete. There is a pressing need for (a) systematic study of the physics of LSMPs life cycles and (b) comprehensive model assessment of LSMP-extreme temperature event linkages and LSMP behavior. Generally, climate models capture the observed heat waves and cold air outbreaks with some fidelity. However they overestimate warm wave frequency and underestimate cold air outbreaks frequency, and underestimate the collective influence of low-frequency modes on temperature extremes. Climate models have been used to investigate past changes and project future trends in extreme temperatures. Overall, modeling studies have identified important mechanisms such as the effects of large-scale circulation anomalies and land-atmosphere interactions on changes in extreme temperatures. However, few studies have examined changes in LSMPs more specifically to understand the role of LSMPs on past and future extreme temperature changes. 
Even though LSMPs are resolvable by global and regional climate models, they are not necessarily well simulated, so more research is needed to understand the limitations of climate models and to improve model skill in simulating extreme temperatures and their associated LSMPs. The paper concludes with unresolved issues and research questions.
NASA Astrophysics Data System (ADS)
Burnik Šturm, Martina; Ganbaatar, Oyunsaikhan; Voigt, Christian C.; Kaczensky, Petra
2017-04-01
Hydrogen (δ2H) and oxygen (δ18O) isotope values of water are widely used to track the global hydrological cycle, and the global δ2H and δ18O patterns of precipitation are increasingly used in studies on animal migration, forensics, and food authentication and traceability. However, δ2H and δ18O values of precipitation spanning one or more years are available for only a few hundred locations worldwide, and for many remote areas such as Mongolia data are still scarce. We obtained the first field-based δ2H and δ18O isotope data of event-based precipitation, rivers and other water bodies in the extreme environment of the Dzungarian Gobi desert in SW Mongolia, covering a period of 16 months (1). Our study area is located over 450 km north-east of the nearest IAEA GNIP station (Fukang station, China), from which it is separated by a mountain range at the international border between China and Mongolia. Isotope values of the collected event-based precipitation showed an extreme range and a high seasonal variability, with higher and more variable values in summer and lower values in winter. The high variability could not be explained by the different origins of air masses alone (i.e. NW polar winds over Russia or westerlies over Central Asia; analyzed using the HYSPLIT back-trajectory model), but is likely the result of a combination of processes affecting the isotope values of precipitation in this area. The calculated field-based local meteoric water line (LMWL, δ2H=(7.42±0.16)δ18O-(23.87±3.27)) showed isotopic characteristics of precipitation in an arid region. We observed a slight discrepancy between the field-based and modelled (Online Isotope in Precipitation Calculator, OIPC) LMWLs, which highlights the difficulty of modelling δ2H and δ18O values for areas with extreme climatic conditions and emphasizes the importance of collecting long-term field-based data.
The collected isotopic data of precipitation and other water bodies provide a basis for future studies in this largely understudied region. (1)Burnik Šturm M., Ganbaatar O., Voigt C.C., Kaczensky P. (2016) First field-based observations of δ2H and δ18O values of precipitation, rivers and other water bodies in the Dzungarian Gobi, SW Mongolia. Isotopes in Environmental and Health Studies, doi: 10.1080/10256016.2016.1231184
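A field-based LMWL such as the one reported above is an ordinary least-squares fit of δ2H on δ18O. A minimal sketch with made-up isotope values (not the Gobi data; the slope and intercept here are illustrative):

```python
import numpy as np

# Hypothetical paired isotope measurements of precipitation samples (per mil).
d18O = np.array([-25.0, -18.0, -12.0, -6.0, -2.0, 1.0])
d2H = 7.4 * d18O - 24.0 + np.array([0.5, -1.0, 2.0, -0.5, 1.0, -1.5])

# Local meteoric water line: delta2H = slope * delta18O + intercept,
# with standard errors taken from the least-squares covariance matrix.
(slope, intercept), cov = np.polyfit(d18O, d2H, deg=1, cov=True)
slope_se, intercept_se = np.sqrt(np.diag(cov))

# A slope below the global meteoric water line (~8) is the signature of
# evaporative enrichment typical of arid regions.
assert slope < 8.0
```

The comparison of such a fitted slope against the global meteoric water line slope of ~8 is what identifies the arid-region character noted in the abstract.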
Probabilistic forecasting of extreme weather events based on extreme value theory
NASA Astrophysics Data System (ADS)
Van De Vyver, Hans; Van Schaeybroeck, Bert
2016-04-01
Extreme events in weather and climate, such as high wind gusts, heavy precipitation or extreme temperatures, are commonly associated with high impacts on both environment and society. Forecasting extreme weather events is difficult, and very high-resolution models are needed to describe extreme weather phenomena explicitly. A prediction system for such events should therefore preferably be probabilistic in nature. Probabilistic forecasts and state estimations are nowadays common in the numerical weather prediction community. In this work, we develop a new probabilistic framework based on extreme value theory that aims to provide early warnings up to several days in advance. We consider pairs (X,Y) of extreme events, where X represents a deterministic forecast and Y the observation variable (for instance wind speed). More specifically, two problems are addressed: (1) Given a high forecast X = x_0, what is the probability that Y > y? In other words, provide inference on the conditional probability Pr{Y > y | X = x_0}. (2) Given a probabilistic model for Problem 1, what is the impact on the verification analysis of extreme events? These problems can be solved with bivariate extremes (Coles, 2001) and the verification analysis of Ferro (2007). We apply the Ramos and Ledford (2009) parametric model for bivariate tail estimation of the pair (X,Y). The model accommodates different types of extremal dependence and asymmetry within a parsimonious representation. Results are presented using the ensemble reforecast system of the European Centre for Medium-Range Weather Forecasts (Hagedorn, 2008). References: Coles, S. (2001) An Introduction to Statistical Modeling of Extreme Values. Springer-Verlag. Ferro, C.A.T.
(2007) A probability model for verifying deterministic forecasts of extreme events. Wea. Forecasting 22, 1089-1100. Hagedorn, R. (2008) Using the ECMWF reforecast dataset to calibrate EPS forecasts. ECMWF Newsletter 117, 8-13. Ramos, A. and Ledford, A. (2009) A new class of models for bivariate joint tails. J. R. Statist. Soc. B 71, 219-241.
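The conditional probability at the heart of Problem 1 can be illustrated empirically before any parametric tail model is fitted. A toy sketch with synthetic forecast-observation pairs, conditioning on the forecast exceeding a high threshold rather than on an exact value X = x_0 (the paper instead fits the Ramos-Ledford bivariate tail model):

```python
import numpy as np

rng = np.random.default_rng(7)
# Synthetic forecast/observation pairs with built-in dependence:
x = rng.gumbel(size=20000)                    # deterministic forecasts
y = 0.7 * x + 0.3 * rng.gumbel(size=20000)    # correlated observations

y_thr = np.quantile(y, 0.95)                  # high observation threshold
x_thr = np.quantile(x, 0.95)                  # high forecast threshold

# Empirical estimate of Pr{Y > y_thr | X > x_thr}: among occasions with an
# extreme forecast, the fraction that verified as an extreme observation.
p_cond = (y[x > x_thr] > y_thr).mean()
p_uncond = (y > y_thr).mean()                 # 0.05 by construction

assert p_cond > p_uncond                      # extreme forecasts raise the odds
```

The gap between `p_cond` and `p_uncond` is exactly what a bivariate-tail model makes precise in the sparse-data regime where empirical counting breaks down.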
NASA Astrophysics Data System (ADS)
Walz, M. A.; Donat, M.; Leckebusch, G. C.
2017-12-01
As extreme wind speeds are responsible for large socio-economic losses in Europe, a skillful prediction would be of great benefit for disaster prevention as well as for the actuarial community. Here we evaluate patterns of large-scale atmospheric variability and the seasonal predictability of extreme wind speeds (e.g. >95th percentile) in the European domain in the dynamical seasonal forecast system ECMWF System 4, and compare it to the predictability based on a statistical prediction model. The dominant patterns of atmospheric variability show distinct differences between reanalysis and ECMWF System 4, with most patterns in System 4 extended downstream in comparison to ERA-Interim. The dissimilar manifestations of the patterns within the two models lead to substantially different drivers associated with the occurrence of extreme winds in the respective model. While ECMWF System 4 is shown to provide some predictive power over Scandinavia and the eastern Atlantic, only very few grid cells in the European domain have significant correlations for extreme wind speeds in System 4 compared to ERA-Interim. In contrast, a statistical model predicts extreme wind speeds during boreal winter in better agreement with the observations. Our results suggest that System 4 does not capture the potential predictability of extreme winds that exists in the real world and therefore fails to provide reliable seasonal predictions for lead months 2-4. This is likely related to the unrealistic representation of large-scale patterns of atmospheric variability. Hence our study points to potential improvements of dynamical prediction skill through better simulation of large-scale atmospheric dynamics.
Eight guidelines for developing a strategy for the '90s.
Kaufman, N
1994-03-20
Regardless of the outcome of federal reform initiatives, health care is undergoing structural change of unprecedented magnitude. Structural change occurs when there is a fundamental, sustainable change in the values and purchasing behavior of buyers. During such times, market leaders are extremely vulnerable to competitive threats due to internal bureaucratic barriers. Witness the U.S. computer and automobile industries. As Robert Lutz, president of Chrysler, points out, "Being large doesn't mean being safe. The large won't eat the small. The swift will eat the slow." During this dynamic period in health care, it is critical that strategy be on target. Periods of structural change are filled with numerous threats as well as opportunities. The following are eight guidelines for developing health care strategy during the structural changes of the '90s.
Visual Analysis among Novices: Training and Trend Lines as Graphic Aids
ERIC Educational Resources Information Center
Nelson, Peter M.; Van Norman, Ethan R.; Christ, Theodore J.
2017-01-01
The current study evaluated the degree to which novice visual analysts could discern trends in simulated time-series data across differing levels of variability and extreme values. Forty-five novice visual analysts were trained in general principles of visual analysis. One group received brief training on how to identify and omit extreme values.…
Implementing Extreme Value Analysis in a Geospatial Workflow for Storm Surge Hazard Assessment
NASA Astrophysics Data System (ADS)
Catelli, J.; Nong, S.
2014-12-01
Gridded data of 100-yr (1%) and 500-yr (0.2%) storm surge flood elevations for the United States Gulf of Mexico and East Coasts are critical to understanding this natural hazard. Storm surge heights were calculated across the study area utilizing SLOSH (Sea, Lake, and Overland Surges from Hurricanes) model data for thousands of synthetic US-landfalling hurricanes. Based on the results derived from SLOSH, a series of interpolations was performed using spatial analysis in a geographic information system (GIS) at both the SLOSH basin and the synthetic event levels. The result was a single grid of maximum flood elevations for each synthetic event. This project addresses the need to apply extreme value theory in a geospatial environment to analyze coincident cells across multiple synthetic events. The results are 100-yr (1%) and 500-yr (0.2%) values for each grid cell in the study area. This talk details a geospatial approach that moves raster data into SciPy's NumPy array structure using the Python programming language. The data are then passed through a Python library to an outside statistical package such as R to fit cell values to extreme value distributions and return values for specified recurrence intervals. While this is not a new process, the value of this work lies in keeping the entire analysis within a single geospatial environment, making it easy to replicate for other natural hazard applications and extreme event modeling.
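The per-cell fitting step can be sketched with SciPy directly, in place of the external R package the talk describes (illustrative only: the grid and surge values below are synthetic, and mapping event quantiles to true 100-yr/500-yr levels additionally requires the annual rate of synthetic events, which is omitted here):

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(42)
# Hypothetical stack of per-event maximum surge elevations (m):
# n_events synthetic storms over a small grid of cells.
n_events, ny, nx = 500, 4, 5
surge = rng.gumbel(loc=2.0, scale=0.5, size=(n_events, ny, nx))

def return_level(values, return_period):
    """Fit a GEV to one cell's event maxima and return the level
    exceeded on average once per `return_period` events."""
    shape, loc, scale = genextreme.fit(values)
    return genextreme.ppf(1.0 - 1.0 / return_period, shape, loc=loc, scale=scale)

# 100-yr (1%) and 500-yr (0.2%) grids, cell by cell.
rl100 = np.array([[return_level(surge[:, j, i], 100) for i in range(nx)]
                  for j in range(ny)])
rl500 = np.array([[return_level(surge[:, j, i], 500) for i in range(nx)]
                  for j in range(ny)])

assert (rl500 > rl100).all()  # longer return period => higher flood elevation
```

In the workflow described in the abstract, the `surge` array would come from the GIS raster stack, and the resulting grids would be written back as rasters.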
Liu, Yang; Zhang, Mingqing; Fang, Xiuqi
2018-03-20
By merging reconstructed phenological series from published articles with observations of the China Phenology Observation Network (CPON), the first blooming date of Amygdalus davidiana (FBA) in Beijing between 1741 and 2000 is reconstructed. The Butterworth method is used to remove multi-year variations, generating a phenological series of annual variations in the first blooming date of A. davidiana. The extreme delay years in the phenological series are identified using the percentage threshold method. The characteristics of the extreme delays and their correspondence with natural forcings are analysed. The main results are as follows. In the annual phenological series, the extreme delays mostly appeared in single years; only A.D. 1800-1801, 1816-1817 and 1983-1984 were events of two consecutive extreme years. Approximately 85% of the extreme delays occurred during the 1-2 years after large volcanic eruptions (VEI ≥ 4) on the eastern or western rim of the Pacific Ocean, and the same proportion of the extreme delays followed El Niño events. About 73% of the extreme delay years fall in the valleys of sunspot cycles or in the Dalton minimum period, in the year itself or the previous year. According to the certainty factor (CF), large eruptions have the greatest influence on the extreme delays, sunspot activity the second, and ENSO the last. An extreme phenological delay is most likely to occur after a large eruption, particularly when it falls in an El Niño year and the preceding several years lie in the descending portion or valley of the sunspot cycle.
Value-at-risk estimation with wavelet-based extreme value theory: Evidence from emerging markets
NASA Astrophysics Data System (ADS)
Cifter, Atilla
2011-06-01
This paper introduces wavelet-based extreme value theory (EVT) for univariate value-at-risk estimation. Wavelets and EVT are combined for volatility forecasting to estimate a hybrid model. In the first stage, wavelets are used to set the threshold of a generalized Pareto distribution, and in the second stage, EVT is applied with the wavelet-based threshold. This new model is applied to two major emerging stock markets: the Istanbul Stock Exchange (ISE) and the Budapest Stock Exchange (BUX). The relative performance of wavelet-based EVT is benchmarked against the RiskMetrics-EWMA, ARMA-GARCH, generalized Pareto distribution, and conditional generalized Pareto distribution models. The empirical results show that wavelet-based EVT increases the predictive performance of financial forecasting according to the number of violations and tail-loss tests. The superior forecasting performance of the wavelet-based EVT model is also consistent with Basel II requirements, and the new model can be used by financial institutions as well.
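The second-stage EVT step can be sketched as a standard peaks-over-threshold value-at-risk calculation. In this sketch the threshold is a simple empirical quantile standing in for the paper's wavelet-derived threshold, the returns are synthetic heavy-tailed data, and the VaR formula is the usual McNeil-Frey POT expression:

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)
returns = rng.standard_t(df=4, size=5000) * 0.01   # hypothetical daily returns
losses = -returns

u = np.quantile(losses, 0.95)        # threshold (wavelet-derived in the paper)
exc = losses[losses > u] - u         # exceedances over the threshold
xi, _, beta = genpareto.fit(exc, floc=0.0)

def var_pot(q):
    """Peaks-over-threshold value-at-risk at confidence level q."""
    n, n_u = losses.size, exc.size
    return u + (beta / xi) * (((n / n_u) * (1 - q)) ** (-xi) - 1)

assert var_pot(0.99) > u             # tail VaR lies beyond the threshold
```

Backtesting such a model "according to the number of violations" then amounts to counting days on which the realized loss exceeds `var_pot(q)` and comparing the count with its nominal rate 1 - q.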
PERSISTENCE MAPPING USING EUV SOLAR IMAGER DATA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thompson, B. J.; Young, C. A., E-mail: barbara.j.thompson@nasa.gov
We describe a simple image processing technique that is useful for the visualization and depiction of gradually evolving or intermittent structures in solar physics extreme-ultraviolet imagery. The technique is an application of image segmentation, which we call "Persistence Mapping," to isolate extreme values in a data set, and it is particularly useful for capturing phenomena that evolve in both space and time. While integration or "time-lapse" imaging uses the full sample (of size N), Persistence Mapping rejects (N - 1)/N of the data set and identifies the most relevant 1/N values using the following rule: if a pixel reaches an extreme value, it retains that value until that value is exceeded. The simplest examples isolate minima and maxima, but any quantile or statistic can be used. This paper demonstrates how the technique has been used to extract the dynamics in the long-term evolution of comet tails, erupting material, and EUV dimming regions.
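For maxima, the rule above (a pixel retains its extreme value until it is exceeded) reduces to a running maximum along the time axis. A minimal sketch with random frames standing in for an EUV image sequence:

```python
import numpy as np

rng = np.random.default_rng(1)
frames = rng.random((10, 64, 64))   # hypothetical image sequence (time, y, x)

# Persistence map for maxima: each pixel keeps the most extreme value it
# has reached so far, i.e. a running maximum over the time axis.
persist = np.maximum.accumulate(frames, axis=0)

# The final persistence frame is the pixelwise maximum of the sequence.
assert np.array_equal(persist[-1], frames.max(axis=0))
# For minima (e.g. EUV dimming regions), use np.minimum.accumulate instead.
```

Intermediate frames `persist[t]` show when and where each extreme first appeared, which is what makes the technique useful for structures evolving in both space and time.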
The association between preceding drought occurrence and heat waves in the Mediterranean
NASA Astrophysics Data System (ADS)
Russo, Ana; Gouveia, Célia M.; Ramos, Alexandre M.; Páscoa, Patricia; Trigo, Ricardo M.
2017-04-01
A large number of weather-driven extreme events have occurred worldwide in the last decade, notably in Europe, which has been struck by record-breaking extreme events with unprecedented socio-economic impacts, including the mega-heatwaves of 2003 in Europe and 2010 in Russia and the large droughts in southwestern Europe in 2005 and 2012. The last IPCC report on extreme events points out that a changing climate can lead to changes in the frequency, intensity, spatial extent, duration, and timing of weather and climate extremes. These, combined with larger exposure, can result in unprecedented risk to humans and ecosystems. In this context it is becoming increasingly relevant to improve the early identification and predictability of such events, as they negatively affect several socio-economic activities. Moreover, recent diagnostic and modelling experiments have confirmed that hot extremes are often preceded by surface moisture deficits in some regions of the world. In this study we analyze whether the occurrence of hot extreme months is enhanced by preceding drought events throughout the Mediterranean area. To this end, the number of hot days in each region's hottest month is associated with a drought indicator. The evolution and characterization of drought were analyzed using both the Standardized Precipitation Evapotranspiration Index (SPEI) and the Standardized Precipitation Index (SPI), as obtained from the CRU TS3.23 database for the period 1950-2014. We used both SPI and SPEI at time scales between 3 and 9 months, with a spatial resolution of 0.5°. The number of hot days and nights per month (NHD and NHN) was determined using the ECAD-EOBS daily dataset for the same period and spatial resolution (dataset v14). The NHD and NHN were computed, respectively, as the number of days with a maximum or minimum temperature exceeding the 90th percentile.
Results show that the most frequent hottest months in the Mediterranean region occur in July and August. Moreover, the magnitudes of the correlations between detrended NHD/NHN and the preceding 6- and 9-month SPEI/SPI are usually weaker than for the 3-month time scale. Most regions exhibit significantly negative correlations, i.e. high (low) NHD/NHN following negative (positive) SPEI/SPI values, and thus a potential for NHD/NHN early warning. Finally, correlations of NHD/NHN with SPI and SPEI differ, with SPEI characterized by slightly higher values, observed mainly at the 3-month time scale. Acknowledgments: This work was partially supported by national funds through FCT (Fundação para a Ciência e a Tecnologia, Portugal) under project IMDROFLOOD (WaterJPI/0004/2014). Ana Russo thanks FCT for granted support (SFRH/BPD/99757/2014). A. M. Ramos was also supported by an FCT postdoctoral grant (FCT/DFRH/SFRH/BPD/84328/2012).
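The hot-day count (NHD) used in the study above is a simple percentile-exceedance count. A toy sketch for one grid cell and one month; note that in the study the 90th percentile is a climatological threshold derived from the full 1950-2014 E-OBS record, whereas here, for self-containment, it is taken from the same sample:

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical daily maximum temperatures for one grid cell, one month (deg C).
tmax = rng.normal(28.0, 3.0, size=31)

# Number of hot days: days whose maximum temperature exceeds the 90th
# percentile threshold (climatological in the study, in-sample here).
p90 = np.percentile(tmax, 90)
nhd = int((tmax > p90).sum())

assert 0 <= nhd <= 31
```

NHN is computed the same way from daily minimum temperatures; correlating the resulting monthly counts with a preceding SPEI/SPI value gives the early-warning signal discussed above.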
Pathology and genomics of pediatric melanoma: A critical reexamination and new insights.
Bahrami, Armita; Barnhill, Raymond L
2018-02-01
The clinicopathologic features of pediatric melanoma are distinct from those of the adult counterpart. For example, most childhood melanomas exhibit a uniquely favorable biologic behavior, save for those arising in large/giant congenital nevi. Recent studies suggest that the characteristically favorable biologic behavior of childhood melanoma may be related to extreme telomere shortening and dysfunction in the cancer cells. Herein, we review the genomic profiles that have been defined for the different subtypes of pediatric melanoma and particularly emphasize the potential prognostic value of telomerase reverse transcriptase alterations for these tumors. © 2017 Wiley Periodicals, Inc.
Mechanical Properties of Cu-Cr-Nb Alloys
NASA Technical Reports Server (NTRS)
Ellis, David L.
1997-01-01
The chemical compositions of the alloys are listed. The alloying levels were near the values for stoichiometric Cr2Nb. A slight excess of Cr was chosen for increased hydrogen embrittlement resistance. The microstructures of all Cu-Cr-Nb alloys were very similar. Two typical transmission electron microscope (TEM) micrographs are presented. The images show the presence of a large amount of Cr2Nb precipitates in a nearly pure Cu matrix. The interactions between dislocations and precipitates are currently under investigation, but as the images demonstrate, the extremely fine (less than 15 nm) Cr2Nb precipitates are the primary strengtheners of the alloy.
Universal energy distribution for interfaces in a random-field environment
NASA Astrophysics Data System (ADS)
Fedorenko, Andrei A.; Stepanow, Semjon
2003-11-01
We study the energy distribution function ρ(E) for interfaces in a random-field environment at zero temperature by summing the leading terms in the perturbation expansion of ρ(E) in powers of the disorder strength, and by taking into account the nonperturbational effects of the disorder using the functional renormalization group. We have found that the average and the variance of the energy for one-dimensional interface of length L behave as,
Microbial loop contribution to exergy in the sediments of the Marsala lagoon (Italy)
NASA Astrophysics Data System (ADS)
Pusceddu, A.; Danovaro, R.
2003-04-01
Recent advances in ecological modelling have stressed the need for new descriptors of ecosystem health, able to consider the actual transfer of energy through food webs, including the potential transfer/loss of (genetic) information. In ecological terms, exergy is defined as a goal function which, as the sum of the energy (biomass) and the (genetic) information contained in a given system due to living organisms, acts as a quality indicator of ecosystems. Biopolymeric organic carbon (BPC) quantity and biochemical composition, together with bacterial, heterotrophic nanoflagellate and meiofaunal abundance, biomass and exergy contents, were investigated on a seasonal basis in the Marsala lagoon (Mediterranean Sea), at two stations characterized by contrasting hydrodynamic conditions. Carbohydrate (2.8 mg g-1), protein (1.6 mg g-1) and lipid (0.86 mg g-1) contents were extremely high, with values at the more exposed station about 3 times lower than those at the sheltered one. BPC (on average 2.5 mg C g-1), dominated by carbohydrates (50%), was mostly refractory and largely unaccounted for by primary organic matter (4% of BPC), indicating that the Marsala lagoon sediments act as a "detritus sink". At both stations, bacterial (on average 0.3 mg C g-1) and heterotrophic nanoflagellate (9.8 μg C g-1) biomass values were rather high, whereas meiofauna biomass was extremely low (on average 7.2 μg C cm-2). The exergy transfer along the benthic microbial loop components in the Marsala lagoon appeared largely bottlenecked by the refractory composition of the organic detritus. At the more exposed station, the exergy transfer towards the higher trophic levels was more efficient than at the sheltered one. Although total exergy values were significantly higher in summer than in winter, at both stations the exergy transfer was more efficient in winter than in summer.
Our results indicate that, in 'detritus sink' systems, auxiliary energy (e.g., wind-induced sediment resuspension) might be of paramount importance for increasing efficiency of organic detritus channeling to higher trophic levels.
Benefits of an ultra large and multiresolution ensemble for estimating available wind power
NASA Astrophysics Data System (ADS)
Berndt, Jonas; Hoppe, Charlotte; Elbern, Hendrik
2016-04-01
In this study we investigate the benefits of an ultra-large ensemble with up to 1000 members, including multiple nesting with a target horizontal resolution of 1 km. The ensemble shall serve as a basis for detecting events of extreme error in wind power forecasting. The forecast value is the wind vector at wind turbine hub height (~100 m) in the short range (1 to 24 hours). Current wind power forecast systems already rest on NWP ensemble models. However, only calibrated ensembles from meteorological institutions have served as input so far, with limited spatial resolution (~10-80 km) and member number (~50), and perturbations tailored to the specific needs of wind power production are still missing. Thus, single extreme error events that are not detected by such ensemble power forecasts occur infrequently. The numerical forecast model used in this study is the Weather Research and Forecasting (WRF) model. Model uncertainties are represented by stochastic parametrization of sub-grid processes via stochastically perturbed parametrization tendencies, in conjunction with the complementary stochastic kinetic-energy backscatter scheme already provided by WRF. We perform continuous ensemble updates by comparing each ensemble member with available observations using a sequential importance resampling filter, improving model accuracy while maintaining ensemble spread. Additionally, we use different global ensemble systems (ECMWF and GFS) as input and boundary conditions to capture different synoptic conditions. Critical weather situations connected to extreme error events are located and corresponding perturbation techniques are applied. The demanding computational effort is met by utilising the supercomputer JUQUEEN at Forschungszentrum Jülich.
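The sequential importance resampling step mentioned above can be sketched in one dimension. This is a generic SIR illustration with a Gaussian observation likelihood and hypothetical numbers, not the WRF implementation:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical ensemble of 1000 hub-height wind speed forecasts (m/s).
members = rng.normal(10.0, 2.0, size=1000)
obs, obs_err = 12.0, 1.0            # one observation and its error std dev

# Importance weights: likelihood of the observation given each member.
w = np.exp(-0.5 * ((members - obs) / obs_err) ** 2)
w /= w.sum()

# Resample members with probability proportional to their weights.
idx = rng.choice(members.size, size=members.size, p=w)
resampled = members[idx]

# The resampled ensemble mean is pulled toward the observation,
# while duplicated members can later be re-perturbed to restore spread.
assert abs(resampled.mean() - obs) < abs(members.mean() - obs)
```

The trade-off noted in the abstract, improving accuracy while maintaining spread, shows up here directly: aggressive weighting collapses the ensemble onto a few members, which is why SIR implementations re-perturb after resampling.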
NASA Astrophysics Data System (ADS)
Krishnamurti, T. N.; Kumar, Vinay
2017-04-01
This study addresses numerical prediction of atmospheric wave trains that provide a monsoonal link to the Arctic ice melt. The monsoonal link is one of several ways that heat is conveyed to the Arctic region. This study follows a detailed observational study on thermodynamic wave trains that are initiated by extreme rain events of the northern-summer south Asian monsoon. These wave trains carry large heat content anomalies, heat transports and convergences of heat flux, which appear to be important candidates for the rapid-melt scenario. The present study addresses numerical simulation of the extreme rains over India and Pakistan, the generation of thermodynamic wave trains, and the simulation of large heat content anomalies, heat transports along pathways, heat flux convergences, potential vorticity and the diabatic generation of potential vorticity. We compare model-based simulations of features such as precipitation, divergence and the divergent wind with those evaluated from reanalysis fields. We have also examined snow and ice cover data sets during and after these events. This modeling study supports our recent observational findings on the monsoonal link to the rapid Arctic ice melt of the Canadian Arctic. It also suggests ways to interpret some recent episodes of rapid ice melt that may require a well-coordinated field experiment among atmosphere, ocean, ice and snow cover scientists. Such a coordinated study would sharpen our understanding of this one component of the ice melt, the monsoonal link, which appears to be fairly robust.
Ege, Tolga; Unlu, Aytekin; Tas, Huseyin; Bek, Dogan; Turkan, Selim; Cetinkaya, Aytac
2015-01-01
The decision between limb salvage and amputation is generally aided by trauma scoring systems such as the mangled extremity severity score (MESS). However, the reliability of such injury scores in the setting of open fractures due to explosives and missiles is challenging. Mortality and morbidity of extremity trauma due to firearms are generally associated with the time delay to revascularization, injury mechanism, anatomy of the injured site, associated injuries, age and the environmental circumstances. The purpose of this retrospective study was to evaluate the extent of extremity injuries due to ballistic missiles and to assess the reliability of the MESS in both upper and lower extremities. Between 2004 and 2014, 139 Gustilo-Anderson Type III open fractures of the upper and lower extremities were enrolled in the study. Data on patient age, firearm type, transport time from the field to the hospital (and the method), injury severity scores, MESS scores, fracture types, amputation levels, bone fixation methods, and postoperative infections and complications were retrieved from the two level-2 trauma centers' databases. Sensitivity, specificity, and positive and negative predictive values of the MESS were calculated to assess its ability to indicate amputation in the mangled limb. Amputation was performed in 39 extremities and limb salvage was attempted in 100 extremities. The mean followup time was 14.6 months (range 6-32 months). In the amputated group, the mean MESS scores for the upper and lower extremity were 8.8 (range 6-11) and 9.24 (range 6-11), respectively. In the limb salvage group, the mean MESS scores for the upper and lower extremities were 5.29 (range 4-7) and 5.19 (range 3-8), respectively. Sensitivity of the MESS in the upper and lower extremities was calculated as 80% and 79.4%, and positive predictive values were 55.55% and 83.3%, respectively.
Specificity of the MESS for the upper and lower extremities was 84% and 86.6%; negative predictive values were calculated as 95.45% and 90.2%, respectively. The MESS is not predictive in combat-related extremity injuries, especially for scores between 6 and 8. Limb ischemia and the presence or absence of shock can be used in the initial decision-making for amputation.
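The reported sensitivity, specificity, and predictive values follow from a standard 2x2 confusion matrix (score positive/negative vs. amputation performed/salvage). A sketch with illustrative counts, not the study's actual table:

```python
# Diagnostic test metrics from a 2x2 confusion matrix:
#   tp: score positive, amputated      fp: score positive, salvaged
#   fn: score negative, amputated      tn: score negative, salvaged
def diagnostic_metrics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),   # fraction of amputations flagged
        "specificity": tn / (tn + fp),   # fraction of salvages cleared
        "ppv": tp / (tp + fp),           # flagged limbs truly amputated
        "npv": tn / (tn + fn),           # cleared limbs truly salvaged
    }

# Illustrative counts only (chosen so sensitivity = 27/34 ~ 79.4%).
m = diagnostic_metrics(tp=27, fp=5, fn=7, tn=61)
assert abs(m["sensitivity"] - 27 / 34) < 1e-9
```

Note that PPV and NPV, unlike sensitivity and specificity, depend on the amputation rate in the cohort, which is one reason predictive values differ so much between the upper and lower extremity groups above.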
Statistical Modeling of Extreme Values and Evidence of Presence of Dragon King (DK) in Solar Wind
NASA Astrophysics Data System (ADS)
Gomes, T.; Ramos, F.; Rempel, E. L.; Silva, S.; C-L Chian, A.
2017-12-01
The solar wind constitutes a nonlinear dynamical system, presenting intermittent turbulence, multifractality and chaotic dynamics. One characteristic shared by many such complex systems is the presence of extreme events, which play an important role in several geophysical phenomena; their statistical characterization is a problem of great practical relevance. This work investigates the presence of extreme events in time series of the modulus of the interplanetary magnetic field measured by the Cluster spacecraft on February 2, 2002. One of the main results is that the solar wind near the Earth's bow shock can be modeled by the Generalized Pareto (GP) and Generalized Extreme Value (GEV) distributions. Both models present a statistically significant positive shape parameter, implying a heavy tail in the probability distribution functions and unbounded growth in return values as return periods lengthen. There is evidence that current sheets are mainly responsible for positive values of the shape parameter. It is also shown that magnetic reconnection at the interface between two interplanetary magnetic flux ropes in the solar wind can be considered a Dragon King (DK), a class of extreme events whose formation mechanisms are fundamentally different from those of other extremes. If magnetic reconnection can be classified as a Dragon King, there is the possibility of its identification and even its prediction. Dragon Kings had previously been identified in time series of financial crashes, nuclear power generation accidents, and stock markets, among others. They are believed to be associated with the occurrence of extreme events in dynamical systems at phase transitions, bifurcations, crises or tipping points.
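The core of the modeling step described above is a peaks-over-threshold fit of the Generalized Pareto distribution and a check of the sign of the shape parameter. A minimal sketch, using a synthetic heavy-tailed series as a stand-in for the Cluster |B| data (the threshold choice and return-level formula are standard POT conventions, not the paper's exact settings):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Synthetic heavy-tailed stand-in for |B|: Pareto draws with tail index 3.
series = rng.pareto(3.0, size=20000) + 1.0

threshold = np.quantile(series, 0.95)          # peaks-over-threshold level
excesses = series[series > threshold] - threshold

# Fit the GPD to the excesses (location fixed at 0, standard for POT).
shape, _, scale = stats.genpareto.fit(excesses, floc=0.0)

def return_level(m, rate=0.05):
    """Level exceeded on average once per m observations (rate = P(X > u))."""
    return threshold + (scale / shape) * ((m * rate) ** shape - 1.0)
```

A positive fitted `shape` is exactly the heavy-tail signature the abstract reports: return levels then grow without bound as the return period lengthens.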
NASA Astrophysics Data System (ADS)
Staehelin, J.; Rieder, H. E.; Maeder, J. A.; Ribatet, M.; Davison, A. C.; Stübi, R.
2009-04-01
Atmospheric ozone protects the biota living at the Earth's surface from harmful solar UV-B and UV-C radiation. The global ozone shield is expected to gradually recover from the anthropogenic disturbance of ozone-depleting substances (ODS) in the coming decades. The stratospheric ozone layer at the extratropics might increase significantly above the thickness of the chemically undisturbed atmosphere, which might enhance ozone concentrations at tropopause altitudes, where ozone is an important greenhouse gas. At Arosa, a resort village in the Swiss Alps, total ozone measurements started in 1926, yielding the world's longest total ozone series. A Féry spectrograph and seven Dobson spectrophotometers were operated at Arosa, and the method used to homogenize the series will be presented. Due to its unique length, the series allows studying total ozone in the chemically undisturbed as well as in the ODS-loaded stratosphere. The series is particularly valuable for studying natural variability in the period prior to 1970, when ODS started to affect stratospheric ozone. Concepts from extreme value statistics allow objective definitions of "ozone extreme high" and "ozone extreme low" values by fitting the (daily mean) time series using the Generalized Pareto Distribution (GPD). Extreme high ozone events can be attributed to effects of El Niño and/or the NAO, whereas in the chemically disturbed stratosphere high frequencies of extreme low total ozone values occur simultaneously with periods of strong polar ozone depletion (identified by statistical modeling with Equivalent Stratospheric Chlorine times the Volume of Polar Stratospheric Clouds) and volcanic eruptions (such as El Chichón and Pinatubo).
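The "extreme high"/"extreme low" definition sketched above amounts to fitting a GPD to each tail of the daily series and flagging days beyond fitted tail quantiles. A hedged illustration, with a Gaussian synthetic series as a placeholder for the Arosa record (the real analysis accounts for seasonality, which is omitted here):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
ozone = rng.normal(330.0, 25.0, size=30000)   # synthetic daily total ozone (DU)

def tail_level(x, q=0.99, p=0.001):
    """GPD-based level exceeded with probability p, fitted above the
    empirical q-quantile threshold."""
    u = np.quantile(x, q)
    exc = x[x > u] - u
    xi, _, sigma = stats.genpareto.fit(exc, floc=0.0)
    zeta = exc.size / x.size                   # empirical rate of exceeding u
    return u + (sigma / xi) * ((p / zeta) ** (-xi) - 1.0)

high_cut = tail_level(ozone)        # "ozone extreme high" threshold
low_cut = -tail_level(-ozone)       # "ozone extreme low" threshold (lower tail)
eho_days = ozone > high_cut
elo_days = ozone < low_cut
```

Negating the series reuses the same upper-tail fit for the lower tail, which is a common trick when only one-sided POT code is available.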
Climate extremes drive changes in functional community structure.
Boucek, Ross E; Rehage, Jennifer S
2014-06-01
The response of communities to climate extremes can be quite variable. Much of this variation has been attributed to differences in community-specific functional trait diversity, as well as community composition. Yet few if any studies have explicitly tested the response of the functional trait structure of communities following climate extremes (CEs). Recently in South Florida, two independent but sequential potential CEs took place: a 2010 cold front followed by a 2011 drought, both of which had profound impacts on a subtropical estuarine fish community. These CEs provided an opportunity to test whether the structure of South Florida fish communities following each extreme was a result of species-specific differences in functional traits. From historical temperature (1927-2012) and freshwater inflow records for the estuary (1955-2012), we determined that the cold front was a statistically extreme disturbance, while the drought was not, but rather a decadal-scale rare disturbance. The two disturbances predictably affected different parts of functional community structure and thus different component species. The cold front virtually eliminated tropical species, including large-bodied snook, mojarra species, nonnative cichlids, and striped mullet, while having little effect on temperate fishes. Likewise, the drought severely impacted freshwater fishes, including Florida gar, bowfin, and two centrarchids, with little effect on euryhaline species. Our findings illustrate the ability of this approach to predict and detect both the filtering effects of different types of disturbances and the implications of the resulting changes in community structure. Further, we highlight the value of this approach for developing predictive frameworks to better understand community responses to global change. © 2014 John Wiley & Sons Ltd.
Using Extreme Tropical Precipitation Statistics to Constrain Future Climate States
NASA Astrophysics Data System (ADS)
Igel, M.; Biello, J. A.
2017-12-01
Tropical precipitation is characterized by a rapid growth in mean intensity as the column humidity increases. This behavior is examined in both a cloud resolving model and with high-resolution observations of precipitation and column humidity from CloudSat and AIRS, respectively. The model and the observations exhibit remarkable consistency and suggest a new paradigm for extreme precipitation. We show that the total precipitation can be decomposed into a product of contributions from a mean intensity, a probability of precipitation, and a global PDF of column humidity values. We use the modeling and observational results to suggest simple, analytic forms for each of these functions. The analytic representations are then used to construct a simple expression for the global accumulated precipitation as a function of the parameters of each of the component functions. As the climate warms, extreme precipitation intensity and global precipitation are expected to increase, though at different rates. When these predictions are incorporated into the new analytic expression for total precipitation, predictions for changes due to global warming to the probability of precipitation and the PDF of column humidity can be made. We show that strong constraints can be imposed on the future shape of the PDF of column humidity but that only weak constraints can be set on the probability of precipitation. These are largely imposed by the intensification of extreme precipitation. This result suggests that understanding precisely how extreme precipitation responds to climate warming is critical to predicting other impactful properties of global hydrology. The new framework can also be used to confirm and discount existing theories for shifting precipitation.
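The decomposition described above writes accumulated precipitation as an integral over column humidity w of intensity times precipitation probability times the humidity PDF. A toy numerical sketch, where the exponential pickup, logistic onset, and Gaussian PDF are illustrative assumptions, not the paper's fitted analytic forms:

```python
import numpy as np

w = np.linspace(0.0, 1.0, 2001)   # column relative humidity (0 to 1)
dw = w[1] - w[0]

def intensity(w, w0=0.75, k=12.0):
    """Mean precipitation intensity: rapid exponential pickup with humidity."""
    return np.exp(k * (w - w0))

def prob_precip(w, w0=0.7, s=0.05):
    """Probability that it precipitates at all: logistic onset."""
    return 1.0 / (1.0 + np.exp(-(w - w0) / s))

def humidity_pdf(w, mu=0.6, sd=0.12):
    """Global PDF of column humidity: Gaussian, normalized on [0, 1]."""
    f = np.exp(-0.5 * ((w - mu) / sd) ** 2)
    return f / (f.sum() * dw)

# Accumulated precipitation = integral of the product of the three factors.
total = (intensity(w) * prob_precip(w) * humidity_pdf(w)).sum() * dw
# A moister (warmer-climate) humidity PDF raises accumulated precipitation.
total_warm = (intensity(w) * prob_precip(w) * humidity_pdf(w, mu=0.63)).sum() * dw
```

This makes the abstract's logic concrete: fixing how `total` and `intensity` change under warming constrains how `prob_precip` and `humidity_pdf` are allowed to change.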
A New Sample of Cool Subdwarfs from SDSS: Properties and Kinematics
NASA Astrophysics Data System (ADS)
Savcheva, Antonia; West, Andrew A.; Bochanski, John J.
2014-06-01
We present a new sample of M subdwarfs compiled from the 7th data release of the Sloan Digital Sky Survey. With 3517 new subdwarfs, this sample significantly enlarges the existing sample of low-mass subdwarfs. The catalog includes unprecedentedly large numbers of extreme and ultra subdwarfs. Here, we present the catalog and the statistical analysis we performed. Subdwarf template spectra are derived. We show color-color and reduced proper motion diagrams of the three metallicity classes, which separate from the disk dwarf population. The extreme and ultra subdwarfs are seen at larger values of reduced proper motion, as expected for more dynamically heated populations. We determine 3D kinematics for all of the stars with proper motions. The color-magnitude diagrams show a clear separation of the three metallicity classes, with the ultra and extreme subdwarfs being significantly closer to the main sequence than the ordinary subdwarfs. All subdwarfs lie below and to the blue of the main sequence. Based on the average (U, V, W) velocities and their dispersions, the extreme and ultra subdwarfs likely belong to the Galactic halo, while the ordinary subdwarfs are likely part of the old Galactic (or thick) disk. An extensive activity analysis of the subdwarfs is performed using chromospheric Hα emission, and 208 active subdwarfs are found. We show that while the activity fraction of ordinary subdwarfs rises with spectral class and levels off at the latest spectral classes, consistent with the behavior of M dwarfs, the activity fraction of the extreme and ultra subdwarfs is essentially flat.
NASA Astrophysics Data System (ADS)
Dodd, J. P.; Freimuth, E. J.; Olson, E. J.; Diefendorf, A. F.
2015-12-01
One of the main goals of tree ring isotope studies is to reconstruct climate-driven variations in the source water and antecedent precipitation; however, evaporation in the soil and leaves can significantly modify the isotope values of the source water. This is particularly the case in arid environments where evaporative effects are perhaps the most significant unknown variable when attempting to reconstruct regional-scale hydroclimate variations from tree ring isotope proxies. To quantify the effects of extreme aridity on α-cellulose δ18O values, we measured the oxygen isotope values of groundwater, xylem water, leaf water, and tree ring α-cellulose in an endemic species of drought-resistant trees (Prosopis tamarugo) from different microenvironments throughout the Atacama Desert of Northern Chile. Average annual precipitation is <5 mm/yr, and groundwater is the primary water source for P. tamarugo trees in the region. Groundwater δ18O values at the sample locations range from -6.7 to -9.7‰, and xylem water δ18O values record a systematic increase (ave. Δ18Ox-gw =+1.3‰; 2σ =1.0‰). Leaf waters are significantly affected by evaporative enrichment with a range of δ18O values from 7 to 23‰. This range most likely reflects a number of physiological and environmental conditions including tree size, canopy development, and sample time (i.e. morning vs. evening). However, despite the large variation in leaf water δ18O values, the average difference between the α-cellulose and groundwater is very consistent (Δ18Oc-gw = +39.7‰; 2σ =1.3‰). P. tamarugo samples were collected in austral spring, when tree growth was at its maximum; therefore, any seasonal variations in plant physiology not captured with this dataset will have a limited impact on cellulose production. 
These data demonstrate that despite the variable evaporative enrichment of 18O in the leaf water, the α-cellulose δ18O values provide a remarkably consistent record of variations in groundwater δ18O values in this extremely arid environment.
NASA Astrophysics Data System (ADS)
Agel, Laurie; Barlow, Mathew; Colby, Frank; Binder, Hanin; Catto, Jennifer L.; Hoell, Andrew; Cohen, Judah
2018-05-01
Previous work has identified six large-scale meteorological patterns (LSMPs) of dynamic tropopause height associated with extreme precipitation over the Northeast US, with extreme precipitation defined as the top 1% of daily station precipitation. Here, we examine the three-dimensional structure of the tropopause LSMPs in terms of circulation and factors relevant to precipitation, including moisture, stability, and synoptic mechanisms associated with lifting. Within each pattern, the link between the different factors and extreme precipitation is further investigated by comparing the relative strength of the factors between days with and without the occurrence of extreme precipitation. The six tropopause LSMPs include two ridge patterns, two eastern US troughs, and two troughs centered over the Ohio Valley, with a strong seasonality associated with each pattern. Extreme precipitation in the ridge patterns is associated with both convective mechanisms (instability combined with moisture transport from the Great Lakes and Western Atlantic) and synoptic forcing related to Great Lakes storm tracks and embedded shortwaves. Extreme precipitation associated with eastern US troughs involves intense southerly moisture transport and strong quasi-geostrophic forcing of vertical velocity. Ohio Valley troughs are associated with warm fronts and intense warm conveyor belts that deliver large amounts of moisture ahead of storms, but little direct quasi-geostrophic forcing. Factors that show the largest difference between days with and without extreme precipitation include integrated moisture transport, low-level moisture convergence, warm conveyor belts, and quasi-geostrophic forcing, with the relative importance varying between patterns.
Variability in winter climate and winter extremes reduces population growth of an alpine butterfly.
Roland, Jens; Matter, Stephen F
2013-01-01
We examined the long-term, 15-year pattern of population change in a network of 21 Rocky Mountain populations of Parnassius smintheus butterflies in response to climatic variation. We found that winter values of the broadscale climate variable, the Pacific Decadal Oscillation (PDO) index, were a strong predictor of annual population growth, much more so than were endogenous biotic factors related to population density. The relationship between PDO and population growth was nonlinear. Populations declined in years with extreme winter PDO values, when there were either extremely warm or extremely cold sea surface temperatures in the eastern Pacific relative to that in the western Pacific. Results suggest that more variable winters, and more frequent extremely cold or warm winters, will result in more frequent decline of these populations, a pattern exacerbated by the trend for increasingly variable winters seen over the past century.
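The nonlinear PDO relationship described above, with population declines at both PDO extremes, is naturally captured by a quadratic regression of growth rate on the winter index. A sketch with fabricated data standing in for the Parnassius smintheus series (the concave shape is built in for illustration):

```python
import numpy as np

rng = np.random.default_rng(7)
# Synthetic stand-in for the butterfly data: growth peaks at intermediate
# winter PDO and declines for extreme values of either sign.
pdo = rng.uniform(-2.5, 2.5, size=60)                    # winter PDO index
growth = 0.4 - 0.25 * pdo**2 + rng.normal(0.0, 0.1, 60)  # log growth rate

b2, b1, b0 = np.polyfit(pdo, growth, deg=2)              # quadratic regression
pdo_opt = -b1 / (2.0 * b2)                               # growth-maximizing PDO
```

A negative quadratic coefficient (`b2 < 0`) is the statistical signature of the pattern reported: more variable winters, hence more frequent extreme PDO values, imply more frequent population declines.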
Validation of extremes within the Perfect-Predictor Experiment of the COST Action VALUE
NASA Astrophysics Data System (ADS)
Hertig, Elke; Maraun, Douglas; Wibig, Joanna; Vrac, Mathieu; Soares, Pedro; Bartholy, Judith; Pongracz, Rita; Mares, Ileana; Gutierrez, Jose Manuel; Casanueva, Ana; Alzbutas, Robertas
2016-04-01
Extreme events are of widespread concern due to their damaging consequences for natural and anthropogenic systems. From science to applications, the statistical attributes of rarity, infrequent occurrence and low probability are coupled to the socio-economic attribute of strong impact. Specific end-user needs regarding information about extreme events depend on the type of application, but a common element is the demand for easily accessible climate change information with a clear description of its uncertainties and limitations. Within the Perfect-Predictor Experiment of the COST Action VALUE, extreme indices modelled with a wide range of downscaling methods are compared to reference indices calculated from observational data. The experiment uses reference data from a selection of 86 weather stations representative of the different climates in Europe. Results are presented for temperature and precipitation extremes and include aspects of the marginal distribution as well as spell-length-related aspects.
NASA Astrophysics Data System (ADS)
Dosio, Alessandro
2017-07-01
The most severe effects of global warming will be related to the frequency and severity of extreme events. We provide an analysis of projections of temperature and related extreme events for Africa based on a large ensemble of Regional Climate Models from the COordinated Regional climate Downscaling EXperiment (CORDEX). Results are presented not only by means of widely used indices but also with a recently developed Heat Wave Magnitude Index-daily (HWMId), which takes into account both heat wave duration and intensity. Results show that under RCP8.5, warming of more than 3.5 °C is projected in JFM over most of the continent, whereas in JAS temperatures over large parts of Northern Africa, the Sahara and the Arabian Peninsula are projected to increase by up to 6 °C. A large increase in the number of warm days (Tx90p) is found over subequatorial Africa, with values of more than 90% in JAS, and more than 80% in JFM over, e.g., the Gulf of Guinea, the Central African Republic, South Sudan and Ethiopia. Changes in Tn90p (warm nights) are usually larger, with some models projecting Tn90p reaching 95% from around 2060 even under RCP4.5 over the Gulf of Guinea and the Sahel. Results also show that the total length of heat spells projected to occur normally (i.e. once every 2 years) under RCP8.5 may be longer than that occurring once every 30 years under the lower emission scenario. The recently developed HWMId index makes it possible to investigate the relationship between heat wave length and intensity; in particular, it is shown that very intense heat waves such as those occurring over the Horn of Africa may have HWMId values larger than those of longer, but relatively weak, heat waves over West Africa.
Power laws and extreme values in antibody repertoires
NASA Astrophysics Data System (ADS)
Boyer, Sebastien; Biswas, Dipanwita; Scaramozzino, Natale; Kumar, Ananda Soshee; Nizak, Clément; Rivoire, Olivier
2015-03-01
Evolution by natural selection involves the succession of three steps: mutation, selection and proliferation. We are interested in describing and characterizing the result of selection over a population of many variants. After selection, this population will be dominated by the few best variants, those with the highest propensity to be selected, or highest ``selectivity.'' We ask the following question: how is the selectivity of the best variants distributed in the population? Extreme value theory, which characterizes the extreme tail of probability distributions in terms of a few universality classes, has been proposed to describe it. To test this proposition and identify the relevant universality class, we performed quantitative in vitro experimental selections of libraries of >10^5 antibodies using the technique of phage display. Data obtained by high-throughput sequencing allow us to fit the selectivity distribution over more than two decades. In most experiments, the results show a striking power law for the selectivity distribution of the top antibodies, consistent with extreme value theory.
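A standard way to quantify the power-law tail reported above is the Hill estimator applied to the largest observations. A minimal sketch, with synthetic Pareto draws standing in for the sequencing-derived selectivities (the true exponent 1.8 is an arbitrary illustrative choice):

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic "selectivities": Pareto-tailed with true exponent 1.8, a stand-in
# for the selectivities of the top phage-display variants.
selectivity = rng.pareto(1.8, size=100000) + 1.0

def hill_exponent(x, k=1000):
    """Hill estimator of the power-law tail exponent from the k largest values."""
    xs = np.sort(x)
    top, u = xs[-k:], xs[-k - 1]       # top-k values and the threshold below them
    return k / np.sum(np.log(top / u))

alpha = hill_exponent(selectivity)
```

The estimator recovers the tail exponent from only the extreme order statistics, mirroring how the experiments fit the top of the selectivity distribution over two decades.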
NASA Astrophysics Data System (ADS)
Alahmadi, F.; Rahman, N. A.; Abdulrazzak, M.
2014-09-01
Rainfall frequency analysis is an essential tool for the design of water-related infrastructure. It can be used to estimate the flood magnitudes associated with a given frequency of extreme rainfall events. This study analyses the application of rainfall partial duration series (PDS) in the fast-growing city of Madinah, located in the western part of Saudi Arabia. Several statistical distributions were applied (Normal, Log-Normal, Extreme Value type I, Generalized Extreme Value, Pearson Type III, and Log-Pearson Type III), and their parameters were estimated using the method of L-moments. Several model selection criteria were also applied: the Akaike Information Criterion (AIC), the Corrected Akaike Information Criterion (AICc), the Bayesian Information Criterion (BIC) and the Anderson-Darling Criterion (ADC). The analysis indicated that the Generalized Extreme Value distribution provides the best fit to the Madinah partial duration daily rainfall series. The outcome of such an evaluation can contribute toward better design criteria for flood management, especially flood protection measures.
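The distribution-comparison step above can be sketched as fitting several candidate distributions and ranking them by AIC. Note two simplifications versus the study: parameters here are estimated by plain maximum likelihood (scipy) rather than L-moments, and the sample is synthetic GEV data rather than Madinah rainfall.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Synthetic "rainfall" sample from a GEV (scipy's c = -0.2 corresponds to a
# heavy upper tail); a stand-in for the Madinah partial duration series.
rain = stats.genextreme.rvs(c=-0.2, loc=40.0, scale=15.0,
                            size=400, random_state=rng)

candidates = {
    "Normal": stats.norm,
    "Log Normal": stats.lognorm,
    "Extreme Value I": stats.gumbel_r,
    "GEV": stats.genextreme,
}

aic = {}
for name, dist in candidates.items():
    params = dist.fit(rain)                       # maximum likelihood fit
    loglik = np.sum(dist.logpdf(rain, *params))
    aic[name] = 2 * len(params) - 2 * loglik      # Akaike criterion

best = min(aic, key=aic.get)                      # lowest AIC wins
```

Swapping in AICc, BIC, or the Anderson-Darling criterion only changes the scoring line, not the structure of the comparison.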
Impact of an extreme climatic event on community assembly.
Thibault, Katherine M; Brown, James H
2008-03-04
Extreme climatic events are predicted to increase in frequency and magnitude, but their ecological impacts are poorly understood. Such events are large, infrequent, stochastic perturbations that can change the outcome of entrained ecological processes. Here we show how an extreme flood event affected a desert rodent community that has been monitored for 30 years. The flood (i) caused catastrophic, species-specific mortality; (ii) eliminated the incumbency advantage of previously dominant species; (iii) reset long-term population and community trends; (iv) interacted with competitive and metapopulation dynamics; and (v) resulted in rapid, wholesale reorganization of the community. This and a previous extreme rainfall event were punctuational perturbations-they caused large, rapid population- and community-level changes that were superimposed on a background of more gradual trends driven by climate and vegetation change. Captured by chance through long-term monitoring, the impacts of such large, infrequent events provide unique insights into the processes that structure ecological communities.
NASA Astrophysics Data System (ADS)
Arain, M. A.
2017-12-01
North American temperate forests are a critical component of the global carbon cycle and regional water resources. A large portion of these forests has traditionally been managed for timber production and other uses. The response of these forests, which are in different stages of development, to extreme weather events such as drought and heat stress, climate variability and management regimes is not fully understood. In this study, eddy covariance flux measurements in an age sequence (77-, 42-, and 14-years old as of 2016) of white pine (Pinus strobus L.) plantation forests in southern Ontario, Canada are examined to determine the impact of heat and drought stresses and climate variability over a 14-year period (2003 to 2016). The mean annual net ecosystem productivity (NEP) values over the study period were 195 ± 87, 512 ± 161 and 103 ± 103 g C m-2 year-1 in the 77-, 42- and 14-year-old forests, respectively. The youngest forest became a net carbon sink in the fifth year of its growth. Air temperature was a dominant control on carbon fluxes, and heat stress reduced photosynthesis much more than ecosystem respiration in the growing season. A large decrease in annual NEP was observed during years with heat waves. Drought stress had the strongest impact on the middle-aged forest, which had the largest carbon sink and water demand. In contrast, the young forest was more sensitive to heat stress than to drought. The severity of heat and drought stress impacts was highly dependent on the timing of these events. The simultaneous occurrence of heat and drought stress in the early growing season, as in 2012 and 2016, had a drastic negative impact on the carbon balance of these forests due to plant-soil-atmosphere feedbacks. Future research should consider the timing of extreme events, the stage of forest development, and the effects of extreme events on component fluxes.
This research helps to assess the vulnerability of managed forests and their ecological and hydrological responses to climate change and extreme weather events.
NASA Astrophysics Data System (ADS)
Ostrenga, D.; Shen, S.; Vollmer, B.; Meyer, D. L.
2017-12-01
The NASA MERRA-2 climate reanalysis contains numerous datasets for the atmosphere, land, and ocean, grouped into 95 products with an archived volume of over 300 TB. The data files are saved as hourly files, day files (at hourly time intervals) and month files containing up to 125 parameters. Due to the large number of data files and the sheer data volume, it is challenging for users, especially those in the applications research community, to work with the original data files. Most of these researchers prefer to focus on a small region or a single location using the hourly data over long time periods, to analyze extreme weather events or, say, winds for renewable energy applications. At the GES DISC, we have been working closely with the science teams and the applications user community to create several new value-added data products and high-quality services to facilitate the use of the model data for various types of research. We have tested converting hourly data from one day per file into data cubes of different sizes, such as one month, one year, or the whole mission, and then analyzed the efficiency of access to this newly structured data through various services. Initial results have shown that, compared to the original file structure, the new structure significantly improves performance for accessing long time series. We note that the performance is associated with the cube size and structure, the compression method, and how the data are accessed. The optimized data cube structure will not only improve data access but also enable better online analytic services for statistical analysis and extreme-event mining. Two case studies will be presented using the newly structured data and value-added services: the California drought and the extreme drought in the northeastern states of Brazil. Furthermore, data access and analysis through cloud storage capabilities will be investigated.
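The restructuring idea above can be illustrated with a toy in-memory example: hourly "day files" (one 3-D array per day) are concatenated into a single [time, lat, lon] cube, so a long single-location time series becomes one slice instead of a read that touches every file. Shapes, and the in-memory setting itself, are illustrative stand-ins for the archive's actual file layout.

```python
import numpy as np

rng = np.random.default_rng(2)
n_days, ny, nx = 30, 10, 12
# One array of hourly 2-D grids per "day file" (toy sizes, synthetic values).
day_files = [rng.normal(size=(24, ny, nx)) for _ in range(n_days)]

# Restructure into a single [time, lat, lon] cube for the whole period.
cube = np.concatenate(day_files, axis=0)

# A long single-location time series is now one slice of the cube ...
series = cube[:, 4, 7]

# ... instead of an extraction that touches every day file, which is what
# makes long time series slow against a file-per-day archive layout.
series_files = np.concatenate([d[:, 4, 7] for d in day_files])
```

On disk, the same principle is applied through chunked formats, where chunk shape and compression drive the access performance the abstract measures.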
Variance analysis of forecasted streamflow maxima in a wet temperate climate
NASA Astrophysics Data System (ADS)
Al Aamery, Nabil; Fox, James F.; Snyder, Mark; Chandramouli, Chandra V.
2018-05-01
Coupling global climate models, hydrologic models and extreme value analysis provides a method to forecast streamflow maxima; however, the elusive variance structure of the results hinders confidence in application. Directly correcting the bias of forecasts using the relative change between forecast and control simulations has been shown to marginalize hydrologic uncertainty, reduce model bias, and remove systematic variance when predicting mean monthly and mean annual streamflow, prompting our investigation for streamflow maxima. We assess the variance structure of streamflow maxima using realizations of emission scenario, global climate model type and project phase, downscaling method, bias correction, extreme value method, and hydrologic model inputs and parameterization. Results show that the relative change of streamflow maxima did not depend on systematic variance between the annual maxima and peak-over-threshold methods, although we stress that researchers must strictly adhere to the rules of extreme value theory when applying the peak-over-threshold method. Regardless of which method is applied, extreme value model fitting does add variance to the projection, and the variance is an increasing function of the return period. Unlike the relative change of mean streamflow, the variance of the maxima's relative change depended on all climate model factors tested as well as on hydrologic model inputs and calibration. Ensemble projections forecast an increase of streamflow maxima for 2050 with pronounced forecast standard error, including increases of +30(±21), +38(±34) and +51(±85)% for the 2-, 20- and 100-year streamflow events for the wet temperate region studied. The variance of the maxima projections was dominated by climate model factors and extreme value analyses.
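One claim above, that extreme-value fitting adds variance which grows with the return period, can be illustrated by bootstrapping GEV return levels. A sketch on synthetic annual maxima (the GEV parameters and bootstrap design are illustrative assumptions, not the study's configuration):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
# Synthetic annual streamflow maxima (scipy c = -0.1: mildly heavy tail).
maxima = stats.genextreme.rvs(c=-0.1, loc=100.0, scale=30.0,
                              size=60, random_state=rng)

def return_levels(x, periods=(2, 20, 100)):
    """GEV quantiles for the given return periods (years)."""
    c, loc, scale = stats.genextreme.fit(x)
    q = [1.0 - 1.0 / T for T in periods]
    return stats.genextreme.ppf(q, c, loc=loc, scale=scale)

levels = return_levels(maxima)

# Bootstrap the fit to see how sampling variance grows with return period.
boot = np.array([return_levels(rng.choice(maxima, size=maxima.size))
                 for _ in range(200)])
stderr = boot.std(axis=0)      # standard error per return period
```

The 100-year level's bootstrap standard error exceeds the 2-year level's, the same qualitative pattern as the widening ± ranges quoted in the abstract.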
Koyama, Tetsuo; Marumoto, Kohei; Miyake, Hiroji; Domen, Kazuhisa
2013-11-01
This study examined the relationship between fractional anisotropy (FA) values of magnetic resonance-diffusion tensor imaging (DTI) and motor outcome (1 month after onset) in 15 patients with hemiparesis after ischemic stroke of corona radiata lesions. DTI data were obtained on days 14-18. FA values within the cerebral peduncle were analyzed using a computer-automated method. Motor outcome of hemiparesis was evaluated according to Brunnstrom stage (BRS; 6-point scale: severe to normal) for separate shoulder/elbow/forearm, wrist/hand, and lower extremity functions. The ratio of FA values in the affected hemisphere to those in the unaffected hemisphere (rFA) was assessed in relation to the BRS data (Spearman rank correlation test, P<.05). rFA values ranged from .715 to 1.002 (median=.924). BRS ranged from 1 to 6 (median=4) for shoulder/elbow/forearm, from 1 to 6 (median=5) for wrist/hand, and from 2 to 6 (median=4) for the lower extremities. Analysis revealed statistically significant relationships between rFA and upper extremity functions (correlation coefficient=.679 for shoulder/elbow/forearm and .706 for wrist/hand). Although slightly less evident, the relationship between rFA and lower extremity function was also statistically significant (correlation coefficient=.641). FA values within the cerebral peduncle are moderately associated with the outcome of both upper and lower extremity functions, suggesting that DTI may be applicable for outcome prediction in stroke patients with corona radiata infarct. Copyright © 2013 National Stroke Association. Published by Elsevier Inc. All rights reserved.
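The core statistic above is a Spearman rank correlation between the FA ratio (rFA) and Brunnstrom stage. A sketch with fifteen fabricated value pairs (a positive monotone trend is built in; these are placeholders, not the patients' data):

```python
import numpy as np
from scipy import stats

# Fifteen fabricated (rFA, Brunnstrom stage) pairs, illustrative only.
rfa = np.array([0.72, 0.75, 0.80, 0.83, 0.86, 0.88, 0.90, 0.91,
                0.92, 0.93, 0.95, 0.96, 0.97, 0.99, 1.00])
brs = np.array([1, 1, 2, 2, 3, 3, 4, 3, 4, 5, 4, 5, 6, 5, 6])

# Spearman handles the ordinal, tied Brunnstrom scale via rank correlation.
rho, p_value = stats.spearmanr(rfa, brs)
```

Spearman is the appropriate choice here because the Brunnstrom stage is an ordinal 6-point scale with ties, where a Pearson correlation on raw values would be harder to justify.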
Will the warmer temperature bring the more intensity precipitation?
NASA Astrophysics Data System (ADS)
Yutong, Z., II; Wang, T.
2017-12-01
Over the past several decades, climate change has been amplified over the Tibetan Plateau (TP), with a warming trend almost twice as large as the global average. In sharp contrast, there is a large spatial discrepancy in the variation of precipitation extremes, with increasing trends found in the southern and decreasing trends in the central TP. These features motivate an urgent need for an observation-based understanding of how precipitation extremes respond to climate change. Here we examine the relation between precipitation intensity and atmospheric temperature, dew point temperature (Td) and convective available potential energy (CAPE) over the Tibetan Plateau. Owing to the influences of the westerlies and the Indian monsoon on Tibetan climate, the stations can be divided into three sub-regions: the westerlies region (north of 35°N, N = 28), the monsoon region (south of 30°N, N = 31), and the transition region (between 30°N and 35°N, N = 48). We found that precipitation intensity does not follow the Clausius-Clapeyron (C-C) relation with temperature, and there is a mix of positive and negative slopes. To better understand why the scaling differs between regions, we repeated the analysis using dew point temperature in place of temperature. Although there is significant variability in relative humidity values, at most stations there appears to be a general increase in relative humidity with warming. The observed rise in relative humidity may help explain the negative scaling of extreme precipitation in the westerlies and monsoon domains, since the primary reason precipitation extremes are expected to increase is that a warmer atmosphere can "hold" more moisture. This suggests that extreme precipitation depends not only on how much moisture the atmosphere can hold, but also on how much moisture actually exists in the atmosphere.
To understand the role of dynamics in extreme precipitation, we repeat the intensity analysis using ln(CAPE) as the regressor. CAPE is the vertical integral of parcel buoyancy between the level of free convection and the level of neutral buoyancy. We find that almost all pixels show positive slopes that pass the 0.05 confidence limit. We conclude that the intensity of moist convection is important for extreme precipitation.
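The scaling analysis described above is typically done by binning precipitation intensity by temperature (or dew point), taking a high percentile per bin, and regressing its logarithm on the bin value; a slope of ln(1.07) per degree corresponds to the C-C rate of about 7%/°C. A sketch on synthetic data constructed to follow C-C scaling exactly, plus noise (the bin width and percentile are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(5)
td = rng.uniform(0.0, 20.0, size=50000)              # dew point temperature, deg C
# Synthetic intensities built to scale at 7 %/deg C (the C-C rate), with noise.
p = np.exp(np.log(1.07) * td) * rng.lognormal(0.0, 0.6, td.size)

edges = np.arange(0.0, 22.0, 2.0)                    # 2-degree Td bins
centers = 0.5 * (edges[:-1] + edges[1:])
p99 = np.array([np.percentile(p[(td >= lo) & (td < hi)], 99)
                for lo, hi in zip(edges[:-1], edges[1:])])

slope, _ = np.polyfit(centers, np.log(p99), 1)       # log-linear scaling fit
scaling_pct = 100.0 * (np.exp(slope) - 1.0)          # % increase per deg C
```

Applied to real station data, slopes below zero in a region are the "negative scaling" the abstract reports, and repeating the fit with ln(CAPE) in place of Td gives the dynamical counterpart.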
Coastal-storm Inundation and Sea-level Rise in New Zealand
NASA Astrophysics Data System (ADS)
Stephens, S. A.; Bell, R.
2016-12-01
Coastal-storm inundation is a growing problem in New Zealand. It happens occasionally, when the combined forces of weather and sea line up, inundating low-elevation land, eroding the coast, and backing up rivers and stormwater systems to cause inland flooding. It becomes a risk where we have placed buildings and infrastructure too close to the coast. Coastal-storm inundation is not a new problem; it has happened historically, but it is becoming more frequent as the sea level continues to rise. From analyses of historic extreme sea-level events, we show how the different sea-level components, such as tide and storm surge, contribute to extreme sea levels and how these components vary around New Zealand. Recent sea-level analyses reveal some large storm surges, bigger than previously reported; we show the type of weather patterns that drive them and how this leads to differences in storm surge potential between the east and west coasts. Although large and damaging storm-tides have occurred historically, we show that there is potential for considerably larger elevations to be reached in the "perfect storm", and we estimate the likelihood of such extreme events occurring. Sea-level rise (SLR) will greatly increase the frequency, depth and consequences of coastal-storm inundation in the future. We show an application of a new method to determine the increasing frequency of extreme sea levels with SLR, one which integrates the extreme tail with regularly occurring high tides. We present spatial maps of several extreme sea-level threshold exceedance statistics for a case study at Mission Bay, Auckland, New Zealand. The maps show how the local community is likely to face decision points at various SLR thresholds, and we conclude that coastal hazard assessments should ideally use several SLR scenarios and time windows over the next 100 years or more to support the decision-making process for future coastal adaptation and the timing of response options.
In tandem, coastal hazard assessments should also provide information on SLR values linked to expected inundation frequency or depth. This can be linked to plausible timeframes for SLR thresholds to determine when critical decision points for adaptation might be reached, and we show how this might be achieved.
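The threshold-exceedance idea described above can be sketched numerically: shift an entire (toy) high-tide distribution upward by an SLR increment and count how often a fixed inundation threshold is exceeded. All numbers below are illustrative assumptions, not values from the study.

```python
# Illustrative sketch: SLR shifts the whole sea-level distribution upward,
# so a fixed inundation threshold is exceeded far more often.
import math

def high_tide_series(n_days, amplitude=1.0, period=14.77):
    """Toy spring-neap modulated daily high-tide heights (m above MSL)."""
    return [amplitude * (0.8 + 0.2 * math.cos(2 * math.pi * d / period))
            for d in range(n_days)]

def exceedances_per_year(threshold, slr, tides):
    """Count high tides exceeding a fixed threshold after adding SLR."""
    days = sum(1 for h in tides if h + slr > threshold)
    return days * 365.0 / len(tides)

tides = high_tide_series(3650)                    # ten years of daily high tides
base = exceedances_per_year(1.05, 0.0, tides)     # current climate: rare
future = exceedances_per_year(1.05, 0.3, tides)   # +0.3 m SLR: frequent
```

The qualitative point carries over to the real analysis: a modest upward shift of the distribution turns a rare exceedance into a regular one.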
Catastrophe loss modelling of storm-surge flood risk in eastern England.
Muir Wood, Robert; Drayton, Michael; Berger, Agnete; Burgess, Paul; Wright, Tom
2005-06-15
Probabilistic catastrophe loss modelling techniques, comprising a large stochastic set of potential storm-surge flood events, each assigned an annual rate of occurrence, have been employed for quantifying risk in the coastal flood plain of eastern England. Based on the tracks of the causative extratropical cyclones, historical storm-surge events are categorized into three classes, with distinct windfields and surge geographies. Extreme combinations of "tide with surge" are then generated from an extreme value distribution developed for each class. Fragility curves are used to determine the probability and magnitude of breaching relative to water levels and wave action for each section of sea defence. Based on the time-history of water levels in the surge, and the simulated configuration of breaching, flow is time-stepped through the defences and propagated into the flood plain using a 50 m horizontal-resolution digital elevation model. Based on the values and locations of the building stock in the flood plain, losses are calculated using vulnerability functions linking flood depth and flood velocity to measures of property loss. The outputs from this model for a UK insurance industry portfolio include "loss exceedance probabilities" as well as "average annualized losses", which can be employed for calculating coastal flood risk premiums in each postcode.
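The two outputs named above, loss exceedance probabilities and average annualized losses, follow directly from a stochastic event set in which each event carries an annual occurrence rate and a loss. A minimal sketch, with made-up rates and losses and assuming independent Poisson event occurrences:

```python
import math

events = [  # (annual rate of occurrence, loss in GBP millions) -- illustrative
    (0.10, 50.0),
    (0.02, 400.0),
    (0.005, 2000.0),
]

def average_annualized_loss(events):
    """AAL: rate-weighted sum of event losses."""
    return sum(rate * loss for rate, loss in events)

def exceedance_probability(events, x):
    """Annual probability that some event causes a loss exceeding x,
    assuming independent Poisson event occurrences."""
    rate_above = sum(rate for rate, loss in events if loss > x)
    return 1.0 - math.exp(-rate_above)

aal = average_annualized_loss(events)          # 5 + 8 + 10 = 23.0
p_100 = exceedance_probability(events, 100.0)  # only the two larger events count
```

Real models sum over tens of thousands of events per portfolio, but the arithmetic is the same.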
NASA Astrophysics Data System (ADS)
El-Shobokshy, Mohammad S.; Al-Saedi, Yaseen G.
This paper investigates some of the air pollution problems created as a result of the Gulf war in early 1991. Temporary periods of increased dust-storm activity were observed in Saudi Arabia, presumably due to disturbance of the desert surface by the extremely large number of tanks and other war machines before and during the war. The concentrations of inhalable dust particles (<15 μm) increased during the months just after the war to about 1.5 times their values during the same months of the previous year, 1990. The total horizontal solar energy flux in Riyadh was significantly reduced during dry, cloud-free days. This is attributed to soot particles generated at an extremely high rate by the burning oil fields in Kuwait. The direct normal solar insolation was also measured at the photovoltaic solar power plant in Riyadh during these days, and significant reductions were observed due to the effective absorption of solar radiation by soot particles. The power generated by the plant during days with a polluted atmosphere was reduced by about 50-80% of the value expected for such days had the atmosphere been dry and clear.
Anomaly detection driven active learning for identifying suspicious tracks and events in WAMI video
NASA Astrophysics Data System (ADS)
Miller, David J.; Natraj, Aditya; Hockenbury, Ryler; Dunn, Katherine; Sheffler, Michael; Sullivan, Kevin
2012-06-01
We describe a comprehensive system for learning to identify suspicious vehicle tracks from wide-area motion imagery (WAMI) video. First, since the road network for the scene of interest is assumed unknown, agglomerative hierarchical clustering is applied to all spatial vehicle measurements, resulting in spatial cells that largely capture individual road segments. Next, for each track, extreme-value feature statistics are computed and aggregated at both the cell (speed, acceleration, azimuth) and track (range, total distance, duration) levels, to form summary (p-value based) anomaly statistics for each track. Here, to fairly evaluate tracks that travel across different numbers of spatial cells, for each cell-level feature type, a single (most extreme) statistic is chosen over all cells traveled. Finally, a novel active learning paradigm, applied to a (logistic regression) track classifier, is invoked to learn to distinguish suspicious from merely anomalous tracks, starting from anomaly-ranked track prioritization, with ground-truth labeling by a human operator. This system has been applied to WAMI video data (ARGUS), with the tracks automatically extracted by a system developed in-house at Toyon Research Corporation. Our system gives promising preliminary results in assigning high suspiciousness ranks to aerial vehicles, dismounts, and traffic violators, and in learning which features are most indicative of suspicious tracks.
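The per-track summary statistic described above (one most-extreme p-value per cell-level feature, over all cells traveled) can be sketched as follows. The empirical two-sided p-value and the -log score are illustrative choices, not necessarily the authors' exact formulation.

```python
import math

def empirical_p(value, population):
    """Two-sided empirical p-value: how extreme is `value` in `population`?"""
    n = len(population)
    upper = sum(1 for v in population if v >= value) / n
    lower = sum(1 for v in population if v <= value) / n
    return max(2.0 * min(upper, lower), 1.0 / n)   # floor avoids p == 0

def track_summary(track_cell_values, population):
    """Keep the single most extreme (smallest) p-value over all cells the
    track crossed, then convert it to an anomaly score via -log p."""
    p_min = min(empirical_p(v, population) for v in track_cell_values)
    return -math.log(p_min)

# toy example: cell-level speeds for one track vs. the population of speeds
population_speeds = list(range(1, 101))
score = track_summary([50, 99], population_speeds)   # 99 is the extreme cell
```

Taking the single most extreme statistic makes scores comparable between tracks that cross few cells and tracks that cross many.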
Nearly extremal apparent horizons in simulations of merging black holes
NASA Astrophysics Data System (ADS)
Lovelace, Geoffrey; Scheel, Mark A.; Owen, Robert; Giesler, Matthew; Katebi, Reza; Szilágyi, Béla; Chu, Tony; Demos, Nicholas; Hemberger, Daniel A.; Kidder, Lawrence E.; Pfeiffer, Harald P.; Afshari, Nousha
2015-03-01
The spin angular momentum S of an isolated Kerr black hole is bounded by the surface area A of its apparent horizon: 8πS ≤ A, with equality for extremal black holes. In this paper, we explore the extremality of individual and common apparent horizons for merging, rapidly spinning binary black holes. We consider simulations of merging black holes with equal masses M and initial spin angular momenta aligned with the orbital angular momentum, including new simulations with spin magnitudes up to S/M² = 0.994. We measure the area and (using approximate Killing vectors) the spin on the individual and common apparent horizons, finding that the inequality 8πS < A is satisfied in all cases but is very close to equality on the common apparent horizon at the instant it first appears. We also evaluate the Booth-Fairhurst extremality, whose value for a given apparent horizon depends on the scaling of the horizon’s null normal vectors. In particular, we introduce a gauge-invariant lower bound on the extremality by computing the smallest value that Booth and Fairhurst’s extremality parameter can take for any scaling. Using this lower bound, we conclude that the common horizons are at least moderately close to extremal just after they appear. Finally, following Lovelace et al (2008 Phys. Rev. D 78 084017), we construct quasiequilibrium binary-black hole initial data with ‘overspun’ marginally trapped surfaces with 8πS > A. We show that the overspun surfaces are indeed superextremal: our lower bound on their Booth-Fairhurst extremality exceeds unity. However, we confirm that these superextremal surfaces are always surrounded by marginally outer trapped surfaces (i.e., by apparent horizons) with 8πS < A. The extremality lower bound on the enclosing apparent horizon is always less than unity but can exceed the value for an extremal Kerr black hole.
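For an isolated Kerr hole the bound 8πS ≤ A can be checked directly, since in geometric units S = χM² and the horizon area is A = 8πM²(1 + √(1 − χ²)). The check below is a standalone illustration of the inequality quoted in the abstract, not part of the simulations.

```python
import math

def kerr_area(M, chi):
    """Horizon area of a Kerr black hole with dimensionless spin chi."""
    return 8.0 * math.pi * M**2 * (1.0 + math.sqrt(1.0 - chi**2))

def extremality_ratio(M, chi):
    """8*pi*S / A for Kerr; approaches 1 only as chi -> 1 (extremality)."""
    S = chi * M**2
    return 8.0 * math.pi * S / kerr_area(M, chi)

# the ratio rises monotonically toward 1 as the spin approaches extremal
ratios = [extremality_ratio(1.0, chi) for chi in (0.5, 0.994, 0.999999)]
```

The χ = 0.994 entry matches the highest spin magnitude simulated in the paper; its ratio is already close to, but strictly below, unity.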
On the precision of experimentally determined protein folding rates and φ-values
De Los Rios, Miguel A.; Muralidhara, B.K.; Wildes, David; Sosnick, Tobin R.; Marqusee, Susan; Wittung-Stafshede, Pernilla; Plaxco, Kevin W.; Ruczinski, Ingo
2006-01-01
φ-Values, a relatively direct probe of transition-state structure, are an important benchmark in both experimental and theoretical studies of protein folding. Recently, however, significant controversy has emerged regarding the reliability with which φ-values can be determined experimentally: Because φ is a ratio of differences between experimental observables it is extremely sensitive to errors in those observations when the differences are small. Here we address this issue directly by performing blind, replicate measurements in three laboratories. By monitoring within- and between-laboratory variability, we have determined the precision with which folding rates and φ-values are measured using generally accepted laboratory practices and under conditions typical of our laboratories. We find that, unless the change in free energy associated with the probing mutation is quite large, the precision of φ-values is relatively poor when determined using rates extrapolated to the absence of denaturant. In contrast, when we employ rates estimated at nonzero denaturant concentrations or assume that the slopes of the chevron arms (mf and mu) are invariant upon mutation, the precision of our estimates of φ is significantly improved. Nevertheless, the reproducibility we thus obtain still compares poorly with the confidence intervals typically reported in the literature. This discrepancy appears to arise due to differences in how precision is calculated, the dependence of precision on the number of data points employed in defining a chevron, and interlaboratory sources of variability that may have been largely ignored in the prior literature. PMID:16501226
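The sensitivity described above, φ being a ratio of differences, can be made concrete with first-order error propagation: the relative error in φ blows up when the denominator ΔΔG_eq is small. The ΔΔG values and the 0.1 kcal/mol measurement error below are illustrative assumptions.

```python
import math

def phi_value(ddG_ts, ddG_eq):
    """phi = ΔΔG‡ / ΔΔG_eq (both in kcal/mol)."""
    return ddG_ts / ddG_eq

def phi_uncertainty(ddG_ts, s_ts, ddG_eq, s_eq):
    """First-order error propagation for the ratio of two noisy differences."""
    phi = phi_value(ddG_ts, ddG_eq)
    return abs(phi) * math.sqrt((s_ts / ddG_ts) ** 2 + (s_eq / ddG_eq) ** 2)

# same phi (0.5) and same measurement error, but a mildly vs. strongly
# destabilizing mutation
small = phi_uncertainty(0.25, 0.1, 0.5, 0.1)   # ΔΔG_eq = 0.5 kcal/mol
large = phi_uncertainty(1.5, 0.1, 3.0, 0.1)    # ΔΔG_eq = 3.0 kcal/mol
```

With these numbers the uncertainty on φ is six times larger for the mild mutation, even though φ itself is identical, which is the paper's central caution.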
Futatsuka, Makoto
2015-01-01
Large-scale food poisoning caused by methylmercury was identified in Minamata, Japan, in the 1950s (Minamata Disease). Although the diagnostic criteria for the disease were controversial and difficult to apply at that time, we, the Kumamoto University Study Group, carried out a large-scale study to assess the clinical features in 1972-1973. The author reassessed the results of that study to appraise the diagnostic criteria established in 1977 on the basis of those results. A substantial number of residents in the exposed area exhibited neurologic signs, especially paresthesia of only the extremities; among male residents of Minamata City, this sign showed a positive predictive value of 0.73 and a negative predictive value of 0.23. The relative risks of paresthesia only were 2.6 (2.0-3.3) and 1.2 (0.9-1.5) in Minamata and Goshonoura, respectively, relative to Ariake (control). At least until 1977, the diagnostic criteria remained valid, although they were inadequate. Presently, a follow-up study of the certified patients may lead to the development of efficient new diagnostic criteria.
Vibration isolation using extreme geometric nonlinearity
NASA Astrophysics Data System (ADS)
Virgin, L. N.; Santillan, S. T.; Plaut, R. H.
2008-08-01
A highly deformed, slender beam (or strip), attached to a vertically oscillating base, is used in a vibration isolation application to reduce the motion of a supported mass. The isolator is a thin strip that is bent so that the two ends are clamped together, forming a loop. The clamped ends are attached to an excitation source and the supported system is attached at the loop midpoint directly above the base. The strip is modeled as an elastica, and the resulting nonlinear boundary value problem is solved numerically using a shooting method. First the equilibrium shapes of the loop with varying static loads and lengths are studied. The analysis reveals a large degree of stiffness tunability; the stiffness is dependent on the geometric configuration, which itself is determined by the supported mass, loop length, and loop self-weight. Free vibration frequencies and mode shapes are also found. Finally, the case of forced vibration is studied, and the displacement transmissibility over a large range of forcing frequencies is determined for varying parameter values. Experiments using polycarbonate strips are conducted to verify equilibrium and dynamic behavior.
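The shooting method named above converts the boundary-value problem into repeated initial-value integrations, adjusting the unknown initial slope until the far boundary condition is met. The sketch below applies the idea to a simple stand-in BVP (y'' = −y, y(0) = 0, y(π/2) = 1, whose exact initial slope is 1); the nonlinear elastica equations would replace the right-hand side.

```python
import math

def integrate(slope, n=1000):
    """RK4 integration of y'' = -y on [0, pi/2] with y(0)=0, y'(0)=slope;
    returns y at the far end."""
    h = (math.pi / 2) / n
    y, v = 0.0, slope
    f = lambda y, v: (v, -y)        # (y', v') for the first-order system
    for _ in range(n):
        k1 = f(y, v)
        k2 = f(y + h / 2 * k1[0], v + h / 2 * k1[1])
        k3 = f(y + h / 2 * k2[0], v + h / 2 * k2[1])
        k4 = f(y + h * k3[0], v + h * k3[1])
        y += h / 6 * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0])
        v += h / 6 * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1])
    return y

def shoot(target=1.0, lo=0.0, hi=5.0):
    """Bisect on the unknown initial slope until the far boundary is hit."""
    for _ in range(100):
        mid = (lo + hi) / 2
        if integrate(mid) < target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

slope = shoot()   # exact answer for this stand-in problem is y'(0) = 1
```

For the elastica the state vector is larger (angle, curvature, forces) and the unknowns are several clamped-end quantities, but the iterate-and-correct structure is the same.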
ERIC Educational Resources Information Center
Sharma, Kshitij; Chavez-Demoulin, Valérie; Dillenbourg, Pierre
2017-01-01
The statistics used in education research are based on central trends such as the mean or standard deviation, discarding outliers. This paper adopts another viewpoint that has emerged in statistics, called extreme value theory (EVT). EVT holds that the bulk of a normal distribution comprises mainly uninteresting variations, while the most…
NASA Astrophysics Data System (ADS)
Möller, Jens; Heinrich, Hartmut
2017-04-01
As a consequence of climate change, atmospheric and oceanographic extremes and their potential impacts on coastal regions are of growing concern for governmental authorities responsible for the transportation infrastructure. The highest risks for shipping as well as for rail and road traffic originate from the combined effects of extreme storm surges and heavy rainfall, which sometimes lead to insufficient dewatering of inland waterways. The German Federal Ministry of Transport and Digital Infrastructure has therefore tasked its Network of Experts to investigate the possible evolution of extreme threats for low-lying lands and especially for the Kiel Canal, which is an important shortcut for shipping between the North and Baltic Seas. In this study we present results of a comparison of an extreme value analysis (EVA) carried out on gauge observations and on values derived from a coupled regional ocean-atmosphere climate model (MPI-OM). High water levels at the coasts of the North and Baltic Seas are among the most important hazards that increase the risk of flooding of low-lying land and prevent adequate dewatering of such areas. In this study, changes in the intensity (magnitude of the extremes) and duration of extreme water levels (above a selected threshold) are investigated for several gauge stations, with data partly reaching back to 1843. Two methods are used for the extreme value statistics: (1) a stationary generalized Pareto distribution (GPD) model, and (2) a non-stationary statistical model that better reproduces the impact of climate change. Most gauge stations show an increase of the mean water level of about 1-2 mm/year, with a stronger increase of the highest water levels and a decrease (or weaker increase) of the lowest water levels. The duration of possible dewatering time intervals for the Kiel Canal was also analysed.
The results for the historical gauge station observations are compared to the statistics of modelled water levels from the coupled atmosphere-ocean climate model MPI-OM for the time interval from 1951 to 2000. We demonstrate that for high water levels the observations and MPI-OM results are in good agreement, and we provide an estimate on the decreasing dewatering potential for Kiel Canal until the end of the 21st century.
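The stationary peaks-over-threshold model named above treats exceedances of a high threshold u as generalized Pareto distributed, from which return levels follow in closed form. The parameter values below are illustrative, not the study's fits.

```python
import math

def gpd_return_level(u, sigma, xi, rate, T):
    """T-year return level for threshold u, GPD scale sigma, shape xi,
    and `rate` threshold exceedances per year on average."""
    if abs(xi) < 1e-12:                       # xi -> 0 (exponential) limit
        return u + sigma * math.log(rate * T)
    return u + (sigma / xi) * ((rate * T) ** xi - 1.0)

# illustrative numbers: threshold 1.5 m above datum, 5 exceedances/yr
z100 = gpd_return_level(u=1.5, sigma=0.25, xi=0.1, rate=5.0, T=100.0)
z20 = gpd_return_level(u=1.5, sigma=0.25, xi=0.1, rate=5.0, T=20.0)
```

A non-stationary variant, as used in the study, would let u, sigma or xi depend on time (e.g. on the rising mean water level) rather than stay fixed.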
Extreme value statistics for two-dimensional convective penetration in a pre-main sequence star
NASA Astrophysics Data System (ADS)
Pratt, J.; Baraffe, I.; Goffrey, T.; Constantino, T.; Viallet, M.; Popov, M. V.; Walder, R.; Folini, D.
2017-08-01
Context. In the interior of stars, a convectively unstable zone typically borders a zone that is stable to convection. Convective motions can penetrate the boundary between these zones, creating a layer characterized by intermittent convective mixing, and gradual erosion of the density and temperature stratification. Aims: We examine a penetration layer formed between a central radiative zone and a large convection zone in the deep interior of a young low-mass star. Using the Multidimensional Stellar Implicit Code (MUSIC) to simulate two-dimensional compressible stellar convection in a spherical geometry over long times, we produce statistics that characterize the extent and impact of convective penetration in this layer. Methods: We apply extreme value theory to the maximal extent of convective penetration at any time. We compare statistical results from simulations which treat non-local convection, throughout a large portion of the stellar radius, with simulations designed to treat local convection in a small region surrounding the penetration layer. For each of these situations, we compare simulations of different resolution, which have different velocity magnitudes. We also compare statistical results between simulations that radiate energy at a constant rate to those that allow energy to radiate from the stellar surface according to the local surface temperature. Results: Based on the frequency and depth of penetrating convective structures, we observe two distinct layers that form between the convection zone and the stable radiative zone. We show that the probability density function of the maximal depth of convective penetration at any time corresponds closely in space with the radial position where internal waves are excited. We find that the maximal penetration depth can be modeled by a Weibull distribution with a small shape parameter. 
Using these results, and building on established scalings for diffusion enhanced by large-scale convective motions, we propose a new form for the diffusion coefficient that may be used for one-dimensional stellar evolution calculations in the large Péclet number regime. These results should contribute to the 321D link.
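The Weibull finding above can be made concrete by inverting the Weibull CDF, F(x) = 1 − exp(−(x/λ)^k): a small shape parameter k produces a long right tail in the maximal penetration depth. The k and λ below are illustrative stand-ins for the fitted values, with λ in arbitrary depth units.

```python
import math

def weibull_quantile(p, k, lam):
    """Inverse CDF of the Weibull(shape k, scale lam) distribution."""
    return lam * (-math.log(1.0 - p)) ** (1.0 / k)

k, lam = 0.8, 0.05            # small shape parameter -> heavy right tail
median = weibull_quantile(0.5, k, lam)
q99 = weibull_quantile(0.99, k, lam)   # rare deep-penetration events
```

With a shape parameter below 1, the 99th percentile sits an order of magnitude beyond the median, so occasional plumes penetrate much deeper than the typical one, which is the physical picture the statistics support.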
Trends in 1970-2010 southern California surface maximum temperatures: extremes and heat waves
NASA Astrophysics Data System (ADS)
Ghebreegziabher, Amanuel T.
Daily maximum temperatures from 1970-2010 were obtained from the National Climatic Data Center (NCDC) for 28 South Coast Air Basin (SoCAB) Cooperative Network (COOP) sites. Analyses were carried out on the entire data set, as well as on the 1970-1974 and 2006-2010 sub-periods, including construction of spatial distributions and time-series trends of both summer-average and annual-maximum values and of the frequency of two and four consecutive "daytime" heat wave events. Spatial patterns of average and extreme values showed three areas consistent with climatological SoCAB flow patterns: cold coastal, warm inland low-elevation, and cool further-inland mountain top. Difference (2006-2010 minus 1970-1974) distributions of both average and extreme-value trends were consistent with a previous, shorter-period (1970-2005) study, as they showed the expected inland regional warming and a "reverse-reaction" cooling in low-elevation coastal and inland areas open to increasing sea-breeze flows. Annual-extreme trends generally showed cooling at sites below 600 m and warming at higher elevations. As the warming trends of the extremes were larger than those of the averages, regional warming thus impacts extremes more than averages. Spatial distributions of hot-day frequencies showed the expected maximum at inland low-elevation sites. Regional warming again induced increases at elevated coastal areas, while low-elevation areas showed reverse-reaction decreases.
Origin of the extremely large magnetoresistance in the semimetal YSb
Xu, J.; Ghimire, N. J.; Jiang, J. S.; ...
2017-08-29
Extremely large magnetoresistance (XMR) was recently discovered in YSb, but its origin, along with that of many other XMR materials, is an active subject of debate. Here we demonstrate that YSb, with a cubic crystalline lattice and anisotropic bulk electron Fermi pockets, can be an excellent candidate for revealing the origin of XMR. We carried out angle-dependent Shubnikov-de Haas quantum oscillation measurements to determine the volume and shape of the Fermi pockets. In addition, by investigating both Hall and longitudinal magnetoresistivities, we reveal that the origin of XMR in YSb lies in its high carrier mobility combined with a diminishing Hall factor, which is obtained from the ratio of the Hall and longitudinal magnetoresistivities. The high mobility leads to a strong magnetic-field dependence of the longitudinal magnetoconductivity, while a diminishing Hall factor reveals the latent XMR hidden in the longitudinal magnetoconductivity, whose inverse has a nearly quadratic magnetic-field dependence. The Hall factor highlights the deviation of the measured magnetoresistivity from its full potential value and provides a general formulation to reveal the origin of XMR behavior in high-mobility materials, and of nonsaturating MR behavior as a whole. Our approach can be readily applied to other XMR materials.
Multiresolution modeling with a JMASS-JWARS HLA Federation
NASA Astrophysics Data System (ADS)
Prince, John D.; Painter, Ron D.; Pendell, Brian; Richert, Walt; Wolcott, Christopher
2002-07-01
CACI, Inc.-Federal has built, tested, and demonstrated the use of a JMASS-JWARS HLA Federation that supports multi-resolution modeling of a weapon system and its subsystems in a JMASS engineering and engagement model environment, while providing a realistic JWARS theater campaign-level synthetic battle space and operational context to assess the weapon system's value added and deployment/employment supportability in a multi-day, combined force-on-force scenario. Traditionally, acquisition analyses require a hierarchical suite of simulation models to address engineering, engagement, mission and theater/campaign measures of performance, measures of effectiveness and measures of merit. Configuring and running this suite of simulations and transferring the appropriate data between each model is both time-consuming and error-prone. The ideal solution would be a single simulation with the requisite resolution and fidelity to perform all four levels of acquisition analysis. However, current computer hardware technologies cannot deliver the runtime performance necessary to support the resulting extremely large simulation. One viable alternative is to integrate the current hierarchical suite of simulation models using the DoD's High Level Architecture in order to support multi-resolution modeling. An HLA integration eliminates the extremely-large-model problem, provides a well-defined and manageable mixed-resolution simulation and minimizes VV&A issues.
Analyzing phenological extreme events over the past five decades in Germany
NASA Astrophysics Data System (ADS)
Schleip, Christoph; Menzel, Annette; Estrella, Nicole; Graeser, Philipp
2010-05-01
As climate change may alter the frequency and intensity of extreme temperatures, we analysed whether the warming of the last five decades has already changed the statistics of phenological extreme events. In this context, two extreme-value statistical concepts are discussed and applied to existing phenological datasets of the German Weather Service (DWD) in order to derive probabilities of occurrence for extremely early or late phenological events. We analyse four phenological groups, "begin of flowering", "leaf foliation", "fruit ripening" and "leaf colouring", as well as DWD indicator phases of the "phenological year". Additionally, we put an emphasis on a between-species analysis, comparing differences in extreme onsets between three common northern conifers, and we conduct a within-species analysis with different phases of horse chestnut throughout a year. The first statistical approach fits the data to a Gaussian model using traditional statistical techniques and then analyses the extreme quantile. The key point of this approach is the fitting of an appropriate probability density function (PDF) to the observed data and the assessment of how the PDF parameters change over time. The full analytical description in terms of the estimated PDF for defined time steps of the observation period allows probability assessments of extreme values for, e.g., annual or decadal time steps. Related to this approach is the possibility of counting the onsets that fall within our defined extreme percentiles. Estimating the probability of extreme events on the basis of the whole data set stands in contrast to analyses with the generalized extreme value (GEV) distribution. The second approach deals with the extreme PDFs themselves and fits the GEV distribution to annual minima of phenological series to provide useful estimates of return levels.
For flowering and leaf unfolding phases, exceptionally early extremes are seen since the mid-1980s, especially in the single years 1961, 1990 and 2007, whereas exceptionally late events are seen in the year 1970. Summer phases such as fruit ripening exhibit stronger shifts to early extremes than spring phases. Leaf colouring phases reveal an increasing probability of late extremes. The GEV-estimated 100-year events for Picea, Pinus and Larix correspond to extremely early onsets of about -27, -31.48 and -32.79 days, respectively. If we assume non-stationary minimum data, we get a more extreme 100-year event of about -35.40 days for Picea, but associated with wider confidence intervals. The GEV is simply another probability distribution, but for purposes of extreme analysis in phenology it should be considered as equally important as (if not more important than) the Gaussian PDF approach.
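The 100-year early events quoted above come from inverting a GEV fitted to annual minima. The standard trick is to negate the minima, fit a GEV for maxima, and read the return level off the inverse CDF at probability 1 − 1/T; negating back gives the early-onset anomaly. The location, scale and shape below are illustrative, not the paper's fits.

```python
import math

def gev_return_level(mu, sigma, xi, T):
    """T-year return level of a GEV for maxima (shape xi != 0)."""
    y = -math.log(1.0 - 1.0 / T)
    return mu + (sigma / xi) * (y ** (-xi) - 1.0)

# negated onset anomalies = days *before* the long-term mean onset;
# illustrative GEV parameters for these negated annual minima
level_max = gev_return_level(mu=15.0, sigma=5.0, xi=-0.1, T=100.0)
early_100yr = -level_max    # back on the original (minima) scale, in days
```

The negative shape parameter used here gives a bounded tail, which is why the non-stationary fit in the abstract (shifting location with warming) produces a more extreme 100-year event than the stationary one.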
NASA Astrophysics Data System (ADS)
Wanders, Niko; Wada, Yoshihide
2015-12-01
Long-term hydrological forecasts are important to increase our resilience and preparedness to extreme hydrological events. The skill in these forecasts is still limited due to large uncertainties inherent in hydrological models and poor predictability of long-term meteorological conditions. Here we show that strong (lagged) correlations exist between four different major climate oscillation modes and modeled and observed discharge anomalies over a 100 year period. The strongest correlations are found between the El Niño-Southern Oscillation signal and river discharge anomalies all year round, while North Atlantic Oscillation and Antarctic Oscillation time series are strongly correlated with winter discharge anomalies. The correlation signal is significant for periods up to 5 years for some regions, indicating a high added value of this information for long-term hydrological forecasting. The results suggest that long-term hydrological forecasting could be significantly improved by including the climate oscillation signals and thus improve our preparedness for hydrological extremes in the near future.
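The (lagged) correlations described above boil down to correlating a climate-index series with a discharge-anomaly series shifted by some number of months. The synthetic series below, in which discharge simply echoes the index three months later, only illustrate the mechanics.

```python
import math

def pearson(x, y):
    """Plain Pearson correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def lagged_correlation(index, discharge, lag):
    """Correlate index at time t with discharge at time t + lag (months)."""
    return pearson(index[:len(index) - lag], discharge[lag:])

# synthetic example: discharge echoes the index with a 3-month delay
index = [math.sin(0.3 * t) for t in range(200)]
discharge = [math.sin(0.3 * (t - 3)) for t in range(200)]
best = lagged_correlation(index, discharge, 3)
```

In the study the index would be, e.g., an ENSO series and the discharge series a modeled or observed anomaly, with significance then assessed across lags of up to several years.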
Modeling, Forecasting and Mitigating Extreme Earthquakes
NASA Astrophysics Data System (ADS)
Ismail-Zadeh, A.; Le Mouel, J.; Soloviev, A.
2012-12-01
Recent earthquake disasters have highlighted the importance of multi- and trans-disciplinary studies of earthquake risk. A major component of earthquake disaster risk analysis is hazards research, which should cover not only a traditional assessment of ground shaking, but also studies of geodetic, paleoseismic, geomagnetic, hydrological, deep-drilling and other geophysical and geological observations, together with comprehensive modeling of earthquakes and forecasting of extreme events. Extreme earthquakes (large-magnitude and rare events) are manifestations of the complex behavior of the lithosphere, structured as a hierarchical system of blocks of different sizes. Understanding of the physics and dynamics of these extreme events comes from observations, measurements and modeling. A quantitative approach to simulating earthquakes in models of fault dynamics will be presented. The models reproduce basic features of the observed seismicity (e.g., the frequency-magnitude relationship, clustering of earthquakes, occurrence of extreme seismic events). They provide a link between geodynamic processes and seismicity, allow studying extreme events and the influence of fault network properties on seismic patterns and seismic cycles, and assist, in a broader sense, in earthquake forecast modeling. Some aspects of the predictability of large earthquakes (how well can large earthquakes be predicted today?) will also be discussed, along with possibilities for mitigating earthquake disasters (e.g., 'inverse' forensic investigations of earthquake disasters).
A storm severity index based on return levels of wind speeds
NASA Astrophysics Data System (ADS)
Becker, Nico; Nissen, Katrin M.; Ulbrich, Uwe
2015-04-01
European windstorms related to extra-tropical cyclones cause considerable damage to infrastructure during the winter season. Leckebusch et al. (2008) introduced a storm severity index (SSI) based on exceedances of the local 98th percentile of wind speeds, resting on the assumption that (insured) damage usually occurs within the upper 2%-quantile of the local wind-speed distribution (i.e. when the 98th percentile is exceeded). However, critical infrastructure, for example related to the power network or the transportation system, is usually designed to withstand wind speeds reaching the local 50-year return level, which is much higher than the 98th percentile. The aim of this work is to use the 50-year return level to develop a modified SSI which takes into account only extreme wind speeds relevant to critical infrastructure. As a first step, we use the block-maxima approach to estimate the spatial distribution of return levels by fitting the generalized extreme value (GEV) distribution to wind speeds retrieved from different reanalysis products. We show that the spatial distributions of the 50-year return levels derived from different reanalyses agree well within large parts of Europe; the differences between the reanalyses are largely within the range of the uncertainty intervals of the estimated return levels. As a second step, the exceedances of the 50-year return level are evaluated and compared to the exceedances of the 98th percentile for different extreme European windstorms. The areas where wind speeds exceed the 50-year return level in the reanalysis data largely agree with the areas where the largest damages were reported, e.g. France in the case of "Lothar" and "Martin" and Central Europe in the case of "Kyrill". Leckebusch, G. C., Renggli, D., & Ulbrich, U. (2008). Development and application of an objective storm severity measure for the Northeast Atlantic region. Meteorologische Zeitschrift, 17(5), 575-587.
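The SSI idea can be sketched as follows: cube the relative exceedance of a per-cell reference wind speed and sum over grid cells and time steps, where the reference is the 98th percentile in the original index or the 50-year return level in the modification proposed here. The wind values and references below are toy numbers, and the cubic form is the one commonly attributed to Leckebusch et al.; treat details as an assumption.

```python
def severity_index(wind_fields, reference):
    """wind_fields: list of time steps, each a list of per-cell wind speeds;
    reference: per-cell reference speed (same length as each time step)."""
    ssi = 0.0
    for field in wind_fields:
        for v, v_ref in zip(field, reference):
            if v > v_ref:                      # only exceedances contribute
                ssi += (v / v_ref - 1.0) ** 3  # cubed relative exceedance
    return ssi

winds = [[20.0, 35.0], [35.0, 24.0]]   # two time steps, two cells (m/s)
ssi_p98 = severity_index(winds, [25.0, 25.0])   # 98th-percentile reference
ssi_rl50 = severity_index(winds, [32.0, 32.0])  # 50-year return-level reference
```

Raising the reference to the return level, as the modified index does, filters the sum down to the rare exceedances that actually threaten engineered infrastructure.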
Weighted mining of massive collections of p-values by convex optimization.
Dobriban, Edgar
2018-06-01
Researchers in data-rich disciplines (think of computational genomics and observational cosmology) often wish to mine large bodies of p-values looking for significant effects, while controlling the false discovery rate or family-wise error rate. Increasingly, researchers also wish to prioritize certain hypotheses, for example, those thought to have larger effect sizes, by upweighting, and to impose constraints on the underlying mining, such as monotonicity along a certain sequence. We introduce Princessp, a principled method for performing weighted multiple testing by constrained convex optimization. Our method elegantly allows one to prioritize certain hypotheses through upweighting and to discount others through downweighting, while constraining the underlying weights involved in the mining process. When the p-values derive from monotone likelihood ratio families such as the Gaussian means model, the new method allows exact solution of an important optimal weighting problem previously thought to be non-convex and computationally infeasible. Our method scales to massive data set sizes. We illustrate the applications of Princessp on a series of standard genomics data sets and offer comparisons with several previous 'standard' methods. Princessp offers both ease of operation and the ability to scale to extremely large problem sizes. The method is available as open-source software from github.com/dobriban/pvalue_weighting_matlab (accessed 11 October 2017).
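Weighted multiple testing, the setting Princessp optimizes over, can be illustrated with its simplest instance, weighted Bonferroni: hypothesis i is tested at level w_i·α/n, with nonnegative weights averaging 1, so upweighted hypotheses get a larger share of the error budget. The weights and p-values below are illustrative; Princessp's own weights come from constrained convex optimization, which is not reproduced here.

```python
def weighted_bonferroni(p_values, weights, alpha=0.05):
    """Reject hypothesis i iff p_i <= w_i * alpha / n.  Weights must be
    nonnegative and average to 1 to preserve family-wise error control."""
    n = len(p_values)
    assert abs(sum(weights) / n - 1.0) < 1e-9
    return [p <= w * alpha / n for p, w in zip(p_values, weights)]

p = [0.004, 0.020, 0.300]
uniform = weighted_bonferroni(p, [1.0, 1.0, 1.0])
tilted = weighted_bonferroni(p, [0.3, 2.4, 0.3])  # prioritize hypothesis 2
```

With uniform weights only the first hypothesis survives; tilting the budget toward the second hypothesis lets it through as well, at the price of stricter thresholds elsewhere, which is exactly the trade-off the optimization formalizes.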
NASA Technical Reports Server (NTRS)
Burris, John; McGee, Thomas; Hoegy, Walt; Newman, Paul; Lait, Leslie; Twigg, Laurence; Sumnicht, Grant; Heaps, William; Hostetler, Chris; Neuber, Roland;
2001-01-01
NASA Goddard Space Flight Center's Airborne Raman Ozone, Temperature and Aerosol Lidar (AROTEL) measured extremely cold temperatures during all three deployments (December 1-16, 1999; January 14-29, 2000; and February 27-March 15, 2000) of the SAGE III Ozone Loss and Validation Experiment (SOLVE). Temperatures were significantly below values observed in previous years, with large regions regularly below 191 K and frequent temperature retrievals yielding values at or below 187 K. Temperatures well below the saturation point of type I polar stratospheric clouds (PSCs) were regularly encountered, but their presence was not well correlated with PSCs observed by the NASA Langley Research Center's Aerosol Lidar co-located with AROTEL. Temperature measurements by meteorological sondes launched within areas traversed by the DC-8 showed minimum temperatures consistent in time and vertical extent with those derived from AROTEL data. Calculations to establish whether PSCs could exist at measured AROTEL temperatures and observed mixing ratios of nitric acid and water vapor showed large regions favorable to PSC formation. On several occasions, measured AROTEL temperatures up to 10 K below the NAT saturation temperature were insufficient to produce PSCs even though measured values of nitric acid and water were sufficient for their formation.
Ghosh, Purabi R.; Fawcett, Derek; Sharma, Shashi B.; Poinern, Gerrard E. J.
2017-01-01
The quantities of organic waste produced globally by aquaculture and horticulture are extremely large and offer an attractive renewable source of biomolecules and bioactive compounds. The availability of such large and diverse waste streams creates a unique opportunity to develop new recycling and food waste utilisation strategies. The aim of this review is to report the current status of research in the emerging field of producing high-value nanoparticles from food waste. Eco-friendly biogenic processes are quite rapid and are usually carried out at normal room temperature and pressure. These alternative clean technologies do not rely on the toxic chemicals and solvents commonly associated with traditional nanoparticle manufacturing processes. The relatively small number of research articles in the field has been surveyed and evaluated. Among the diversity of waste types, promising candidates and their ability to produce various high-value nanoparticles are discussed, along with experimental parameters, nanoparticle characteristics, and potential pharmaceutical and biomedical applications. In spite of these advantages, a number of challenges remain, including nanoparticle reproducibility and understanding of the formation mechanisms across different food waste products. Thus, there is considerable scope and opportunity for further research in this emerging field. PMID:28773212
The multiple facets of Peto's paradox: a life-history model for the evolution of cancer suppression
Brown, Joel S.; Cunningham, Jessica J.; Gatenby, Robert A.
2015-01-01
Large animals should have higher lifetime probabilities of cancer than small animals because each cell division carries an attendant risk of mutating towards a tumour lineage. However, this is not observed—a (Peto's) paradox that suggests large and/or long-lived species have evolved effective cancer suppression mechanisms. Using the Euler–Lotka population model, we demonstrate the evolutionary value of cancer suppression as determined by the 'cost' of suppression (decreased fecundity) versus the 'cost' of cancer (reduced survivorship). Body size per se will not select for sufficient cancer suppression to explain the paradox. Rather, cancer suppression should be most extreme when the probability of non-cancer death decreases with age (e.g. alligators), maturation is delayed, fecundity rates are low and fecundity increases with age. Thus, the value of cancer suppression is predicted to be lowest in the vole (short lifespan, high fecundity) and highest in the naked mole rat (long lived with late female sexual maturity). The life history of pre-industrial humans likely selected for quite low levels of cancer suppression. In modern humans, who live much longer, this level results in unusually high lifetime cancer risks. The model predicts a lifetime risk of 49% compared with the current empirical value of 43%. PMID:26056365
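The core of the paradox, that risk compounds over cell divisions, can be sketched numerically. The per-division risk and division counts below are purely illustrative, not parameters from the Euler–Lotka model used in the paper.

```python
def lifetime_cancer_risk(p_per_division, n_divisions):
    """Probability that at least one of n independent cell divisions
    initiates a tumour lineage -- the naive compounding that Peto's
    paradox says large, long-lived animals must somehow suppress."""
    return 1.0 - (1.0 - p_per_division) ** n_divisions

# With equal per-division risk, a body with 1000x more lifetime divisions
# faces a vastly higher naive risk -- unless suppression evolves.
small_animal = lifetime_cancer_risk(1e-9, 10**7)
large_animal = lifetime_cancer_risk(1e-9, 10**10)
```

Because the naive risk saturates near 1 for large bodies, equal observed cancer rates across species imply that per-division risk itself must scale down, which is the suppression the model prices in fecundity terms.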
NASA Astrophysics Data System (ADS)
Liu, Wei; Li, Ying-jun; Jia, Zhen-yuan; Zhang, Jun; Qian, Min
2011-01-01
In the working processes of huge heavy-load manipulators, such as free forging machines, hydraulic die-forging presses, forging manipulators, heavy grasping manipulators, and large-displacement manipulators, measurement of six-dimensional heavy force/torque and real-time force feedback at the operation interface are the basis for coordinated operation control and force compliance control. They are also an effective way to raise control accuracy and achieve highly efficient manufacturing. To solve the problem of dynamically measuring six-dimensional time-varying heavy loads in extreme manufacturing processes, a novel principle of parallel load sharing for six-dimensional heavy force/torque is put forward. The measuring principle of the six-dimensional force sensor is analyzed, and a spatial model is built and decoupled. The load-sharing ratios are analyzed and calculated in the vertical and horizontal directions, and the mapping between the six-dimensional heavy force/torque to be measured and the output force is established. A finite element model of the parallel piezoelectric six-dimensional heavy force/torque sensor is set up, and its static characteristics are analyzed with ANSYS software. The main parameters affecting the load-sharing ratio are analyzed, and load-sharing experiments with different parallel-axis diameters are designed. The results show that the six-dimensional heavy force/torque sensor has good linearity, with non-linearity errors below 1%. The parallel axis shares load effectively: the larger its diameter, the better the load-sharing effect. The experimental results agree with the FEM analysis. The sensor offers a large measuring range, good linearity, high natural frequency, and high rigidity, and can be widely used in extreme environments for real-time, accurate measurement of six-dimensional time-varying huge loads on manipulators.
Extreme sea storm in the Mediterranean Sea. Trends during the 2nd half of the 20th century.
NASA Astrophysics Data System (ADS)
Pino, C.; Lionello, P.; Galati, M. B.
2009-04-01
The analysis of extreme Significant Wave Height (SWH) values and their trends is crucial for planning and managing coastal defences and off-shore activities. The analysis provided by this study covers a 44-year period (1958-2001). First, the WW3 (Wave Watch 3) model, forced with REMO-Hipocas regional model wind fields, was used to hindcast extreme SWH values over the Mediterranean basin at 0.25 deg lat-lon resolution. Subsequently, the model results were processed with ad hoc software to detect storms. GEV analysis was performed and a set of indicators for extreme SWH was computed, using the Mann-Kendall test to assess the statistical significance of trends in parameters such as the number of extreme events, their duration, and their intensity. Results suggest a transition towards weaker extremes and a milder climate over most of the Mediterranean Sea.
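The Mann-Kendall trend test applied to the storm indicators can be sketched in a few lines. This is a minimal Python version using the normal approximation and no tie correction, not the study's implementation.

```python
import math

def mann_kendall(x):
    """Mann-Kendall trend test: S counts concordant minus discordant pairs;
    z uses the normal approximation (reasonable for n >~ 10, ties ignored)."""
    n = len(x)
    s = sum((x[j] > x[i]) - (x[j] < x[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z  # |z| > 1.96 indicates a significant trend at the 5% level

# A monotonically decreasing indicator series ("weaker extremes") yields a
# strongly negative S and z.
s, z = mann_kendall([10, 9, 8, 7, 6, 5, 4, 3, 2, 1])
```

The same statistic applies to any of the indicators (event counts, durations, intensities) taken as an annual series.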
The rate of planet formation and the solar system's small bodies
NASA Technical Reports Server (NTRS)
Safronov, Viktor S.
1991-01-01
The evolution of random velocities and the mass distribution of preplanetary bodies at the early stage of accumulation are reviewed. Arguments are presented for and against the view of an extremely rapid, runaway growth of the largest bodies at this stage, with parameter values Theta approximately greater than 10^3. Two difficulties are encountered if Theta is this large: (1) bodies of the Jovian zone penetrate the asteroid zone too late and do not have time to hinder the formation of a normal-sized planet in the asteroidal zone and thereby remove a significant portion of the mass of solid matter; and (2) Uranus and Neptune cannot eject bodies from the solar system into the cometary cloud. Therefore, values of Theta less than 10^2 appear preferable.
Research in Stochastic Processes
1988-08-31
T. Hsing, J. Hüsler and M.R. Leadbetter, On the exceedance point process for a stationary sequence, Stochastic Proc. Appl. 29, 1988, 155-169. ... Nandagopalan, On exceedance point processes for "regular" sample functions, Proc. Volume, Oberwolfach Conf. on Extreme Value Theory, J. Hüsler and R. Reiss ... exceedance point processes for stationary sequences under mild oscillation restrictions, Apr. 88, Oberwolfach Conf. on Extreme Value Theory, Ed. J. Hüsler.
Quinn, Terrance; Sinkala, Zachariah
2014-01-01
We develop a general method for computing extreme value distribution (Gumbel, 1958) parameters for gapped alignments. Our approach uses mixture distribution theory to obtain associated BLOSUM matrices for gapped alignments, which in turn are used for determining significance of gapped alignment scores for pairs of biological sequences. We compare our results with parameters already obtained in the literature.
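The significance computation rests on the Gumbel (extreme value) form of the local alignment score distribution. Below is a hedged Python sketch of the standard Karlin-Altschul tail formula; the K and lambda values are placeholders, not the gapped-BLOSUM parameters the method derives.

```python
import math

def alignment_pvalue(score, m, n, K, lam):
    """P(S >= score) for a local alignment of sequences of lengths m and n
    under the extreme-value (Gumbel) model: the number of chance hits with
    score >= s is approximately Poisson with mean K*m*n*exp(-lam*s)."""
    e_value = K * m * n * math.exp(-lam * score)
    return 1.0 - math.exp(-e_value)

# The chance probability of a score drops exponentially with the score.
p_low = alignment_pvalue(30, 250, 250, K=0.13, lam=0.32)
p_high = alignment_pvalue(60, 250, 250, K=0.13, lam=0.32)
```

Determining K and lambda for gapped alignments, which have no exact theory, is precisely what the mixture-distribution approach above supplies.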
Properties of Extreme Precipitation and Their Uncertainties in 3-year GPM Precipitation Radar Data
NASA Astrophysics Data System (ADS)
Liu, N.; Liu, C.
2017-12-01
Extreme high precipitation rates are often related to flash floods and have devastating impacts on human society and the environment. To better understand these rare events, Precipitation Features (PFs) are defined from 3 years of data by grouping contiguous areas with nonzero near-surface precipitation derived using the Global Precipitation Measurement (GPM) Ku-band Precipitation Radar (KuPR). The properties of PFs with extreme precipitation rates greater than 20, 50, and 100 mm/hr, such as their geographical distribution, volumetric precipitation contribution, and seasonal and diurnal variations, are examined. In addition to large seasonal and regional variations, these rare extreme precipitation rates often contribute disproportionately to the local total precipitation. Extreme precipitation rates occur more often over land than over ocean. The challenges in retrieving extreme precipitation may stem from the attenuation correction and from large uncertainties in the Z-R relationships used to convert near-surface radar reflectivity to precipitation rates. These potential uncertainties are examined using collocated ground-based radar reflectivity and precipitation retrievals.
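The sensitivity of retrieved rain rates to the Z-R relationship can be illustrated with a simple power-law inversion. The Marshall-Palmer coefficients a=200, b=1.6 used here are a textbook default, not the KuPR operational algorithm.

```python
def rain_rate_from_dbz(dbz, a=200.0, b=1.6):
    """Invert the power-law Z-R relation Z = a * R**b to get rain rate R in
    mm/hr from reflectivity in dBZ (Z is the reflectivity factor in mm^6/m^3)."""
    z = 10.0 ** (dbz / 10.0)
    return (z / a) ** (1.0 / b)

# A 3 dB reflectivity error (e.g. from imperfect attenuation correction)
# changes the retrieved rate by roughly 50% at the extreme end.
r50 = rain_rate_from_dbz(50.0)
r53 = rain_rate_from_dbz(53.0)
```

Because the exponent 1/b amplifies reflectivity errors multiplicatively, uncertainty is largest exactly at the extreme rates the study targets.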
Extreme ultraviolet index due to broken clouds at a midlatitude site, Granada (southeastern Spain)
NASA Astrophysics Data System (ADS)
Antón, M.; Piedehierro, A. A.; Alados-Arboledas, L.; Wolfran, E.; Olmo, F. J.
2012-11-01
Cloud cover usually attenuates ultraviolet (UV) solar radiation but, under certain sky conditions, clouds may produce an enhancement effect, increasing UV levels at the surface. The main objective of this paper is to analyze an extreme UV enhancement episode recorded on 16 June 2009 at Granada (southeastern Spain). This phenomenon was characterized by a quick and intense increase in surface UV radiation under broken cloud fields (5-7 oktas) in which the Sun was surrounded by cumulus clouds (confirmed with sky images). The UV index (UVI) showed an enhancement by a factor of 4 in the course of only 30 min around midday, varying from 2.6 to 10.4 (higher than the corresponding clear-sky UVI value). Additionally, the UVI exceeded 10 (extreme erythemal risk) for about 20 consecutive minutes, with a maximum value around 11.5. An empirical model, together with the total ozone column (TOC) derived from the Global Ozone Monitoring Experiment (GOME) for the period 1995-2011, showed that a UVI of ~11.5 is substantially larger than the highest index that natural TOC variations over Granada could produce. Finally, the UV erythemal dose accumulated during the 20 min with extreme UVI values under broken cloud fields was 350 J/m2, surpassing the energy required to produce sunburn in most human skin types.
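The quoted dose is consistent with the definition of the UV index, under which 1 UVI corresponds to 25 mW/m^2 of erythemally weighted irradiance. A quick Python check, assuming a constant UVI over the interval:

```python
def erythemal_dose(uvi, duration_s):
    """Erythemal dose in J/m^2 accumulated at a constant UV index;
    by definition 1 UVI = 25 mW/m^2 of erythemally weighted irradiance."""
    return uvi * 0.025 * duration_s

# 20 minutes at UVI ~ 11.5 gives roughly the ~350 J/m^2 quoted above.
dose = erythemal_dose(11.5, 20 * 60)
```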
Isokinetic profile of elbow flexion and extension strength in elite junior tennis players.
Ellenbecker, Todd S; Roetert, E Paul
2003-02-01
Descriptive study. To determine whether bilateral differences exist in concentric elbow flexion and extension strength in elite junior tennis players. The repetitive nature of tennis frequently produces upper extremity overuse injuries. Prior research has identified tennis-specific strength adaptation in the dominant shoulder and distal upper extremity musculature of elite players. No previous study has addressed elbow flexion and extension strength. Thirty-eight elite junior tennis players were bilaterally tested for concentric elbow flexion and extension muscle performance on a Cybex 6000 isokinetic dynamometer at 90 degrees/s, 210 degrees/s, and 300 degrees/s. Repeated-measures ANOVAs were used to test for differences between extremities, muscle groups, and speed. Significantly greater (P<0.002) dominant-arm elbow extension peak torque values were measured at 90 degrees/s, 210 degrees/s, and 300 degrees/s for males. Significantly greater (P<0.002) dominant-arm single-repetition work values were also measured at 90 degrees/s and 210 degrees/s for males. No significant difference was measured between extremities in elbow flexion muscular performance in males and for elbow flexion or extension peak torque and single-repetition work values in females. No significant difference between extremities was measured in elbow flexion/extension strength ratios in females and significant differences between extremities in this ratio were only present at 210 degrees/s in males (P<0.002). These data indicate muscular adaptations around the dominant elbow in male elite junior tennis players but not females. These data have ramifications for clinicians rehabilitating upper extremity injuries in patients from this population.
NASA Astrophysics Data System (ADS)
Yang, Y.; Gan, T. Y.; Tan, X.
2017-12-01
In the past few decades, there have been more extreme climate events around the world, and Canada has suffered numerous extreme precipitation events. In this paper, trend analysis, change point analysis, probability distribution functions, principal component analysis and wavelet analysis were used to investigate the spatial and temporal patterns of extreme precipitation in Canada. Ten extreme precipitation indices were calculated using long-term daily precipitation data from 164 gauging stations. Several large-scale climate patterns, such as the El Niño-Southern Oscillation (ENSO), Pacific Decadal Oscillation (PDO), Pacific-North American pattern (PNA), and North Atlantic Oscillation (NAO), were selected to analyze the relationships between extreme precipitation and climate indices. Convective Available Potential Energy (CAPE), specific humidity, and surface temperature were employed to investigate the potential causes of the trends. The results show statistically significant positive trends for most indices, indicating increasing extreme precipitation. The majority of indices display more increasing trends along the southern border of Canada, while decreasing trends dominate in the central Canadian Prairies (CP). In addition, strong connections are found between extreme precipitation and the climate indices, and the effects of each climate pattern differ by region. Seasonal CAPE, specific humidity, and temperature are found to be closely related to Canadian extreme precipitation.
Climatic extremes improve predictions of spatial patterns of tree species
Zimmermann, N.E.; Yoccoz, N.G.; Edwards, T.C.; Meier, E.S.; Thuiller, W.; Guisan, Antoine; Schmatz, D.R.; Pearman, P.B.
2009-01-01
Understanding niche evolution, dynamics, and the response of species to climate change requires knowledge of the determinants of the environmental niche and species range limits. Mean values of climatic variables are often used in such analyses. In contrast, the increasing frequency of climate extremes suggests the importance of understanding their additional influence on range limits. Here, we assess how measures representing climate extremes (i.e., interannual variability in climate parameters) explain and predict spatial patterns of 11 tree species in Switzerland. We find clear, although comparatively small, improvement (+20% in adjusted D2, +8% and +3% in cross-validated True Skill Statistic and area under the receiver operating characteristic curve values) in models that use measures of extremes in addition to means. The primary effect of including information on climate extremes is a correction of local overprediction and underprediction. Our results demonstrate that measures of climate extremes are important for understanding the climatic limits of tree species and assessing species niche characteristics. The inclusion of climate variability will likely improve models of species range limits under future conditions, where changes in both mean climate and variability are expected.
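The True Skill Statistic used to score the species-range models is simply sensitivity plus specificity minus one. A minimal Python sketch with an invented confusion matrix:

```python
def true_skill_statistic(tp, fn, fp, tn):
    """TSS = sensitivity + specificity - 1 for presence/absence predictions;
    0 means no skill, 1 means perfect discrimination."""
    sensitivity = tp / (tp + fn)   # fraction of presences predicted present
    specificity = tn / (tn + fp)   # fraction of absences predicted absent
    return sensitivity + specificity - 1.0

# Hypothetical confusion matrix for one species' range model; correcting
# local over- and underprediction raises tn and tp, and hence the TSS.
tss = true_skill_statistic(tp=80, fn=20, fp=10, tn=90)
```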
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ghosh, Subimal; Das, Debasish; Kao, Shih-Chieh
Recent studies disagree on how rainfall extremes over India have changed in space and time over the past half century, as well as on whether the changes observed are due to global warming or regional urbanization. Although a uniform and consistent decrease in moderate rainfall has been reported, the lack of agreement about trends in heavy rainfall may be due in part to differences in the characterization and spatial averaging of extremes. Here we use extreme value theory to examine trends in Indian rainfall over the past half century in the context of long-term, low-frequency variability. We show that when generalized extreme value theory is applied to annual maximum rainfall over India, no statistically significant spatially uniform trends are observed, in agreement with previous studies using different approaches. Furthermore, our space-time regression analysis of the return levels points to increasing spatial variability of rainfall extremes over India. Our findings highlight the need for systematic examination of global versus regional drivers of trends in Indian rainfall extremes, and may help to inform flood hazard preparedness and water resource management in the region.
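The return levels analyzed above come from the GEV quantile function for annual maxima. A hedged Python sketch with illustrative parameters, not values fitted to Indian rainfall:

```python
import math

def gev_return_level(mu, sigma, xi, T):
    """T-year return level of a GEV(mu, sigma, xi) fitted to annual maxima:
    the value exceeded with probability 1/T in any given year."""
    y = -math.log(1.0 - 1.0 / T)
    if abs(xi) < 1e-9:                        # Gumbel limit as xi -> 0
        return mu - sigma * math.log(y)
    return mu + (sigma / xi) * (y ** (-xi) - 1.0)

# A heavier tail (xi > 0) inflates long return levels far more than short
# ones, which is why trends in the fitted shape matter for flood hazard.
z100_gumbel = gev_return_level(mu=60.0, sigma=15.0, xi=0.0, T=100)
z100_heavy = gev_return_level(mu=60.0, sigma=15.0, xi=0.2, T=100)
```

Regressing such return levels on space and time is the "space-time regression analysis of the return levels" the abstract refers to.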
Sequences of extremal radially excited rotating black holes.
Blázquez-Salcedo, Jose Luis; Kunz, Jutta; Navarro-Lérida, Francisco; Radu, Eugen
2014-01-10
In the Einstein-Maxwell-Chern-Simons theory the extremal Reissner-Nordström solution is no longer the single extremal solution with vanishing angular momentum, when the Chern-Simons coupling constant reaches a critical value. Instead a whole sequence of rotating extremal J=0 solutions arises, labeled by the node number of the magnetic U(1) potential. Associated with the same near horizon solution, the mass of these radially excited extremal solutions converges to the mass of the extremal Reissner-Nordström solution. On the other hand, not all near horizon solutions are also realized as global solutions.
Cazelle, Elodie; Eskes, Chantra; Hermann, Martina; Jones, Penny; McNamee, Pauline; Prinsen, Menk; Taylor, Hannah; Wijnands, Marcel V W
2015-04-01
A.I.S.E. investigated the suitability of the regulatory adopted ICE in vitro test method (OECD TG 438) with or without histopathology to identify detergent and cleaning formulations having extreme pH that require classification as EU CLP/UN GHS Category 1. To this aim, 18 extreme pH detergent and cleaning formulations were tested covering both alkaline and acidic extreme pHs. The ICE standard test method following OECD Test Guideline 438 showed good concordance with in vivo classification (83%) and good and balanced specificity and sensitivity values (83%) which are in line with the performances of currently adopted in vitro test guidelines, confirming its suitability to identify Category 1 extreme pH detergent and cleaning products. In contrast to previous findings obtained with non-extreme pH formulations, the use of histopathology did not improve the sensitivity of the assay whilst it strongly decreased its specificity for the extreme pH formulations. Furthermore, use of non-testing prediction rules for classification showed poor concordance values (33% for the extreme pH rule and 61% for the EU CLP additivity approach) with high rates of over-prediction (100% for the extreme pH rule and 50% for the additivity approach), indicating that these non-testing prediction rules are not suitable to predict Category 1 hazards of extreme pH detergent and cleaning formulations. Copyright © 2015 Elsevier Ltd. All rights reserved.
Elmallah, Randa K; Chughtai, Morad; Adib, Farshad; Bozic, Kevin J; Kurtz, Steven M; Mont, Michael A
2017-03-15
Following total hip arthroplasty, patients' perception of their postoperative improvement and health plays a large role in satisfaction with and success of the surgical procedure. The Short Form-6D (SF-6D) is a health-related quality-of-life measure that assigns a numerical value to patients' perception of their own health. The purpose of this study was to determine SF-6D values of patients after total hip arthroplasty, to determine whether score changes were clinically relevant, and to compare these with postoperative functional improvements. We evaluated 188 patients who underwent primary total hip arthroplasty at 7 institutions and who had a mean age of 69 years (range, 47 to 88 years) and a mean body mass index of 28.8 kg/m^2 (range, 19.8 to 38.9 kg/m^2). The SF-6D values were obtained from patients' SF-36 scores, and the clinical relevance of value changes was determined using effect size. Following previous research, effect sizes were considered small between 0.2 and 0.5, moderate between 0.6 and 0.8, and large at >0.8. Clinical correlation was assessed using the Lower-Extremity Activity Scale and Harris hip scores. Patients were assessed preoperatively and postoperatively at 6 months and 1, 2, 3, and 5 years. The SF-6D scores improved from preoperative values and achieved significance (p < 0.05) at all points. The effect size demonstrated good clinical relevance up to the latest follow-up: 1.27 at 6 months, 1.30 at 1 year, 1.07 at 2 years, 1.08 at 3 years, and 1.05 at 5 years. The Lower-Extremity Activity Scale improved from preoperative values by 1.8 points at 6 months, 2.0 at 1 year, 1.8 at 2 years, 1.5 at 3 years, and 1.6 at 5 years. The Harris hip score improved by 38 points at 6 months, 40 points at 1 year, 38 points at 2 years, 39 points at 3 years, and 41 points at 5 years postoperatively. The improvements in the Lower-Extremity Activity Scale and the Harris hip score were significantly positively correlated (p < 0.01) with the SF-6D scores at all time points.
SF-6D scores after total hip arthroplasty correlate with functional outcomes and have clinical relevance, as demonstrated by their effect size. Incorporating this straightforward and easy-to-use measurement tool when evaluating patients following total hip arthroplasty will facilitate future cost-utility analyses. Therapeutic Level IV. See Instructions for Authors for a complete description of levels of evidence.
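The clinical-relevance thresholds cited above (small 0.2-0.5, large >0.8) follow the usual standardized effect size logic: mean change divided by baseline standard deviation. A minimal Python illustration with invented numbers, not the study's data:

```python
def effect_size(mean_pre, mean_post, sd_pre):
    """Standardized effect size: mean change over baseline SD
    (a Cohen's-d-style statistic against the preoperative distribution)."""
    return (mean_post - mean_pre) / sd_pre

# A hypothetical pre-to-post SF-6D gain of 0.13 against a baseline SD of
# 0.10 would exceed the >0.8 cutoff for a large, clinically relevant effect.
d = effect_size(mean_pre=0.60, mean_post=0.73, sd_pre=0.10)
is_large = d > 0.8
```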
NASA Astrophysics Data System (ADS)
Rieder, Harald E.; di Rocco, Stefania; Staehelin, Johannes; Maeder, Jörg A.; Ribatet, Mathieu; Peter, Thomas; Davison, Anthony C.
2010-05-01
Tools from geostatistics and extreme value theory are applied to analyze spatial correlations in total ozone for the southern mid-latitudes. The dataset used in this study is the NIWA-assimilated total ozone dataset (Bodeker et al., 2001; Müller et al., 2008). Recently, new tools from extreme value theory (Coles, 2001; Ribatet, 2007) have been applied to the world's longest total ozone record, from Arosa, Switzerland (e.g. Staehelin 1998a,b), and 5 other long-term ground-based stations to describe extreme events in low and high total ozone (Rieder et al., 2010a,b,c). Excursions in the frequency of extreme events reveal "fingerprints" of dynamical factors, such as ENSO or NAO, and chemical factors, such as cold Arctic vortex ozone losses, as well as major volcanic eruptions of the 20th century (e.g. Gunung Agung, El Chichón, Mt. Pinatubo). Furthermore, the atmospheric loading of ozone depleting substances leads to a continuous modification of column ozone in the northern hemisphere, also with respect to extreme values (partly again in connection with polar vortex contributions). It is shown that application of extreme value theory allows the identification of many more such fingerprints than conventional time series analysis based on annual and seasonal mean values. In particular, the analysis shows the strong influence of dynamics, revealing that even moderate ENSO and NAO events have a discernible effect on total ozone (Rieder et al., 2010b,c). In the current study, patterns in spatial correlation and frequency distributions of extreme events (e.g. ELOs and EHOs) are studied for the southern mid-latitudes. We analyze whether "fingerprints" found in the northern hemisphere also occur in the southern mid-latitudes, and present new insights into spatial patterns of total ozone for this region. Within this study the influence of changes in atmospheric dynamics (e.g. 
tropospheric and lower stratospheric pressure systems, ENSO) as well as influence of major volcanic eruptions (e.g. Mt. Pinatubo) and ozone depleting substances (ODS) on column ozone over the southern mid-latitudes is analyzed for the time period 1979-2007. References: Bodeker, G.E., J.C. Scott, K. Kreher, and R.L. McKenzie, Global ozone trends in potential vorticity coordinates using TOMS and GOME intercompared against the Dobson network: 1978-1998, J. Geophys. Res., 106 (D19), 23029-23042, 2001. Coles, S.: An Introduction to Statistical Modeling of Extreme Values, Springer Series in Statistics, ISBN:1852334592, Springer, Berlin, 2001. Müller, R., Grooß, J.-U., Lemmen, C., Heinze, D., Dameris, M., and Bodeker, G.: Simple measures of ozone depletion in the polar stratosphere, Atmos. Chem. Phys., 8, 251-264, 2008. Ribatet, M.: POT: Modelling peaks over a threshold, R News, 7, 34-36, 2007. Rieder ,H.E., Staehelin, J., Maeder, J.A., Ribatet, M., Stübi, R., Weihs, P., Holawe, F., Peter, T., and A.D., Davison (2010): Extreme events in total ozone over Arosa - Part I: Application of extreme value theory, to be submitted to ACPD. Rieder, H.E., Staehelin, J., Maeder, J.A., Ribatet, M., Stübi, R., Weihs, P., Holawe, F., Peter, T., and A.D., Davison (2010): Extreme events in total ozone over Arosa - Part II: Fingerprints of atmospheric dynamics and chemistry and effects on mean values and long-term changes, to be submitted to ACPD. Rieder, H.E., Jancso, L.M., Staehelin, J., Maeder, J.A., Ribatet, Peter, T., and A.D., Davison (2010): Extreme events in total ozone over the northern mid-latitudes: A case study based on long-term data sets from 5 ground-based stations, in preparation. Staehelin, J., Kegel, R., and Harris, N. R.: Trend analysis of the homogenized total ozone series of Arosa (Switzerland), 1929-1996, J. Geophys. Res., 103(D7), 8389-8400, doi:10.1029/97JD03650, 1998a. 
Staehelin, J., Renaud, A., Bader, J., McPeters, R., Viatte, P., Hoegger, B., Bugnion, V., Giroud, M., and Schill, H.: Total ozone series at Arosa (Switzerland): Homogenization and data comparison, J. Geophys. Res., 103(D5), 5827-5842, doi:10.1029/97JD02402, 1998b.
Climate projection of synoptic patterns forming extremely high wind speed over the Barents Sea
NASA Astrophysics Data System (ADS)
Surkova, Galina; Krylov, Aleksey
2017-04-01
Extreme weather events are infrequent, but their consequences for human well-being may be hazardous. Such rare events are not always well simulated by climate models directly; it is sometimes more effective to analyze numerical projections of the large-scale synoptic situations that generate extreme weather. In mid-latitudes, for example, surface wind speed depends mainly on the sea level pressure (SLP) field: its configuration and horizontal pressure gradient. This idea was applied to the analysis of extreme wind speed events over the Barents Sea. A calendar of high surface wind speeds V (10 m above the surface) was prepared for events with V exceeding the 99th percentile in the central part of the Barents Sea. The probability distribution of V was analyzed using ERA-Interim reanalysis data (6-hourly, 0.75x0.75 degrees of latitude and longitude) for the period 1981-2010, yielding 240 storm wind days. The sea level pressure field over the sea and surrounding area was extracted for each storm wind event. For the future climate (scenario RCP8.5), SLP projections from CMIP5 numerical experiments were used: results from more than 20 climate models (2006-2100) over the Barents Sea were correlated with present-day storm SLP fields. Our calculations show a positive tendency in the annual frequency of storm SLP patterns over the Barents Sea by the end of the 21st century.
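The storm calendar construction, flagging time steps whose wind speed exceeds the 99th percentile, can be sketched in Python with a synthetic series. This uses a nearest-rank quantile and invented data, not the study's exact procedure.

```python
def percentile_threshold(values, q=0.99):
    """Empirical q-quantile by the nearest-rank rule on the sorted sample."""
    s = sorted(values)
    k = max(0, min(len(s) - 1, int(round(q * len(s))) - 1))
    return s[k]

# Synthetic 6-hourly wind series: 1000 distinct values from 0.0 to 99.9
# (a stand-in for the reanalysis time series at one grid point).
winds = [((i * 37) % 1000) / 10.0 for i in range(1000)]
threshold = percentile_threshold(winds, 0.99)
storm_steps = [i for i, v in enumerate(winds) if v > threshold]
```

By construction about 1% of time steps exceed the threshold; the SLP fields at those steps form the storm composite.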
Assessing changes in failure probability of dams in a changing climate
NASA Astrophysics Data System (ADS)
Mallakpour, I.; AghaKouchak, A.; Moftakhari, H.; Ragno, E.
2017-12-01
Dams are crucial infrastructure and provide resilience against hydrometeorological extremes (e.g., droughts and floods). In 2017, California experienced a series of flooding events that terminated a 5-year drought and led to incidents such as the structural failure of Oroville Dam's spillway. Because of the large socioeconomic repercussions of such incidents, it is of paramount importance to evaluate dam failure risks associated with projected shifts in the streamflow regime. This becomes even more important because the current procedures for design of hydraulic structures (e.g., dams, bridges, spillways) are based on the so-called stationarity assumption, yet changes in climate are anticipated to alter the statistics of river flow (e.g., more extreme floods), possibly increasing the failure probability of already aging dams. Here, we examine changes in discharge under two representative concentration pathways (RCPs): RCP4.5 and RCP8.5. In this study, we used routed daily streamflow data from ten global climate models (GCMs) to investigate possible climate-induced changes in streamflow in northern California. Our results show that while the average flow does not change significantly, extreme floods are projected to increase in the future. Using extreme value theory, we estimate changes in the return periods of 50-year and 100-year floods between the current and future climates. Finally, we use the historical and future return periods to quantify changes in the failure probability of dams in a warming climate.
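The link between return periods and risk over a planning horizon follows from the stationary textbook relation, which is exactly the assumption the abstract argues climate change may invalidate. A minimal Python sketch:

```python
def exceedance_probability(return_period_years, horizon_years):
    """Probability of at least one exceedance of the T-year event within a
    horizon of n years, assuming independent, identically distributed years:
    P = 1 - (1 - 1/T)**n."""
    return 1.0 - (1.0 - 1.0 / return_period_years) ** horizon_years

# If intensification turns today's 100-year flood into a 50-year event,
# the chance of experiencing it within a 30-year horizon rises sharply.
p_now = exceedance_probability(100, 30)
p_future = exceedance_probability(50, 30)
```

A shrinking return period for the same discharge therefore translates directly into a higher lifetime failure probability for a structure designed under the old statistics.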
Individual differences in social information gathering revealed through Bayesian hierarchical models
Pearson, John M.; Watson, Karli K.; Klein, Jeffrey T.; Ebitz, R. Becket; Platt, Michael L.
2013-01-01
As studies of the neural circuits underlying choice expand to include more complicated behaviors, analysis of behaviors elicited in laboratory paradigms has grown increasingly difficult. Social behaviors present a particular challenge, since inter- and intra-individual variation are expected to play key roles. However, due to limitations on data collection, studies must often choose between pooling data across all subjects or using individual subjects' data in isolation. Hierarchical models mediate between these two extremes by modeling individual subjects as drawn from a population distribution, allowing the population at large to serve as prior information about individuals' behavior. Here, we apply this method to data collected across multiple experimental sessions from a set of rhesus macaques performing a social information valuation task. We show that, while the values of social images vary markedly between individuals and between experimental sessions for the same individual, individuals also differentially value particular categories of social images. Furthermore, we demonstrate covariance between values for image categories within individuals and find evidence suggesting that magnitudes of stimulus values tend to diminish over time. PMID:24062635
Decadal oscillations and extreme value distribution of river peak flows in the Meuse catchment
NASA Astrophysics Data System (ADS)
De Niel, Jan; Willems, Patrick
2017-04-01
In flood risk management, flood probabilities are often quantified through Generalized Pareto distributions of river peak flows. One of the main underlying assumptions is that all data points originate from one single underlying distribution (the i.i.d. assumption). However, this hypothesis, although generally assumed to hold for variables such as river peak flows, remains somewhat questionable: flooding might be caused by different hydrological and/or meteorological conditions. This study confirms findings from previous research by showing a clear link between atmospheric conditions and flooding for the Meuse river in The Netherlands: decadal oscillations of river peak flows can, at least partially, be attributed to the occurrence of westerly weather types. The study further proposes a method to take this correlation between atmospheric conditions and river peak flows into account when calibrating an extreme value distribution for river peak flows. Rather than calibrating one single distribution to the data and potentially violating the i.i.d. assumption, weather-type-dependent extreme value distributions are derived and composed. For the Meuse river in The Netherlands, this approach results in a more accurate extreme value distribution, especially with regard to extrapolations. Comparison of the proposed method with a traditional extreme value analysis and an alternative model-based approach for the same case study shows strong differences in the peak flow extrapolation. The design flood for a 1,250-year return period is estimated at 4,800 m3/s for the proposed method, compared with 3,450 m3/s for the traditional method and 3,900 m3/s in a previous study. The methods were validated against instrumental and documentary flood information from the past 500 years.
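Composing weather-type-dependent extreme value distributions amounts to mixing per-type tails weighted by their occurrence frequencies. A hedged Python sketch with invented Generalized Pareto parameters, not the calibrated Meuse values:

```python
import math

def gpd_survival(excess, sigma, xi):
    """P(X > threshold + excess | X > threshold) under a Generalized Pareto
    tail with scale sigma and shape xi."""
    if xi == 0.0:
        return math.exp(-excess / sigma)
    return max(0.0, 1.0 + xi * excess / sigma) ** (-1.0 / xi)

def mixture_survival(excess, components):
    """Exceedance probability under a weather-type mixture: each component
    is (weight, sigma, xi), with weights the relative frequency of that type."""
    return sum(w * gpd_survival(excess, sigma, xi)
               for w, sigma, xi in components)

# Frequent mild circulation (light tail) plus rare westerly storms (heavy
# tail): the rare type dominates far into the tail, which is exactly where
# a single pooled fit and the composed mixture diverge most.
types = [(0.8, 300.0, -0.1), (0.2, 500.0, 0.15)]
p_mixture = mixture_survival(2000.0, types)
p_single = gpd_survival(2000.0, 340.0, -0.05)
```

With these illustrative numbers the mixture assigns the 2000-unit excess nearly an order of magnitude more probability than the single fit, mirroring the gap between the 4,800 m3/s and 3,450 m3/s design-flood estimates in spirit, though not in the actual values.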
I know why you voted for Trump: (Over)inferring motives based on choice.
Barasz, Kate; Kim, Tami; Evangelidis, Ioannis
2018-05-10
People often speculate about why others make the choices they do. This paper investigates how such inferences are formed as a function of what is chosen. Specifically, when observers encounter someone else's choice (e.g., of political candidate), they use the chosen option's attribute values (e.g., a candidate's specific stance on a policy issue) to infer the importance of that attribute (e.g., the policy issue) to the decision-maker. Consequently, when a chosen option has an attribute whose value is extreme (e.g., an extreme policy stance), observers infer, sometimes incorrectly, that this attribute disproportionately motivated the decision-maker's choice. Seven studies demonstrate how observers use an attribute's value to infer its weight (the value-weight heuristic) and identify the role of perceived diagnosticity: more extreme attribute values give observers the subjective sense that they know more about a decision-maker's preferences, and in turn, increase the attribute's perceived importance. The paper explores how this heuristic can produce erroneous inferences and influence broader beliefs about decision-makers. Copyright © 2018 Elsevier B.V. All rights reserved.
The persistence of the large volumes in black holes
NASA Astrophysics Data System (ADS)
Ong, Yen Chin
2015-08-01
Classically, black holes admit maximal interior volumes that grow asymptotically linearly in time. We show that such volumes remain large when Hawking evaporation is taken into account. Even if a charged black hole approaches the extremal limit during this evolution, its volume continues to grow, although an exactly extremal black hole does not have a "large interior". We clarify this point and discuss the implications of our results for the information loss and firewall paradoxes.
Compound Extremes and Bunched Black (or Grouped Grey) Swans
NASA Astrophysics Data System (ADS)
Watkins, N. W.
2014-12-01
Observed "wild" natural fluctuations may differ substantially in their character. Some events may be genuinely unforeseen (and unforeseeable), as with Taleb's "black swans". These may occur singly, or may have their impact further magnified by being "bunched" in time. Some of the others may, however, be the rare extreme events from a light-tailed underlying distribution. Studying their occurrence may then be tractable with the methods of extreme value theory [e.g. Coles, 2001], suitably adapted to allow correlation if that is observed to be present. Yet others may belong to a third broad class, described in today's presentation [reviewed in Watkins, GRL Frontiers, 2013, doi: 10.1002/grl.50103]. Such "bursty" time series may show comparatively frequent high-amplitude events, and/or long-range correlations between successive values. The frequent large values due to the first of these effects, modelled in economics by Mandelbrot in 1963 using heavy-tailed probability distributions, can give rise to an "IPCC type I" burst composed of successive wild events. Conversely, long-range dependence, even in a light-tailed Gaussian model like Mandelbrot and van Ness' fractional Brownian motion, can integrate "mild" events into an extreme "IPCC type III" burst. I will show how a standard statistical time series model, linear fractional stable motion (LFSM), which descends from the two special cases advocated by Mandelbrot, allows these two effects to be varied independently, and will present results from a preliminary study of such bursts in LFSM. The consequences for burst scaling when low-frequency effects due to dissipation (FARIMA models) and multiplicative cascades (such as multifractals) are included will also be discussed, along with the physical assumptions and constraints associated with making a given choice of model.
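The two burst mechanisms contrasted above can be illustrated with simple stand-ins (not LFSM itself): an independent heavy-tailed series versus a strongly persistent light-tailed AR(1) process, with a "burst" measured as the largest cumulative excursion above a threshold. All parameters here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 100_000

# Stand-ins for the two mechanisms (not LFSM itself):
# heavy tails -> rare wild single events; persistence -> integrated mild events.
heavy = rng.standard_t(df=1.5, size=n)      # heavy-tailed, independent
phi = 0.99                                  # strongly persistent AR(1)
persistent = np.empty(n)
persistent[0] = 0.0
eps = rng.normal(size=n)
for i in range(1, n):
    persistent[i] = phi * persistent[i - 1] + eps[i]

def largest_burst(x, threshold):
    """Largest cumulative excursion above the threshold (resets on dips below)."""
    best = run = 0.0
    for v in x:
        run = run + (v - threshold) if v > threshold else 0.0
        best = max(best, run)
    return best

for name, x in [("heavy-tailed", heavy), ("persistent", persistent)]:
    z = (x - x.mean()) / x.std()
    print(f"{name}: max value {z.max():.1f} sigma, "
          f"largest burst {largest_burst(z, 1.0):.1f}")
```

The heavy-tailed series tends to produce its burst from one or two wild values, while the persistent series builds a comparably large burst from many consecutive mild exceedances, which is the distinction between the "type I" and "type III" bursts above.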
NASA Astrophysics Data System (ADS)
Jacobson, Heather R.; Keller, Stefan; Frebel, Anna; Casey, Andrew R.; Asplund, Martin; Bessell, Michael S.; Da Costa, Gary S.; Lind, Karin; Marino, Anna F.; Norris, John E.; Peña, José M.; Schmidt, Brian P.; Tisserand, Patrick; Walsh, Jennifer M.; Yong, David; Yu, Qinsi
2015-07-01
The SkyMapper Southern Sky Survey is carrying out a search for the most metal-poor stars in the Galaxy. It identifies candidates by way of its unique filter set, which allows for estimation of stellar atmospheric parameters. The set includes a narrow filter centered on the Ca ii K 3933 Å line, enabling a robust estimate of stellar metallicity. Promising candidates are then confirmed with spectroscopy. We present the analysis of Magellan Inamori Kyocera Echelle high-resolution spectroscopy of 122 metal-poor stars found by SkyMapper in the first two years of commissioning observations. Forty-one stars have [Fe/H] ≤ -3.0. Nine have [Fe/H] ≤ -3.5, with three at [Fe/H] ~ -4. A 1D LTE abundance analysis of the elements Li, C, Na, Mg, Al, Si, Ca, Sc, Ti, Cr, Mn, Co, Ni, Zn, Sr, Ba, and Eu shows these stars have [X/Fe] ratios typical of other halo stars. One star with low [X/Fe] values appears to be “Fe-enhanced,” while another star has an extremely large [Sr/Ba] ratio: > 2. Only one other star is known to have a comparable value. Seven stars are “CEMP-no” stars ([C/Fe] > 0.7, [Ba/Fe] < 0). Twenty-one stars exhibit mild r-process element enhancements (0.3 ≤ [Eu/Fe] < 1.0), while four stars have [Eu/Fe] ≥ 1.0. These results demonstrate the ability to identify extremely metal-poor stars from SkyMapper photometry, pointing to increased sample sizes and a better characterization of the metal-poor tail of the halo metallicity distribution function in the future. This paper includes data gathered with the 6.5 m Magellan Telescopes located at Las Campanas Observatory, Chile.
Batterman, Stuart; Su, Feng-Chiao; Li, Shi; Mukherjee, Bhramar; Jia, Chunrong
2015-01-01
INTRODUCTION Emission sources of volatile organic compounds (VOCs) are numerous and widespread in both indoor and outdoor environments. Concentrations of VOCs indoors typically exceed outdoor levels, and most people spend nearly 90% of their time indoors. Thus, indoor sources generally contribute the majority of VOC exposures for most people. VOC exposure has been associated with a wide range of acute and chronic health effects; for example, asthma, respiratory diseases, liver and kidney dysfunction, neurologic impairment, and cancer. Although exposures to most VOCs for most persons fall below health-based guidelines, and long-term trends show decreases in ambient emissions and concentrations, a subset of individuals experience much higher exposures that exceed guidelines. Thus, exposure to VOCs remains an important environmental health concern. The present understanding of VOC exposures is incomplete. With the exception of a few compounds, concentration and especially exposure data are limited, and, like other environmental data, VOC exposure data can show multiple modes, low and high extreme values, and sometimes a large portion of data below method detection limits (MDLs). Field data also show considerable spatial or interpersonal variability, and although evidence is limited, temporal variability seems high. These characteristics can complicate modeling and other analyses aimed at risk assessment, policy actions, and exposure management. In addition to these analytic and statistical issues, exposure typically occurs as a mixture, and mixture components may interact or jointly contribute to adverse effects. However, most pollutant regulations, guidelines, and studies remain focused on single compounds, and thus may underestimate cumulative exposures and risks arising from coexposures. In addition, the composition of VOC mixtures has not been thoroughly investigated, and mixture components show varying and complex dependencies. 
Finally, although many factors are known to affect VOC exposures, many personal, environmental, and socioeconomic determinants remain to be identified, and the significance and applicability of the determinants reported in the literature are uncertain. To help answer these unresolved questions and overcome limitations of previous analyses, this project used several novel and powerful statistical modeling and analysis techniques and two large data sets. The overall objectives of this project were (1) to identify and characterize exposure distributions (including extreme values), (2) evaluate mixtures (including dependencies), and (3) identify determinants of VOC exposure. METHODS VOC data were drawn from two large data sets: the Relationships of Indoor, Outdoor, and Personal Air (RIOPA) study (1999–2001) and the National Health and Nutrition Examination Survey (NHANES; 1999–2000). The RIOPA study used a convenience sample to collect outdoor, indoor, and personal exposure measurements in three cities (Elizabeth, NJ; Houston, TX; Los Angeles, CA). In each city, approximately 100 households with adults and children who did not smoke were sampled twice for 18 VOCs. In addition, information about 500 variables associated with exposure was collected. The NHANES used a nationally representative sample and included personal VOC measurements for 851 participants. NHANES sampled 10 VOCs in common with RIOPA. Both studies used similar sampling methods and study periods. Specific Aim 1 To estimate and model extreme value exposures, extreme value distribution models were fitted to the top 10% and 5% of VOC exposures. Health risks were estimated for individual VOCs and for three VOC mixtures. Simulated extreme value data sets, generated for each VOC and for fitted extreme value and lognormal distributions, were compared with measured concentrations (RIOPA observations) to evaluate each model’s goodness of fit. 
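The approach in Specific Aim 1, fitting an extreme value distribution to only the top fraction of exposures and comparing it against a lognormal fit of the full data, can be sketched as follows. This is a hedged illustration on synthetic data (not RIOPA measurements); the sample construction and the 10% tail cutoff are assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic "exposure" sample with a heavy upper tail (illustrative only;
# not RIOPA data): a lognormal bulk contaminated with Pareto draws.
conc = np.concatenate([
    rng.lognormal(mean=0.5, sigma=1.0, size=950),
    (rng.pareto(2.0, size=50) + 1) * 20.0,
])

top10 = np.sort(conc)[-len(conc) // 10:]        # top 10% of measurements

# Fit a 3-parameter GEV to the extreme subset, and a lognormal to all data
gev = stats.genextreme.fit(top10)
logn = stats.lognorm.fit(conc, floc=0)

# Probability of exceeding a high level under each model
level = np.quantile(conc, 0.99)
p_gev = stats.genextreme.sf(level, *gev) * 0.10   # tail model x tail fraction
p_logn = stats.lognorm.sf(level, *logn)
p_emp = np.mean(conc > level)

print(f"P(X > {level:.1f}): empirical={p_emp:.4f}, "
      f"GEV-tail={p_gev:.4f}, lognormal={p_logn:.4f}")
```

Comparing the fitted survival functions against the empirical exceedance frequency is one way to see the lognormal's tendency, reported above, to underestimate the level and likelihood of extreme values.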
Mixture distributions were fitted with the conventional finite mixture of normal distributions and the semi-parametric Dirichlet process mixture (DPM) of normal distributions for three individual VOCs (chloroform, 1,4-DCB, and styrene). Goodness of fit for these full distribution models was also evaluated using simulated data. Specific Aim 2 Mixtures in the RIOPA VOC data set were identified using positive matrix factorization (PMF) and by toxicologic mode of action. Dependency structures of a mixture’s components were examined using mixture fractions and were modeled using copulas, which address correlations of multiple components across their entire distributions. Five candidate copulas (Gaussian, t, Gumbel, Clayton, and Frank) were evaluated, and the performance of fitted models was evaluated using simulation and mixture fractions. Cumulative cancer risks were calculated for mixtures, and results from copulas and multivariate lognormal models were compared with risks based on RIOPA observations. Specific Aim 3 Exposure determinants were identified using stepwise regressions and linear mixed-effects models (LMMs). RESULTS Specific Aim 1 Extreme value exposures in RIOPA typically were best fitted by three-parameter generalized extreme value (GEV) distributions, and sometimes by the two-parameter Gumbel distribution. In contrast, lognormal distributions significantly underestimated both the level and likelihood of extreme values. Among the VOCs measured in RIOPA, 1,4-dichlorobenzene (1,4-DCB) was associated with the greatest cancer risks; for example, for the highest 10% of measurements of 1,4-DCB, all individuals had risk levels above 10−4, and 13% of all participants had risk levels above 10−2. Of the full-distribution models, the finite mixture of normal distributions with two to four clusters and the DPM of normal distributions had superior performance in comparison with the lognormal models. 
DPM distributions provided slightly better fit than the finite mixture distributions; the advantages of the DPM model were avoiding certain convergence issues associated with the finite mixture distributions, adaptively selecting the number of needed clusters, and providing uncertainty estimates. Although the results apply to the RIOPA data set, GEV distributions and mixture models appear more broadly applicable. These models can be used to simulate VOC distributions, which are neither normally nor lognormally distributed, and they accurately represent the highest exposures, which may have the greatest health significance. Specific Aim 2 Four VOC mixtures were identified and apportioned by PMF; they represented gasoline vapor, vehicle exhaust, chlorinated solvents and disinfection byproducts, and cleaning products and odorants. The last mixture (cleaning products and odorants) accounted for the largest fraction of an individual’s total exposure (average of 42% across RIOPA participants). Often, a single compound dominated a mixture but the mixture fractions were heterogeneous; that is, the fractions of the compounds changed with the concentration of the mixture. Three VOC mixtures were identified by toxicologic mode of action and represented VOCs associated with hematopoietic, liver, and renal tumors. Estimated lifetime cumulative cancer risks exceeded 10−3 for about 10% of RIOPA participants. The dependency structures of the VOC mixtures in the RIOPA data set fitted Gumbel (two mixtures) and t copulas (four mixtures). These copula types emphasize dependencies found in the upper and lower tails of a distribution. The copulas reproduced both risk predictions and exposure fractions with a high degree of accuracy and performed better than multivariate lognormal distributions. 
Specific Aim 3 In an analysis focused on the home environment and the outdoor (close to home) environment, home VOC concentrations dominated personal exposures (66% to 78% of the total exposure, depending on VOC); this was largely the result of the amount of time participants spent at home and the fact that indoor concentrations were much higher than outdoor concentrations for most VOCs. In a different analysis focused on the sources inside the home and outside (but close to the home), it was assumed that 100% of VOCs from outside sources would penetrate the home. Outdoor VOC sources accounted for 5% (d-limonene) to 81% (carbon tetrachloride [CTC]) of the total exposure. Personal exposure and indoor measurements had similar determinants depending on the VOC. Gasoline-related VOCs (e.g., benzene and methyl tert-butyl ether [MTBE]) were associated with city, residences with attached garages, pumping gas, wind speed, and home air exchange rate (AER). Odorant and cleaning-related VOCs (e.g., 1,4-DCB and chloroform) also were associated with city, and a residence’s AER, size, and family members showering. Dry-cleaning and industry-related VOCs (e.g., tetrachloroethylene [or perchloroethylene, PERC] and trichloroethylene [TCE]) were associated with city, type of water supply to the home, and visits to the dry cleaner. These and other relationships were significant; they explained from 10% to 40% of the variance in the measurements and are consistent with known emission sources and those reported in the literature. Outdoor concentrations of VOCs had only two determinants in common: city and wind speed. Overall, personal exposure was dominated by the home setting, although a large fraction of indoor VOC concentrations were due to outdoor sources. City of residence, personal activities, household characteristics, and meteorology were significant determinants. 
Concentrations in RIOPA were considerably lower than levels in the nationally representative NHANES for all VOCs except MTBE and 1,4-DCB. Differences between RIOPA and NHANES results can be explained by contrasts between the sampling designs and staging in the two studies, and by differences in the demographics, smoking, employment, occupations, and home locations. A portion of these differences is due to the nature of the convenience (RIOPA) and representative (NHANES) sampling strategies used in the two studies. CONCLUSIONS Accurate models for exposure data, which can feature extreme values, multiple modes, data below the MDL, heterogeneous interpollutant dependency structures, and other complex characteristics, are needed to estimate exposures and risks and to develop control and management guidelines and policies. Conventional and novel statistical methods were applied to data drawn from two large studies to understand the nature and significance of VOC exposures. Both extreme value distributions and mixture models were found to provide excellent fit to single VOC compounds (univariate distributions), and copulas may be the method of choice for VOC mixtures (multivariate distributions), especially for the highest exposures, which fit parametric models poorly and which may represent the greatest health risk. The identification of exposure determinants, including the influence of both certain activities (e.g., pumping gas) and environments (e.g., residences), provides information that can be used to manage and reduce exposures. The results obtained using the RIOPA data set add to our understanding of VOC exposures, and further investigations using a more representative population and a wider suite of VOCs are suggested to extend and generalize results. PMID:25145040
Yang, Wu-Bin; Niu, He-Cai; Sun, Wei-Dong; Shan, Qiang; Zheng, Yong-Fei; Li, Ning-Bo; Li, Cong-Ying; Arndt, Nicholas T.; Xu, Xing; Jiang, Yu-Hang; Yu, Xue-Yuan
2013-01-01
The Cretaceous represents one of the hottest greenhouse periods in the Earth's history, but some recent studies suggest that small ice caps may have been present in non-polar regions during certain periods of the Early Cretaceous. Here we report extremely negative δ18O values of −18.12‰ to −13.19‰ for early Aptian hydrothermal zircon from an A-type granite at Baerzhe in northeastern China. Given that A-type granite is anhydrous and that magmatic zircon of the Baerzhe granite has δ18O values close to mantle values, the extremely negative δ18O values for hydrothermal zircon are attributed to the addition of meteoric water with extremely low δ18O, most likely transported by glaciers. Considering the paleoaltitude of the region, continental glaciation is suggested to have occurred in the early Aptian, indicating much larger temperature fluctuations than previously thought during the supergreenhouse Cretaceous. This may have had an impact on the evolution of major organisms in the Jehol Group during this period. PMID:24061068
Case Studies of Extreme Space Weather Effects on the New York State (NYS) Electric Power System
NASA Astrophysics Data System (ADS)
Chantale Damas, M.; Mohamed, Ahmed; Ngwira, Chigomyezo
2017-04-01
New York State (NYS) is home to one of the largest urban cities in the world, New York City (NYC). Understanding and mitigating the effects of extreme space weather events are important to reduce the vulnerabilities of the present NYS bulk power system, which includes NYC. Extreme space weather events perturb Earth's magnetic field and generate geo-electric fields that drive Geomagnetically Induced Currents (GICs) through transmission lines, transformers, and ground. GICs find paths to ground through transformer grounding wires, causing half-cycle saturation of the transformers' magnetic cores. This causes transformers to overheat, inject harmonics into the grid, and draw more reactive power than normal. Overheating, if sustained for a long duration, may lead to transformer failure or lifetime reduction. The presented work uses ground geomagnetic field perturbations generated by global simulations with the Space Weather Modeling Framework (SWMF). Computed values of simulated induced geo-electric fields at specific active ground-based INTERMAGNET magnetometer sites, combined with real NYS electricity transmission network data, are used to examine the vulnerabilities of the NYS power grid. As an urban city with a large population, NYC is especially vulnerable, and the results from this research can be used to model power systems for other urban cities.
Sueur, Jérôme; Mackie, David; Windmill, James F. C.
2011-01-01
To communicate at long range, animals have to produce intense but intelligible signals. This task might be difficult to achieve due to mechanical constraints, in particular relating to body size. Whilst the acoustic behaviour of large marine and terrestrial animals has been thoroughly studied, very little is known about the sound produced by small arthropods living in freshwater habitats. Here we analyse for the first time the calling song produced by the male of a small insect, the water boatman Micronecta scholtzi. The song is made of three distinct parts differing in their temporal and amplitude parameters, but not in their frequency content. Sound is produced at 78.9 (63.6–82.2) dB SPL rms re 2×10−5 Pa, with a peak at 99.2 (85.7–104.6) dB SPL re 2×10−5 Pa, estimated at a distance of one metre. This energy output is significant considering the small size of the insect. When scaled to body length and compared to 227 other acoustic species, the acoustic energy produced by M. scholtzi appears as an extreme value, outperforming marine and terrestrial mammal vocalisations. Such an extreme display may be interpreted as an exaggerated secondary sexual trait resulting from a runaway sexual selection without predation pressure. PMID:21698252
Rudas, Csilla; Surányi, Olivér; Yasseri, Taha; Török, János
2017-01-01
The Internet has provided us with great opportunities for large scale collaborative public good projects. Wikipedia is a predominant example of such projects where conflicts emerge and get resolved through bottom-up mechanisms leading to the emergence of the largest encyclopedia in human history. Disaccord arises whenever editors with different opinions try to produce an article reflecting a consensual view. The debates are mainly heated by editors with extreme views. Using a model of common value production, we show that the consensus can only be reached if groups with extreme views can actively take part in the discussion and if their views are also represented in the common outcome, at least temporarily. We show that banning problematic editors mostly hinders the consensus as it delays discussion and thus the whole consensus building process. To validate the model, relevant quantities are measured both in simulations and Wikipedia, which show satisfactory agreement. We also consider the role of direct communication between editors both in the model and in Wikipedia data (by analyzing the Wikipedia talk pages). While the model suggests that in certain conditions there is an optimal rate of “talking” vs “editing”, it correctly predicts that in the current settings of Wikipedia, more activity in talk pages is associated with more controversy. PMID:28323867
NASA Astrophysics Data System (ADS)
Bonanos, A. Z.; Stanek, K. Z.; Udalski, A.; Wyrzykowski, L.; Żebruń, K.; Kubiak, M.; Szymański, M. K.; Szewczyk, O.; Pietrzyński, G.; Soszyński, I.
2004-08-01
We present a high-precision I-band light curve for the Wolf-Rayet binary WR 20a, obtained as a subproject of the Optical Gravitational Lensing Experiment. Rauw et al. have recently presented spectroscopy for this system, strongly suggesting extremely large minimum masses of 70.7 ± 4.0 and 68.8 ± 3.8 Msolar for the component stars of the system, with the exact values depending strongly on the period of the system. We detect deep eclipses of about 0.4 mag in the light curve of WR 20a, confirming and refining the suspected period of P = 3.686 days and deriving an inclination angle of i = 74.5° ± 2.0°. Using these photometric data and the radial velocity data of Rauw et al., we derive the masses for the two components of WR 20a to be 83.0 ± 5.0 and 82.0 ± 5.0 Msolar. Therefore, WR 20a is confirmed to consist of two extremely massive stars and to be the most massive binary known with an accurate mass determination. Based on observations obtained with the 1.3 m Warsaw telescope at Las Campanas Observatory, which is operated by the Carnegie Institute of Washington.
A new framework for estimating return levels using regional frequency analysis
NASA Astrophysics Data System (ADS)
Winter, Hugo; Bernardara, Pietro; Clegg, Georgina
2017-04-01
We propose a new framework for incorporating more spatial and temporal information into the estimation of extreme return levels. Currently, most studies use extreme value models applied to data from a single site; an approach that is statistically inefficient and leads to return level estimates that are less physically realistic. We aim to highlight the benefits that could be obtained by using methodology based upon regional frequency analysis as opposed to classic single-site extreme value analysis. This motivates a shift in thinking, which permits the evaluation of local and regional effects and makes use of the wide variety of data that are now available at high temporal and spatial resolutions. The recent winter storms over the UK during the winters of 2013-14 and 2015-16, which have caused wide-ranging disruption and damaged important infrastructure, provide the main motivation for the current work. One of the most impactful natural hazards is flooding, which is often initiated by extreme precipitation. In this presentation, we focus on extreme rainfall, but shall discuss other meteorological variables alongside potentially damaging hazard combinations. To understand the risks posed by extreme precipitation, we need reliable statistical models which can be used to estimate quantities such as the T-year return level, i.e. the level which is expected to be exceeded once every T years. Extreme value theory provides the main collection of statistical models that can be used to estimate the risks posed by extreme precipitation events. Broadly, at a single site, a statistical model is fitted to exceedances of a high threshold and the model is used to extrapolate to levels beyond the range of the observed data. However, when we have data at many sites over a spatial domain, fitting a separate model for each site makes little sense, and it would be better if we could incorporate all this information to improve the reliability of return level estimates. 
Here, we use the regional frequency analysis approach to define homogeneous regions which are affected by the same storms. Extreme value models are then fitted to the data pooled from across a region. We find that this approach leads to more spatially consistent return level estimates with reduced uncertainty bounds.
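The single-site versus pooled-region contrast described above can be sketched as follows: standardize each site's annual maxima by a site-specific index value, pool the standardized data across the homogeneous region, and fit one GEV growth curve. This is a minimal illustration on synthetic data with scipy, not the authors' method in detail; the number of sites, record lengths, and index choice (the site mean) are assumptions.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(1)

# Synthetic annual rainfall maxima at 5 hypothetical sites in one
# homogeneous region; each site shares a growth curve but has its own scale.
site_index = rng.uniform(40, 60, size=5)             # per-site "index" (mm)
maxima = {i: s * genextreme.rvs(c=-0.1, loc=1.0, scale=0.2,
                                size=40, random_state=rng)
          for i, s in enumerate(site_index)}

# Single-site approach: fit each site's 40 values alone
single = {i: genextreme.fit(x) for i, x in maxima.items()}

# Regional approach: standardize by the site mean, pool, fit once
pooled = np.concatenate([x / x.mean() for x in maxima.values()])
regional_params = genextreme.fit(pooled)

def return_level(T, params, index=1.0):
    """T-year return level: the 1 - 1/T quantile of the fitted GEV."""
    return index * genextreme.ppf(1 - 1 / T, *params)

T = 100
regional = [return_level(T, regional_params, maxima[i].mean()) for i in maxima]
local = [return_level(T, single[i]) for i in maxima]
print("100-year levels (regional):", np.round(regional, 1))
print("100-year levels (single-site):", np.round(local, 1))
```

The pooled fit uses 200 standardized values rather than 40 per site, which is the statistical-efficiency gain behind the reduced uncertainty bounds reported above.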
Uncertainty estimation of water levels for the Mitch flood event in Tegucigalpa
NASA Astrophysics Data System (ADS)
Fuentes Andino, D. C.; Halldin, S.; Lundin, L.; Xu, C.
2012-12-01
Hurricane Mitch in 1998 left a devastating flood in Tegucigalpa, the capital city of Honduras. Simulation of elevated water surfaces provides a good way to understand the hydraulic mechanism of large flood events. In this study the one-dimensional HEC-RAS model for steady flow conditions, together with the two-dimensional Lisflood-fp model, was used to estimate the water level for the Mitch event in the river reaches at Tegucigalpa. Parameter uncertainty of the models was investigated using the generalized likelihood uncertainty estimation (GLUE) framework. Because of the extremely large magnitude of the Mitch flood, no hydrometric measurements were taken during the event. However, post-event indirect measurements of discharge and observed water levels were obtained in previous works by JICA and USGS. To overcome the lack of direct hydrometric measurement data, uncertainty in the discharge was estimated. Both models constrained the value for channel roughness well, though more dispersion resulted for the floodplain value. Analysis of the data interaction showed that there was a tradeoff between discharge at the outlet and floodplain roughness for the 1D model. The estimated discharge range at the outlet of the study area encompassed the value indirectly estimated by JICA; however, the indirect method used by the USGS overestimated the value. If behavioral parameter sets can reproduce water surface levels well for past events such as Mitch, more reliable predictions for future events can be expected. The results acquired in this research will provide guidelines to deal with the problem of modeling past floods when no direct data were measured during the event, and to predict future large events taking uncertainty into account. The obtained range of the uncertain flood extension will be an outcome useful for decision makers.
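The GLUE framework mentioned above can be sketched in a few lines: sample the uncertain parameters from prior ranges, run the model for each sample, score each run with an informal likelihood against observations, and keep the "behavioral" sets above a threshold. The stage model below is a toy Manning-equation stand-in for HEC-RAS/Lisflood-fp, and all numeric values are assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)

def water_level(n_channel, discharge=2000.0, width=80.0, slope=0.002):
    """Toy stage model via Manning's equation for a wide rectangular channel,
    h = (n Q / (w sqrt(S)))^(3/5); a stand-in for the hydraulic models."""
    return (n_channel * discharge / (width * np.sqrt(slope))) ** 0.6

# Pseudo-observations: "true" roughness 0.035 plus measurement noise
observed = water_level(0.035) + rng.normal(0, 0.05, size=10)

# GLUE step 1: Monte Carlo sample of the uncertain roughness parameter
n_samples = rng.uniform(0.01, 0.10, size=5000)
sim = water_level(n_samples)

# GLUE step 2: informal likelihood (inverse mean squared error);
# behavioral sets are those above a likelihood threshold
errors = sim[:, None] - observed[None, :]
likelihood = 1.0 / (errors ** 2).mean(axis=1)
behavioral = n_samples[likelihood > np.quantile(likelihood, 0.9)]

print(f"behavioral roughness range: {behavioral.min():.3f}-{behavioral.max():.3f}")
```

The spread of the behavioral parameter sets (and of the water levels they simulate) is what carries the uncertainty bounds into the flood-extent estimates described above.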
NASA Astrophysics Data System (ADS)
Rasmussen, Roy; Ikeda, Kyoko; Liu, Changhai; Gutmann, Ethan; Gochis, David
2016-04-01
Modeling of extreme weather events often requires very finely resolved treatment of atmospheric circulation structures in order to produce and localize the large moisture fluxes that result in extreme precipitation. This is particularly true for cool-season orographic precipitation processes, where the representation of the landform can significantly impact vertical velocity profiles and cloud moisture entrainment rates. This study presents results from a high-resolution regional climate modeling study of the Colorado Headwaters region using an updated version of the Weather Research and Forecasting (WRF) model run at 4 km horizontal resolution and a hydrological extension package called WRF-Hydro. Previous work has shown that the WRF modeling system can produce credible depictions of winter orographic precipitation over the Colorado Rockies if run at horizontal resolutions < 6 km. Here we present results from a detailed study of an extreme springtime snowfall event that occurred along the Colorado Front Range in March 2003. Results on the impact of warming on total precipitation, snow-rain partitioning, and surface hydrological fluxes (evapotranspiration and runoff) will be discussed in the context of how potential changes in temperature impact the amount of precipitation, the phase of precipitation (rain vs. snow), and the timing and amplitude of streamflow responses. Using the Pseudo Global Warming technique, the results show that intense precipitation rates significantly increased during the event and that a significant fraction of the snowfall converted to rain, amplifying the runoff response from one in which runoff is produced gradually to one in which runoff is rapidly translated into streamflow values that approach significant flooding risk. Results from a new, CONUS-scale high-resolution climate simulation of extreme events in a current and future climate will be presented as time permits.
Historical trends and extremes in boreal Alaska river basins
Bennett, Katrina E.; Cannon, Alex J.; Hinzman, Larry
2015-05-12
Climate change will shift the frequency, intensity, duration and persistence of extreme hydroclimate events and will have particularly disastrous consequences in vulnerable systems such as the warm, permafrost-dominated Interior region of boreal Alaska. This work focuses on recent research results from nonparametric trend and nonstationary generalized extreme value (GEV) analyses at eight Interior Alaskan river basins for the past 50/60 years (1954/64–2013). Trend analysis of maximum and minimum streamflow indicates a strong (>+50%) and statistically significant increase in 11-day flow events during the late fall/winter and during the snowmelt period (late April/mid-May), followed by a significant decrease in the 11-day flow events during the post-snowmelt period (late May and into the summer). The April–May–June seasonal trends show significant decreases in maximum streamflow for snowmelt-dominated systems (<–50%) and glacially influenced basins (–24% to –33%). Annual maximum streamflow trends indicate that most systems are experiencing declines, while minimum flow trends are largely increasing. Nonstationary GEV analysis identifies time-dependent changes in the distribution of spring extremes for snowmelt-dominated and glacially dominated systems. Spring temperature influences the glacial and high-elevation snowmelt systems, while winter precipitation drives changes in the snowmelt-dominated basins. The Pacific Decadal Oscillation was associated with changes occurring in snowmelt-dominated systems, and the Arctic Oscillation was linked to one lake-dominated basin, with half of the basins exhibiting no change in response to climate variability. The paper indicates that broad-scale studies examining trend and direction of change should employ multiple methods across various scales and consider regime-dependent shifts to identify and understand changes in extreme streamflow within boreal forested watersheds of Alaska.
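The nonparametric trend testing referred to above is commonly done with the Mann-Kendall test. A minimal pure-Python sketch (our own illustration, not the authors' code, and omitting the tie correction for brevity):

```python
import math

def mann_kendall(x):
    """Mann-Kendall trend test: returns the S statistic and the
    normal-approximation Z score (no tie correction, for brevity)."""
    n = len(x)
    # S counts concordant minus discordant pairs over all i < j
    s = sum(
        (x[j] > x[i]) - (x[j] < x[i])
        for i in range(n - 1)
        for j in range(i + 1, n)
    )
    # Variance of S under the null hypothesis of no trend
    var_s = n * (n - 1) * (2 * n + 5) / 18
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z
```

For a strictly monotonic series of length n, S reaches its extreme value ±n(n−1)/2, and |Z| > 1.96 indicates a significant trend at the 5% level.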
Research in Stochastic Processes
1988-10-10
To appear in Proceedings Volume, Oberwolfach Conf. on Extreme Value Theory, Ed. J. Hüsler and R. Reiss, Springer. 4. M. R. Leadbetter. The exceedance... Hsing, J. Hüsler and M. R. Leadbetter, On the exceedance point process for a stationary sequence, Probab. Theory Relat. Fields, 20, 1988, 97-112. Z. J... Oberwolfach Conf. on Extreme Value Theory, J. Hüsler and R. Reiss, eds., Springer, to appear. V. Mandrekar, On a limit theorem and invariance
2009-03-01
transition fatigue regimes; however, microplasticity (i.e., heterogeneous plasticity at the scale of the microstructure) is relevant to understanding fatigue... and Socie [57] considered the effect of microplastic... Microstructure-Sensitive Extreme Value Probabilities for High Cycle Fatigue of Ni-Base... considers the local stress state as affected by intergranular interactions and microplasticity. For the calculations given below, the volumes over which
NASA Astrophysics Data System (ADS)
Hasan, Husna; Salam, Norfatin; Kassim, Suraiya
2013-04-01
Extreme temperatures at several stations in Malaysia are modeled by fitting the annual maxima to the Generalized Extreme Value (GEV) distribution. The Augmented Dickey-Fuller (ADF) and Phillips-Perron (PP) tests are used to detect stochastic trends among the stations. The Mann-Kendall (MK) test suggests a non-stationary model. Three models are considered for stations with a trend, and the likelihood ratio test is used to determine the best-fitting model. The results show that the Subang and Bayan Lepas stations favour a model with a linear trend in the location parameter, while the Kota Kinabalu and Sibu stations are best fitted by a model with a trend in the logarithm of the scale parameter. The return level, i.e. the level of events (maximum temperature) expected to be exceeded once, on average, in a given number of years, is also obtained.
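A return level of the kind described can be sketched for the stationary Gumbel case (the GEV with zero shape parameter), fitted here by the method of moments rather than maximum likelihood for brevity. The sample values below are illustrative only, not the Malaysian station data:

```python
import math
import statistics

EULER_GAMMA = 0.5772156649015329

def fit_gumbel(maxima):
    """Method-of-moments fit of the Gumbel distribution
    (GEV with shape xi = 0) to a sample of annual maxima."""
    mean = statistics.mean(maxima)
    std = statistics.stdev(maxima)
    scale = std * math.sqrt(6) / math.pi
    loc = mean - EULER_GAMMA * scale
    return loc, scale

def return_level(loc, scale, T):
    """Level expected to be exceeded once, on average, every T years."""
    return loc - scale * math.log(-math.log(1.0 - 1.0 / T))

# Illustrative annual maximum temperatures (deg C), not station data
maxima = [35.2, 36.1, 34.8, 37.0, 35.9, 36.5, 34.9, 36.8, 35.5, 36.2]
loc, scale = fit_gumbel(maxima)
print(round(return_level(loc, scale, 100), 1))
```

A non-stationary variant of the kind the paper compares would replace the constant `loc` with a linear function of time and fit by maximum likelihood.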
Wind extremes in the North Sea basin under climate change: an ensemble study of 12 CMIP5 GCMs
NASA Astrophysics Data System (ADS)
de Winter, R.; Ruessink, G.; Sterl, A.
2012-12-01
Coastal safety may be influenced by climate change, as changes in extreme surge levels and wave extremes may increase the vulnerability of dunes and other coastal defenses. In the North Sea, an area already prone to severe flooding, these high surge levels and waves are generated by severe wind speeds during storm events. As a result of the geometry of the North Sea, not only the maximum wind speed is relevant, but also the wind direction. Analyzing changes in a changing climate implies that several uncertainties need to be taken into account. First, there is the uncertainty in climate experiments, which represent possible developments of greenhouse gas emissions. Second, there is uncertainty between the climate models that are used to analyze the effect of different climate experiments. The third uncertainty is the natural variability of the climate. When this system variability is large, small trends will be difficult to detect. The natural variability results in statistical uncertainty, especially for events with high return values. We addressed the first two types of uncertainty for extreme wind conditions in the North Sea using 12 CMIP5 GCMs. To evaluate the differences between the climate experiments, two experiments (RCP4.5 and RCP8.5) covering 2050-2100 are compared with historical runs covering 1950-2000. RCP4.5 is considered a middle climate experiment and RCP8.5 represents high-end climate scenarios. The spread of the projections of the 12 GCMs for a given scenario illustrates model uncertainty. We focus on the North Sea basin because changes in wind conditions could have a large impact on the safety of the densely populated North Sea coast, an area that already has a high exposure to flooding. Our results show that, consistent with ERA-Interim results, the annual maximum wind speed in the historical run exhibits large interannual variability. 
For the North Sea, the annual maximum wind speed is not projected to change in either RCP4.5 or RCP8.5. In fact, the differences among the 12 GCMs are larger than the differences between the experiments. Furthermore, our results show that the variation in the direction of the annual maximum wind speed is large, which precludes a firm statement on climate-change-induced changes in these directions. Nonetheless, most models indicate a decrease in annual maximum wind speed from south-easterly directions and an increase from south-westerly and westerly directions. This might be caused by a poleward shift of the storm track. The amount of wind from north-west and north-north-west, the wind directions responsible for the development of extreme storm surges in the southern part of the North Sea, is not projected to change. However, North Sea coasts that have the longest fetch for westerly directions, e.g. the German Bight, may more often encounter high storm surge levels and extreme waves if the annual maximum wind does indeed blow more often from westerly directions.
Extreme-value statistics reveal rare failure-critical defects in additive manufacturing
Boyce, Brad L.; Salzbrenner, Bradley C.; Rodelas, Jeffrey M.; ...
2017-04-21
Additive manufacturing enables the rapid, cost-effective production of large populations of material test coupons such as tensile bars. By adopting streamlined test methods, including 'drop-in' grips and non-contact extensometry, testing these large populations becomes more efficient. Unlike hardness tests, the tensile test provides a direct measure of yield strength, flow properties, and ductility, which can be directly incorporated into solid mechanics simulations. In the present work, over 1000 nominally identical tensile tests were used to explore the effect of process variability on the mechanical property distributions of a precipitation-hardened stainless steel, 17-4PH, produced by a laser powder bed fusion process, also known as direct metal laser sintering. With this large dataset, rare defects are revealed that affect only ~2% of the population, stemming from a single build lot of material. These rare defects caused a substantial loss in ductility and were associated with an interconnected network of porosity.
Surface atmospheric extremes (Launch and transportation areas)
NASA Technical Reports Server (NTRS)
1972-01-01
The effects of extreme values of surface and low-altitude atmospheric parameters on space vehicle design, tests, and operations are discussed. Atmospheric extremes from the surface to 150 meters for geographic locations of interest to NASA are given. Thermal parameters (temperature and solar radiation), humidity, pressure, and atmospheric electricity (lightning and static) are presented. Weather charts and tables are included.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aristov, Andrey I.; Kabashin, Andrei V., E-mail: kabashin@lp3.univ-mrs.fr; Zywietz, Urs
2014-02-17
By using methods of laser-induced transfer combined with nanoparticle lithography, we design and fabricate large-area gold nanoparticle-based metamaterial arrays exhibiting extreme Heaviside-like phase jumps in reflected light due to a strong diffractive coupling of localized plasmons. When employed in sensing schemes, these phase singularities provide a sensitivity of 5 × 10^4 deg. of phase shift per refractive index unit change, which is comparable with the best values reported for plasmonic biosensors. The implementation of sensor platforms on the basis of such metamaterial arrays promises a drastic improvement in the sensitivity and cost efficiency of plasmonic biosensing devices.
Interval-Valued Rank in Finite Ordered Sets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Joslyn, Cliff; Pogel, Alex; Purvine, Emilie
We consider the concept of rank as a measure of the vertical levels and positions of elements of partially ordered sets (posets). We are motivated by the need for algorithmic measures on large, real-world hierarchically-structured data objects like the semantic hierarchies of ontological databases. These rarely satisfy the strong property of gradedness, which is required for traditional rank functions to exist. Representing such semantic hierarchies as finite, bounded posets, we recognize the duality of ordered structures to motivate rank functions which respect verticality both from the bottom and from the top. Our rank functions are thus interval-valued, and always exist, even for non-graded posets, providing order homomorphisms to an interval order on the interval-valued ranks. The concept of rank width arises naturally, allowing us to identify the poset region with point-valued width as its longest graded portion (which we call the "spindle"). A standard interval rank function is naturally motivated both in terms of its extremality and on pragmatic grounds. Its properties are examined, including the relationship to traditional grading and rank functions, and methods to assess comparisons of standard interval-valued ranks.
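One plausible reading of interval-valued rank can be sketched as follows: rank each element both from the bottom (longest chain up to it) and from the top (poset height minus the longest chain above it). This is our own illustration; the authors' precise definition may differ in detail:

```python
from functools import lru_cache

def interval_ranks(covers):
    """Interval-valued rank sketch for a finite poset given as cover
    pairs (a, b), meaning a < b with nothing in between.  Each element
    gets the interval [up(x), H - down(x)], where up(x) is the longest
    chain below x, down(x) the longest chain above x, and H the poset
    height.  The two endpoints coincide exactly on graded posets."""
    elems = {x for pair in covers for x in pair}
    above = {x: [] for x in elems}
    below = {x: [] for x in elems}
    for a, b in covers:
        above[a].append(b)
        below[b].append(a)

    @lru_cache(maxsize=None)
    def up(x):     # longest chain from a minimal element up to x
        return 0 if not below[x] else 1 + max(up(y) for y in below[x])

    @lru_cache(maxsize=None)
    def down(x):   # longest chain from x up to a maximal element
        return 0 if not above[x] else 1 + max(down(y) for y in above[x])

    height = max(up(x) + down(x) for x in elems)
    return {x: (up(x), height - down(x)) for x in elems}
```

On the non-graded poset with chains 0 < a < b < 1 and 0 < c < 1, every element on the long chain gets a point-valued rank (the "spindle"), while c gets the genuine interval (1, 2).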
Extreme Events: low and high total ozone over Arosa, Switzerland
NASA Astrophysics Data System (ADS)
Rieder, H. E.; Staehelin, J.; Maeder, J. A.; Ribatet, M.; Stübi, R.; Weihs, P.; Holawe, F.; Peter, T.; Davison, A. C.
2009-04-01
The frequency distribution of days with extreme low (termed ELOs) and extreme high (termed EHOs) total ozone is analyzed for the world's longest total ozone record (Arosa, Switzerland; for details see Staehelin et al., 1998a,b) with new tools from extreme value theory (e.g. Coles, 2001; Ribatet, 2007). A heavy-tail-focused approach is used through the fitting of the Generalized Pareto Distribution (GPD) to the Arosa time series. Asymptotic arguments (Pickands, 1975) justify the use of the GPD for modeling exceedances over a high (or below a low) enough threshold (Coles, 2001). The analysis shows that the GPD is appropriate for modeling the frequency distribution of total ozone above or below a mathematically well-defined threshold. While previous studies focused on so-called ozone mini-holes and mini-highs (e.g. Bojkov and Balis, 2001; Koch et al., 2005), this study is the first to present a mathematical description of extreme events in low and high total ozone for a northern mid-latitude site (Rieder et al., 2009). The results show (a) an increase in days with extreme low total ozone (ELOs) and (b) a decrease in days with extreme high total ozone (EHOs) during the last decades, (c) that the general trend in total ozone is strongly determined by these extreme events, and (d) that fitting the GPD is an appropriate method for estimating the frequency distribution of so-called ozone mini-holes. Furthermore, this concept allows one to separate the effect of Arctic ozone depletion from that of in situ mid-latitude ozone loss. As shown by this study, ELOs and EHOs have a strong influence on mean values of total ozone, and the "extremes concept" could further be used for the validation of Chemistry-Climate Models (CCMs) within the scientific community. References: Bojkov, R. D., and Balis, D. S.: Characteristics of episodes with extremely low ozone values in the northern middle latitudes 1975-2000, Ann. Geophys., 19, 797-807, 2001. 
Coles, S.: An Introduction to Statistical Modeling of Extreme Values, Springer Series in Statistics, ISBN 1852334592, Springer, Berlin, 2001. Koch, G., Wernli, H., Schwierz, C., Staehelin, J., and Peter, T.: A composite study on the structure and formation of ozone mini-holes and mini-highs over central Europe, Geophys. Res. Lett., 32, L12810, doi:10.1029/2004GL022062, 2005. Pickands, J.: Statistical inference using extreme order statistics, Ann. Stat., 3, 119-131, 1975. Ribatet, M.: POT: Modelling peaks over a threshold, R News, 7, 34-36, 2007. Rieder, H. E., Staehelin, J., Maeder, J. A., Ribatet, M., Stübi, R., Weihs, P., Holawe, F., Peter, T., and Davison, A. C.: From ozone mini-holes and mini-highs towards extreme value theory: New insights from extreme events and non-stationarity, submitted to J. Geophys. Res., 2009. Staehelin, J., Kegel, R., and Harris, N. R.: Trend analysis of the homogenized total ozone series of Arosa (Switzerland), 1929-1996, J. Geophys. Res., 103(D7), 8389-8400, doi:10.1029/97JD03650, 1998a. Staehelin, J., Renaud, A., Bader, J., McPeters, R., Viatte, P., Hoegger, B., Bugnion, V., Giroud, M., and Schill, H.: Total ozone series at Arosa (Switzerland): Homogenization and data comparison, J. Geophys. Res., 103(D5), 5827-5842, doi:10.1029/97JD02402, 1998b.
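A peaks-over-threshold GPD fit of the kind used above can be sketched as follows. For brevity this uses method-of-moments estimators rather than the maximum likelihood fitting of the study, and the Arosa series itself is not reproduced:

```python
import statistics

def fit_gpd_moments(data, threshold):
    """Method-of-moments fit of the Generalized Pareto Distribution
    to exceedances over `threshold`: with m and v the mean and variance
    of the excesses, shape = 0.5*(1 - m^2/v), scale = 0.5*m*(m^2/v + 1).
    Returns (shape, scale, number of exceedances)."""
    excesses = [x - threshold for x in data if x > threshold]
    m = statistics.mean(excesses)
    v = statistics.variance(excesses)
    ratio = m * m / v
    shape = 0.5 * (1.0 - ratio)       # xi < 0: bounded tail
    scale = 0.5 * m * (ratio + 1.0)   # sigma
    return shape, scale, len(excesses)
```

For modeling extreme *low* ozone days, the same machinery applies to exceedances below a low threshold after negating the series.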
A unified econophysics explanation for the power-law exponents of stock market activity
NASA Astrophysics Data System (ADS)
Gabaix, Xavier; Gopikrishnan, Parameswaran; Plerou, Vasiliki; Stanley, Eugene
2007-08-01
We survey a theory (first sketched in Nature in 2003, then fleshed out in the Quarterly Journal of Economics in 2006) of the economic underpinnings of the fat-tailed distributions of a number of financial variables, such as returns and trading volume. Our theory posits that they have a common origin in the strategic trading behavior of very large financial institutions in a relatively illiquid market. We show how the fat-tailed distribution of fund sizes can indeed generate extreme returns and volumes, even in the absence of fundamental news. Moreover, we are able to replicate the individually different empirical values of the power-law exponents for each distribution: 3 for returns, 3/2 for volumes, 1 for the assets under management of large investors. Large investors moderate their trades to reduce their price impact; coupled with a concave price impact function, this leads to volumes being more fat-tailed than returns but less fat-tailed than fund sizes. The trades of large institutions also offer a unified explanation for apparently disconnected empirical regularities that are otherwise a challenge for economic theory.
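Tail exponents like the 3 and 3/2 quoted above are commonly estimated from data with the Hill estimator, sketched here as a minimal illustration (not the authors' estimation procedure; the sample in the test is synthetic):

```python
import math

def hill_estimator(data, k):
    """Hill estimator of the power-law tail exponent alpha from the
    k largest order statistics of a positive-valued sample."""
    xs = sorted(data, reverse=True)
    if not 0 < k < len(xs):
        raise ValueError("need 0 < k < sample size")
    # mean log-excess over the (k+1)-th largest observation
    log_excess = sum(math.log(xs[i] / xs[k]) for i in range(k)) / k
    return 1.0 / log_excess
```

In practice the estimate is plotted against k and read off a stable plateau, since the choice of k trades bias against variance.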
Quantifying uncertainties in wind energy assessment
NASA Astrophysics Data System (ADS)
Patlakas, Platon; Galanis, George; Kallos, George
2015-04-01
The constant rise of wind energy production and its subsequent penetration of global energy markets during the last decades has resulted in the selection of new sites that present various types of problems. Such problems arise from the variability and uncertainty of wind speed. The study of the lower and upper tails of the wind speed distribution can support the quantification of these uncertainties. Such approaches, focused on extreme wind conditions or on periods below the energy production threshold, are necessary for better management of operations. Towards this direction, different methodologies are presented for the credible evaluation of potentially non-frequent/extreme values of these environmental conditions. The approaches used take into consideration the structural design of the wind turbines according to their lifespan, turbine failures, the time needed for repairs, as well as the energy production distribution. In this work, a multi-parametric approach for studying extreme wind speed values is discussed, based on tools of Extreme Value Theory. In particular, the study focuses on extreme wind speed return periods and the persistence of no energy production, based on a 10-year hindcast dataset from a weather modeling system. More specifically, two methods (Annual Maxima and Peaks Over Threshold) were used for the estimation of extreme wind speeds and their recurrence intervals. Additionally, two different methodologies (intensity given duration and duration given intensity, both based on the Annual Maxima method) were applied to calculate the duration of extreme events, combined with their intensity as well as the event frequency. The obtained results show that the proposed approaches converge, at least in their main findings, for each case. It is also remarkable that, despite the moderate wind speed climate of the area, several consecutive days of no energy production are observed.
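The persistence of no energy production can be illustrated by extracting run lengths of wind speed outside the turbine's operating band. The cut-in and cut-out defaults below are generic illustrative values, not those of the study:

```python
def no_production_runs(wind_speeds, cut_in=3.5, cut_out=25.0):
    """Lengths of consecutive runs (e.g. in days) during which a turbine
    produces no energy: wind below cut-in or above cut-out speed (m/s).
    Threshold defaults are illustrative assumptions, not from the study."""
    runs, current = [], 0
    for w in wind_speeds:
        if w < cut_in or w > cut_out:
            current += 1      # extend the current no-production run
        elif current:
            runs.append(current)
            current = 0
    if current:
        runs.append(current)  # run extending to the end of the record
    return runs
```

Fitting an extreme value distribution to the annual maxima of these run lengths then yields return periods for prolonged no-production episodes.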
Springtime extreme moisture transport into the Arctic and its impact on sea ice concentration
NASA Astrophysics Data System (ADS)
Yang, Wenchang; Magnusdottir, Gudrun
2017-05-01
Recent studies suggest that springtime moisture transport into the Arctic can initiate sea ice melt that extends over a large area in the following summer and fall, which can help explain Arctic sea ice interannual variability. Yet the impact of individual moisture transport events, especially extreme ones, is unclear on synoptic to intraseasonal time scales, and this is the focus of the current study. Springtime extreme moisture transport into the Arctic, identified from a daily data set, is found to be dominant over Atlantic longitudes. Lag composite analysis shows that these extreme events are accompanied by a substantial sea ice concentration reduction over the Greenland-Barents-Kara Seas that lasts around a week. Surface air temperature also becomes anomalously high over these seas and anomalously cold to the west of Greenland as well as over the interior Eurasian continent. The blocking weather regime over the North Atlantic is mainly responsible for the extreme moisture transport, accounting for more than 60% of the total extreme days, while the negative North Atlantic Oscillation regime is hardly observed at all during the extreme transport days. These extreme moisture transport events appear to be preceded by eastward-propagating large-scale tropical convective forcing by as much as 2 weeks, though with great uncertainty due to a lack of statistical significance.
Optical phased array configuration for an extremely large telescope.
Meinel, Aden Baker; Meinel, Marjorie Pettit
2004-01-20
Extremely large telescopes are currently under consideration by several groups in several countries. Extrapolation of current technology up to 30 m indicates a cost of over $1 billion. Innovative concepts are being explored to find significant cost reductions. We explore the concept of an Optical Phased Array (OPA) telescope. Each element of the OPA is a separate Cassegrain telescope. Collimated beams from the array are sent via an associated set of delay lines to a central beam combiner. This array of small telescope elements offers the possibility of starting with a low-cost array of a few rings of elements, then adding structure and additional Cassegrain elements until the desired telescope diameter is attained. We address the salient features of such an extremely large telescope and its cost elements relative to more conventional options.
NASA Astrophysics Data System (ADS)
Huttenlau, M.; Stötter, J.; Stiefelmeyer, H.
2010-12-01
Within the last decades, serious flooding events occurred in many parts of Europe; in 2005 in particular, the Austrian Federal Province of Tyrol was seriously affected. These events in general, and the 2005 event in particular, have sensitised decision makers and the public. Besides discussions pertaining to protection goals and lessons learnt, the issue of the potential consequences of extreme and severe flooding events has been raised. In addition to the general interest of the public, decision makers in the insurance industry, public authorities, and responsible politicians are especially confronted with the question of the possible consequences of extreme events. Answers to this question are necessary for the implementation of appropriate preventive risk management strategies. Property and liability losses represent a large proportion of the direct tangible losses; these are of great interest to the insurance sector and can be understood as main indicators of the severity of potential events. The natural scientific-technical risk analysis concept provides a predefined and structured framework to analyse the quantities of affected elements at risk, their corresponding damage potentials, and the potential losses. Generally, this risk concept framework follows the process steps of hazard analysis, exposure analysis, and consequence analysis. In addition to the conventional hazard analysis, the potential number of endangered elements and their corresponding damage potentials were analysed and, thereupon, concrete losses were estimated, taking the specific vulnerability of the various individual elements at risk into consideration. The present flood risk analysis firstly estimates the general exposure of the risk indicators in the study area and secondly analyses the specific exposures and consequences of five extreme event scenarios. 
In order to precisely identify, localize, and characterize the relevant risk indicators of buildings, dwellings and inventory, vehicles, and individuals, a detailed geodatabase of the existing stock of elements and values was established at the single-object level. The localized and functionally differentiated stock of elements was assessed monetarily on the basis of derived representative mean insurance values. Thus, well-known difference factors between analyses of the stock of elements and values at local and at regional scale could be reduced considerably. The spatial join of the results of the hazard analysis with the stock of elements and values enables the identification and quantification of the elements at risk and their corresponding damage potential. Thereupon, Extreme Scenario Losses (ESL) were analysed under consideration of different vulnerability approaches, which describe the specific susceptibility of the individual elements. This results in scenario-specific ranges of ESL rather than single values. The exposure analysis of the general endangerment in Tyrol identifies (i) 105 330 individuals, (ii) 20 272 buildings and 50 157 dwellings with a corresponding damage potential of approx. EUR 20 bn., and (iii) 62 494 vehicles with a corresponding damage potential of EUR 1 bn. Depending on the individual extreme event scenario, the ESL solely to buildings and inventory vary between EUR 0.9-1.3 bn. for the scenario with the least ESL and EUR 2.2-2.5 bn. for the most serious scenarios. Correlating the private property losses to buildings and inventory with further direct tangible loss categories, on the basis of investigations after the 2005 event, results in potential direct tangible ESL of up to EUR 7.6 bn. 
Apart from the specific study results, a general finding is that, besides the further development of modelling capabilities and scenario concepts, the key to considerably decreasing the uncertainties of integral flood risk analyses is the development and implementation of more precise methods for determining the stock of elements and values and for evaluating, in a more differentiated manner, the vulnerability or susceptibility of affected structures to certain flood characteristics.
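The hazard → exposure → consequence chain described above can be sketched as a tiny loss aggregation over elements at risk. The depth-damage curve and asset values here are entirely hypothetical, for illustration only:

```python
def scenario_loss(elements, depth_by_id, vulnerability):
    """Sum direct tangible losses over elements at risk for one scenario:
    loss_i = value_i * vulnerability(flood depth at element i).
    `vulnerability` maps inundation depth (m) to a damage fraction."""
    total = 0.0
    for elem_id, value in elements.items():
        depth = depth_by_id.get(elem_id, 0.0)  # 0 = not inundated
        if depth > 0:
            total += value * vulnerability(depth)
    return total

# Hypothetical linear depth-damage curve saturating at 60% damage
def linear_vuln(depth, slope=0.3, cap=0.6):
    return min(cap, slope * depth)

# Hypothetical building values (EUR) and scenario inundation depths (m)
elements = {"b1": 400_000.0, "b2": 250_000.0, "b3": 300_000.0}
depths = {"b1": 0.5, "b2": 2.5}  # b3 is outside the flooded area
print(round(scenario_loss(elements, depths, linear_vuln), 2))
```

Running the same exposure data through several vulnerability curves yields the scenario-specific loss *ranges* the study reports, rather than single values.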
Flood protection diversification to reduce probabilities of extreme losses.
Zhou, Qian; Lambert, James H; Karvetski, Christopher W; Keisler, Jeffrey M; Linkov, Igor
2012-11-01
Recent catastrophic losses because of floods require developing resilient approaches to flood risk protection. This article assesses how diversification of a system of coastal protections might decrease the probabilities of extreme flood losses. The study compares the performance of portfolios each consisting of four types of flood protection assets in a large region of dike rings. A parametric analysis suggests conditions in which diversifications of the types of included flood protection assets decrease extreme flood losses. Increased return periods of extreme losses are associated with portfolios where the asset types have low correlations of economic risk. The effort highlights the importance of understanding correlations across asset types in planning for large-scale flood protection. It allows explicit integration of climate change scenarios in developing flood mitigation strategy. © 2012 Society for Risk Analysis.
NASA Astrophysics Data System (ADS)
Bárdossy, András; Pegram, Geoffrey
2017-01-01
The use of radar measurements for the space-time estimation of precipitation has for many decades been a central topic in hydro-meteorology. In this paper we are interested specifically in daily and sub-daily extreme values of precipitation at gauged or ungauged locations, which are important for design. The purpose of the paper is to develop a methodology to combine daily precipitation observations and radar measurements to estimate sub-daily extremes at point locations. Radar data corrected using precipitation-reflectivity relationships lead to biased estimations of extremes. Different possibilities of correcting systematic errors using the daily observations are investigated. Observed gauged daily amounts are interpolated to unsampled points and subsequently disaggregated using the sub-daily values obtained by the radar. Different corrections based on the spatial variability and the sub-daily entropy of scaled rainfall distributions are used to provide unbiased corrections of short-duration extremes. Additionally, a statistical procedure not based on a day-by-day matching correction is tested. In this last procedure, as we are only interested in rare extremes, low to medium values of rainfall depth were neglected, leaving a small number of L days of ranked daily maxima in each set per year, whose sum typically comprises about 50% of each annual rainfall total. The sum of these L day maxima is first interpolated using a Kriging procedure. Subsequently this sum is disaggregated to daily values using a nearest-neighbour procedure. The daily sums are then disaggregated by using the relative values of the biggest L radar-based days. Of course, the timings of radar and gauge maxima can be different, so the method presented here uses radar for disaggregating daily gauge totals down to 15 min intervals in order to extract the maxima of sub-hourly through to daily rainfall. 
The methodologies were tested in South Africa, where an S-band radar operated relatively continuously at Bethlehem from 1998 to 2003, whose scan at 1.5 km above ground [CAPPI] overlapped a dense (10 km spacing) set of 45 pluviometers recording in the same 6-year period. This valuable set of data was obtained from each of 37 selected radar pixels [1 km square in plan] which contained a pluviometer not masked out by the radar footprint. The pluviometer data were also aggregated to daily totals for the same purpose. The extremes obtained using the disaggregation methods were compared to the observed extremes in a cross-validation procedure. The unusual and novel goal was not to reproduce the precipitation matched in space and time, but to obtain frequency distributions of the point extremes, which we found to be stable.
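The core disaggregation step, scaling a daily gauge total by the radar's relative sub-daily pattern, can be sketched as follows (the function name and the uniform fallback for radar-dry days are our own assumptions):

```python
def disaggregate_daily(gauge_daily_total, radar_subdaily):
    """Split an interpolated daily gauge total (mm) into sub-daily
    amounts (e.g. 15 min intervals) using the radar's relative
    sub-daily distribution for that day."""
    radar_day_sum = sum(radar_subdaily)
    if radar_day_sum == 0:
        # No radar rain that day: spread uniformly (a fallback assumption)
        n = len(radar_subdaily)
        return [gauge_daily_total / n] * n
    return [gauge_daily_total * r / radar_day_sum for r in radar_subdaily]
```

By construction the sub-daily values sum back to the gauge total, so the radar contributes only the timing and relative intensity, not the (biased) absolute amounts; sub-hourly maxima are then read directly off the result.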
NASA Astrophysics Data System (ADS)
Bargaoui, Zoubeida Kebaili; Bardossy, Andràs
2015-10-01
The paper aims to develop research on the spatial variability of heavy rainfall estimation using spatial copula analysis. To demonstrate the methodology, short-time-resolution rainfall time series from the Stuttgart region are analyzed. They consist of rainfall observations on a continuous 30 min time scale recorded over a network of 17 raingages for the period July 1989-July 2004. The analysis is performed by aggregating the observations from 30 min up to 24 h. Two parametric bivariate extreme copula models, the Hüsler-Reiss model and the Gumbel model, are investigated. Both involve a single parameter to be estimated. Thus, model fitting is performed for every pair of stations for a given time resolution. A rainfall threshold value representing a fixed rainfall quantile is adopted for model inference. Generalized maximum pseudo-likelihood estimation with censoring is adopted, by analogy with methods of univariate estimation combining historical and paleoflood information with systematic data. Only pairs of observations greater than the threshold are treated as systematic data. Using the estimated copula parameter, a synthetic copula field is randomly generated and helps evaluate model adequacy, which is assessed using the Kolmogorov-Smirnov distance test. In order to assess dependence or independence in the upper tail, the extremal coefficient, which characterises the tail of the joint bivariate distribution, is adopted. Hence, the extremal coefficient is reported as a function of the distance between stations. If it is less than 1.7, stations are interpreted as dependent in the extremes. 
The analysis of the fitted extremal coefficients with respect to inter-station distance highlights two regimes with different dependence structures: a short-spatial-extent regime linked to short duration intervals (from 30 min to 6 h) with an extent of about 8 km, and a large-spatial-extent regime related to longer rainfall intervals (from 12 h to 24 h) with an extent of 34 to 38 km.
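For the Gumbel extreme-value copula mentioned above, the extremal coefficient follows in closed form from the fitted dependence parameter, which makes the 1.7 dependence cut-off easy to apply (a minimal sketch; the Hüsler-Reiss case needs the normal CDF and is omitted):

```python
def gumbel_extremal_coefficient(m):
    """Extremal coefficient theta = 2**(1/m) of the Gumbel copula with
    dependence parameter m >= 1: theta = 2 means asymptotic independence
    in the extremes, theta -> 1 means complete dependence."""
    if m < 1:
        raise ValueError("Gumbel parameter must be >= 1")
    return 2.0 ** (1.0 / m)

def extremes_dependent(theta, cutoff=1.7):
    """Classification rule used in the study: stations are interpreted
    as dependent in the extremes when theta < 1.7."""
    return theta < cutoff

print(gumbel_extremal_coefficient(1.0))  # 2.0: asymptotic independence
print(gumbel_extremal_coefficient(4.0))  # ~1.19: strong tail dependence
```

Plotting theta against inter-station distance for each aggregation interval reproduces the kind of two-regime picture the study reports.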
The analyses of extreme climate events over China based on CMIP5 historical and future simulations
NASA Astrophysics Data System (ADS)
Yang, S.; Dong, W.; Feng, J.; Chou, J.
2013-12-01
Extreme climate events have a serious influence on human society. Based on observations and 12 simulations from the Coupled Model Intercomparison Project Phase 5 (CMIP5), climatic extremes and their changes over China, in history and in future scenarios of three Representative Concentration Pathways (RCPs), are analyzed. Against the background of global warming, the observations show decreasing trends in frost days (FD) and low-temperature threshold days (TN10P), and increasing trends in summer days (SU), high-temperature threshold days (TX90P), heavy precipitation days (R20) and the contribution of heavy precipitation days (P95T). Most coupled models can basically simulate the main characteristics of most extreme indexes. The models reproduce the mean FD and TX90P values best and can capture the basic trends of FD, TN10P, SU and TX90P. High correlation coefficients between simulated results and observations are found for FD, SU and P95T. For the FD and SU indexes, most of the models have good ability to capture the spatial differences between the mean states of the 1986-2005 and 1961-1980 periods, but for the other indexes the models' ability to simulate spatial disparities is less satisfactory and needs to be improved. Under the high emission scenario of RCP8.5, the century-scale linear changes of the Multi-Model Ensemble (MME) for FD, SU, TN10P, TX90P, R20 and P95T are -46.9, 46.0, -27.1, 175.4 and 2.9 days, and 9.9%, respectively. Due to the complexities of physical process parameterizations and the limitations of the forcing data, a large uncertainty still exists in the simulations of climatic extremes. Fig. 1: Observed and modeled multi-year average for each index (dotted line: observation). Table 1: Extreme index definitions.
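Indices of this kind are simple day counts and percentile contributions over daily series. A minimal sketch for a subset of them, assuming the common ETCCDI-style definitions (the paper's own Table 1 definitions are not reproduced here and may differ):

```python
def climate_extreme_indices(tmin, tmax, precip):
    """Annual extreme indices from daily series for one year.
    tmin/tmax in deg C, precip in mm/day.  Definitions follow common
    ETCCDI conventions (an assumption, not the paper's Table 1)."""
    fd = sum(1 for t in tmin if t < 0.0)        # FD: frost days
    su = sum(1 for t in tmax if t > 25.0)       # SU: summer days
    r20 = sum(1 for p in precip if p >= 20.0)   # R20: heavy precip days
    wet = sorted(p for p in precip if p >= 1.0) # wet days only
    total = sum(wet)
    if wet and total > 0:
        p95 = wet[int(0.95 * len(wet))]         # crude 95th percentile
        # P95T: percent of wet-day total falling on very wet days
        p95t = 100.0 * sum(p for p in wet if p > p95) / total
    else:
        p95t = 0.0
    return {"FD": fd, "SU": su, "R20": r20, "P95T": p95t}
```

The percentile-based TN10P and TX90P indices additionally require a base-period climatology and are omitted from this sketch.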
Pituitary, gonadal and adrenal hormones after prolonged residence at extreme altitude in man.
Basu, M; Pal, K; Prasad, R; Malhotra, A S; Rao, K S; Sawhney, R C
1997-06-01
High altitude-induced alterations in pituitary, gonadal and adrenal hormones were studied in (i) eugonadal men from the armed forces who were resident at sea level (SL), (ii) SL residents staying at an altitude of 3542 m for periods ranging from 3 to 12 months (acclimatized lowlanders, ALL), (iii) ALL who stayed at 6300 m for 6 months, (iv) ALL who trekked from 3542 to 5080 m and stayed at an altitude of more than 6300 m in the glacier region for 6 months, and (v) high-altitude natives (HAN) resident at an altitude of 3300-3700 m. Circulating levels of LH, FSH, prolactin, cortisol, testosterone, dihydrotestosterone (DHT) and progesterone in ALL at 3542 m and in HAN were not significantly different (p > 0.05) from the SL control values. When the ALL living at 3542 m trekked to an extreme altitude of 5080 m, their testosterone levels showed a significant decrease (p < 0.01) compared to the preceding altitude values but had returned to SL values when measured after 6 months' continuous stay at 6300 m. As with testosterone, the levels of DHT and oestradiol-17 beta (E2) after prolonged stay at extreme altitude were also not significantly different (p > 0.05) from the SL values. The LH levels after trekking to 5080 m were significantly higher (p < 0.01) than at an altitude of 3542 m, but decreased to levels found at 3542 m or SL after prolonged residence at extreme altitude. Plasma levels of ACTH, prolactin, FSH and cortisol on arrival at 5080 m, and after a 6-month stay at extreme altitude, were not significantly different (p > 0.05) from the SL values. Plasma progesterone levels tended to increase on arrival at 5080 m but a significant increase (p < 0.001) was evident only after a 6-month stay at extreme altitude. These observations suggest that prolonged residence at lower as well as at extreme altitude does not appreciably alter blood levels of pituitary, gonadal or adrenal hormones except for plasma levels of progesterone. 
The exact mechanism and significance of this increase remains unknown, but may be important in increasing the sensitivity of the hypoxic ventilatory response and activation of haemoglobin synthesis.
A Test-Length Correction to the Estimation of Extreme Proficiency Levels
ERIC Educational Resources Information Center
Magis, David; Beland, Sebastien; Raiche, Gilles
2011-01-01
In this study, the estimation of extremely large or extremely small proficiency levels, given the item parameters of a logistic item response model, is investigated. On one hand, the estimation of proficiency levels by maximum likelihood (ML), despite being asymptotically unbiased, may yield infinite estimates. On the other hand, with an…
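The divergence of ML estimates mentioned above is easy to see numerically: under a logistic (Rasch-type) model, the log-likelihood of an all-correct response pattern increases monotonically in proficiency, so no finite maximum exists. A small sketch with hypothetical item difficulties:

```python
import numpy as np

def loglik(theta, responses, difficulties):
    """Log-likelihood of a binary response pattern under the Rasch model."""
    p = 1.0 / (1.0 + np.exp(-(theta - difficulties)))
    return np.sum(responses * np.log(p) + (1 - responses) * np.log(1 - p))

difficulties = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])  # hypothetical items
perfect = np.ones(5)                 # all items answered correctly
mixed = np.array([1, 1, 1, 0, 1.0])  # one item missed

thetas = np.linspace(-4, 10, 200)
ll_perfect = np.array([loglik(t, perfect, difficulties) for t in thetas])
ll_mixed = np.array([loglik(t, mixed, difficulties) for t in thetas])

# For the perfect pattern the likelihood keeps rising: the ML estimate is +inf.
print(np.all(np.diff(ll_perfect) > 0))
# A mixed pattern has an interior maximum, i.e. a finite ML estimate.
print(thetas[np.argmax(ll_mixed)])
```

Corrections such as the test-length adjustment studied in the paper aim to keep such estimates finite.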
Inclusion of extremes of prematurity in ventricular index centile charts.
Boyle, M; Shim, R; Gnanasekaran, R; Tarrant, A; Ryan, S; Foran, A; McCallion, N
2015-06-01
To assess the relationship between ventricular index (VI) measurements and postmenstrual age in preterm infants and to generate centile charts and normal ranges for frontal horn ratio (FHR) for a large contemporary cohort of preterm infants. A retrospective cohort study of 253 infants with birth gestation less than 32 weeks admitted between January 2009 and December 2011 to a tertiary NICU in Ireland. A total of 816 cranial ultrasounds were reviewed. Data collected were grouped according to postmenstrual age at the time of scan from 23 weeks to 45 weeks. Median values for VI show a general trend to increase with gestation. FHR did not significantly change with postmenstrual age at scan with a median value of 0.31. There is a slight increase in VI as gestation at the time of scans increases. These results provide the basis for updated centile charts which we propose for current practice.
The extrudate swell of HDPE: Rheological effects
NASA Astrophysics Data System (ADS)
Konaganti, Vinod Kumar; Ansari, Mahmoud; Mitsoulis, Evan; Hatzikiriakos, Savvas G.
2017-05-01
The extrudate swell of an industrial grade high molecular weight high-density polyethylene (HDPE) in capillary dies is studied experimentally and numerically using the integral K-BKZ constitutive model. The non-linear viscoelastic flow properties of the polymer resin are studied for a broad range of large step shear strains and high shear rates using the cone partitioned plate (CPP) geometry of the stress/strain controlled rotational rheometer. This allowed the determination of the rheological parameters accurately, in particular the damping function, which is proven to be the most important in simulating transient flows such as extrudate swell. A series of simulations performed using the integral K-BKZ Wagner model with different values of the Wagner exponent n, ranging from n=0.15 to 0.5, demonstrates that the extrudate swell predictions are extremely sensitive to the Wagner damping function exponent. Using the correct n-value resulted in extrudate swell predictions that are in excellent agreement with experimental measurements.
Shlomai, Amir; Kariv, Revital; Leshno, Moshe; Beth-or, Anat; Sheinberg, Bracha; Halpern, Zamir
2010-10-01
Serum alanine aminotransferase (ALT) is commonly used to detect liver damage. Recent studies indicate that ALT levels at the upper range of normal limits are predictors of adverse outcomes, especially diabetes mellitus (DM) and the metabolic syndrome. The aim of our study was to define the ALT threshold for both men and women that may predict the onset of DM. We analyzed a large Health Maintenance Organization cohort of 157,308 healthy subjects with no evidence of liver disease and with baseline ALT levels ≤ 120 U/L, and identified those who developed DM within 6 years. Overall, an elevated baseline serum ALT value was significantly associated with the development of DM, with an odds ratio of 3.3 when comparing the upper and lower quartiles of the whole study population. A subgroup analysis revealed that baseline ALT values higher than 10 U/L among women and 22 U/L among men were already significantly associated with an increased risk for DM for any increment in ALT level. Notably, ALT values higher than ∼55 U/L were associated with an increased risk for DM that was relatively constant for any increment in ALT. Higher baseline ALT levels were stronger predictors for DM than age, triglyceride and cholesterol levels. Our study implies that ALT values higher than 10 U/L and 22 U/L for women and men, respectively, may predict DM. We suggest redefining ALT values as either 'normal' or 'healthy', with the latter reflecting much lower values, above which an individual is at increased risk for DM. © 2010 Journal of Gastroenterology and Hepatology Foundation and Blackwell Publishing Asia Pty Ltd.
NASA Astrophysics Data System (ADS)
Parhi, P.; Giannini, A.; Lall, U.; Gentine, P.
2016-12-01
Assessing and managing risks posed by climate variability and change is challenging in the tropics, from both a socio-economic and a scientific perspective. Most of the vulnerable countries with limited climate adaptation capability are in the tropics. However, climate projections, particularly of extreme precipitation, are highly uncertain there. The CMIP5 (Coupled Model Intercomparison Project Phase 5) inter-model range of the sensitivity of extreme precipitation to global temperature under climate change is much larger in the tropics than in the extra-tropics, ranging from nearly 0% to greater than 30% across models (O'Gorman 2012). The uncertainty is also large in historical gauge- or satellite-based observational records. These large uncertainties in the sensitivity of tropical precipitation extremes highlight the need to better understand how tropical precipitation extremes respond to warming. We hypothesize that one factor behind the large uncertainty is differing sensitivities during different phases of warming. We consider the 'growth' and 'mature' phases of warming under climate variability, typically associated with an El Niño event. In the remote tropics (away from the tropical Pacific Ocean), the response of precipitation extremes during the two phases can proceed through different pathways: i) a direct, fast-changing radiative forcing in an atmospheric column, acting top-down through tropospheric warming, and/or ii) an indirect effect via changes in surface temperatures, acting bottom-up through surface water and energy fluxes. We also speculate that the insights gained here may be useful in interpreting the large sensitivity under climate change scenarios, since the physical mechanisms during the two warming phases of climate variability correspond loosely to increasing and stabilized greenhouse gas emission scenarios.
Potential of commercial microwave link network derived rainfall for river runoff simulations
NASA Astrophysics Data System (ADS)
Smiatek, Gerhard; Keis, Felix; Chwala, Christian; Fersch, Benjamin; Kunstmann, Harald
2017-03-01
Commercial microwave link networks allow for the quantification of path-integrated precipitation because attenuation by hydrometeors correlates with rainfall between transmitter and receiver stations. The networks, operated and maintained by cellphone companies, thereby provide completely new, countrywide precipitation measurements. As the density of traditional precipitation station networks worldwide is decreasing significantly, microwave-link-derived precipitation estimates are receiving increasing attention not only from hydrologists but also from meteorological and hydrological services. We investigate the potential of microwave-derived precipitation estimates for streamflow prediction and water balance analyses, illustrated here for an orographically complex region in the German Alps (River Ammer). We compare the added value of link-derived rainfall estimates combined with station observations against station-only and weather-radar-derived values. Our river runoff simulation system employs a distributed hydrological model at 100 × 100 m grid resolution. We analyze the potential of microwave-link-derived precipitation estimates for two 30-day episodes of typically moderate river flow and one episode of extreme flooding. The simulation results indicate the potential of this novel precipitation monitoring method: a significant improvement in hydrograph reproduction was achieved in the extreme flooding period, which was characterized by a large number of local strong precipitation events. The existing rain gauges alone were not able to capture these events correctly.
Tsunami vs Infragravity Surge: Statistics and Physical Character of Extreme Runup
NASA Astrophysics Data System (ADS)
Lynett, P. J.; Montoya, L. H.
2017-12-01
Motivated by recent observations of energetic and impulsive infragravity (IG) flooding events - also known as sneaker waves - we will present recent work on the relative probabilities and dynamics of extreme flooding events from tsunamis and long-period wind wave events. The discussion will be founded on videos and records of coastal flooding by both recent tsunamis and IG, such as those in the Philippines during Typhoon Haiyan. From these observations, it is evident that IG surges may approach the coast as breaking bores with periods of minutes - a very tsunami-like character. Numerical simulations will be used to estimate flow elevations and speeds from potential IG surges, and these will be compared with similar values from tsunamis over a range of different beach profiles. We will examine the relative rareness of each type of flooding event, which for large values of IG runup is a particularly challenging topic. For example, for a given runup elevation or flooding speed, the related tsunami return period may be longer than that associated with IG, implying that deposit information associated with such elevations or speeds is more likely to be caused by IG. Our purpose is to provide a statistical and physical discriminant between tsunami and IG, such that in areas exposed to both, a proper interpretation of overland transport, deposition, and damage is possible.
Rate of digesta passage in the philippine flying lemur, Cynocephalus volans.
Wischusen, E W; Ingle, N; Richmond, M E
1994-01-01
The rate of digesta passage was measured in five captive Philippine flying lemurs (Cynocephalus volans). These animals were force-fed capsules containing known quantities of either particulate or soluble markers. The volumes of the gastrointestinal tracts of three flying lemurs were determined based on the wet weight of the contents of each section of the gut. The mean rate of digesta passage was 14.37 +/- 3.31 h when determined using the particulate marker and 21.9 +/- 0.03 h when determined using the soluble marker. The values based on the particulate marker are between 2% and 10% of similar values for other arboreal folivores. The morphology of the gastrointestinal system of the Philippine flying lemur is similar to that of other hindgut fermenters. Flying lemurs have a simple stomach and a large caecum. The total gut capacity of the Philippine flying lemur is similar to that of other herbivores, but is slightly smaller than that of either the koala (Phascolarctos cinereus), a hindgut fermenter, or the three-toed sloth (Bradypus variegatus), a foregut fermenter. These data suggest that flying lemurs deal with the problems of a folivorous diet very differently from some other arboreal mammals. Phascolarctos cinereus and Bradypus variegatus may represent one extreme, with Cynocephalus volans representing the other, along a continuum of foraging strategies compatible with the arboreal folivore lifestyle.
Extreme values and fat tails of multifractal fluctuations
NASA Astrophysics Data System (ADS)
Muzy, J. F.; Bacry, E.; Kozhemyak, A.
2006-06-01
In this paper we discuss the problem of estimating the probability of extreme event occurrence for data drawn from a multifractal process. We also study the heavy (power-law) tail behavior of the probability density function associated with such data. We show that, because of strong correlations, the standard extreme value approach is not valid and classical tail exponent estimators should be interpreted cautiously. Extreme statistics associated with multifractal random processes turn out to be characterized by non-self-averaging properties. Our considerations rely upon an analogy between random multiplicative cascades and the physics of disordered systems, and on recent mathematical results about the so-called multifractal formalism. Applied to financial time series, our findings allow us to propose a unified framework that accounts for the observed multiscaling properties of return fluctuations, the volatility clustering phenomenon and the observed “inverse cubic law” of the return pdf tails.
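For context on the caution above: the classical Hill estimator recovers the tail exponent well for i.i.d. heavy-tailed data but, as the paper argues, can mislead when multifractal correlations are present. A sketch on synthetic i.i.d. Pareto data (where it does work), using the "inverse cubic" exponent of three:

```python
import numpy as np

def hill_estimator(x, k):
    """Classical Hill estimator of the tail exponent from the k largest order statistics."""
    xs = np.sort(x)[::-1]
    return 1.0 / np.mean(np.log(xs[:k] / xs[k]))

rng = np.random.default_rng(1)
alpha = 3.0                            # "inverse cubic law" tail exponent
x = rng.pareto(alpha, 100_000) + 1.0   # i.i.d. Pareto(alpha) sample, xm = 1

est = hill_estimator(x, k=2000)
print(est)  # close to 3 for i.i.d. data; biased under strong correlations
```

The paper's point is that for multifractal data the sample is effectively dominated by a few correlated extremes, so such estimates stop self-averaging.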
Probability distribution of extreme share returns in Malaysia
NASA Astrophysics Data System (ADS)
Zin, Wan Zawiah Wan; Safari, Muhammad Aslam Mohd; Jaaman, Saiful Hafizah; Yie, Wendy Ling Shin
2014-09-01
The objective of this study is to identify a suitable probability distribution for modeling extreme share returns in Malaysia. To this end, weekly and monthly maxima of daily share returns are derived from share price data obtained from Bursa Malaysia over the period 2000 to 2012. The study starts with summary statistics of the data, which provide a clue to the likely candidates for the best-fitting distribution. Next, the suitability of six extreme value distributions, namely the Gumbel, Generalized Extreme Value (GEV), Generalized Logistic (GLO), Generalized Pareto (GPA), Lognormal (GNO) and Pearson type III (PE3) distributions, is evaluated. The method of L-moments is used for parameter estimation. Based on several goodness-of-fit tests and the L-moment ratio diagram, the Generalized Pareto distribution and the Pearson distribution are found to be the best-fitting distributions for the weekly and monthly maximum share returns, respectively, in the Malaysian stock market during the period studied.
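A rough sketch of the block-maxima step described above, using scipy's maximum-likelihood fitting on synthetic returns (the study itself uses L-moments and real Bursa Malaysia data, so this illustrates the workflow, not the paper's results):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Synthetic heavy-tailed daily returns standing in for share returns.
daily = stats.t.rvs(df=4, size=252 * 13, random_state=rng)

# Weekly maxima: block maxima over non-overlapping 5-day blocks.
weekly_max = daily[: (len(daily) // 5) * 5].reshape(-1, 5).max(axis=1)

# Fit two candidate extreme-value distributions.
# (The paper uses L-moments; scipy's .fit is MLE, an assumption here.)
gev = stats.genextreme.fit(weekly_max)
gum = stats.gumbel_r.fit(weekly_max)

# Compare fits with the Kolmogorov-Smirnov statistic (smaller = closer).
ks_gev = stats.kstest(weekly_max, 'genextreme', args=gev).statistic
ks_gum = stats.kstest(weekly_max, 'gumbel_r', args=gum).statistic
print(ks_gev, ks_gum)
```

The same pattern extends to the other candidate distributions by swapping in the corresponding scipy families.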
Trukhmanov, I M; Suslova, G A; Ponomarenko, G N
This paper characterizes the informative value of the functional step test with heel cushions in children, used for the differential diagnosis of anatomical versus functional differences in lower-extremity length. A total of 85 schoolchildren with lower limbs of unequal length were examined, and the results of clinical and instrumental examinations were compared. The data obtained with the functional step test give evidence of its very high sensitivity, specificity, and clinical significance as a tool for examining children with lower limbs of unequal length. It is concluded that the test is one of the most informative predictors of the effectiveness of rehabilitation in these children.
Extreme value laws for fractal intensity functions in dynamical systems: Minkowski analysis
NASA Astrophysics Data System (ADS)
Mantica, Giorgio; Perotti, Luca
2016-09-01
Typically, in the dynamical theory of extremal events, the function that gauges the intensity of a phenomenon is assumed to be convex and maximal, or singular, at a single, or at most a finite collection of points in phase-space. In this paper we generalize this situation to fractal landscapes, i.e. intensity functions characterized by an uncountable set of singularities, located on a Cantor set. This reveals the dynamical rôle of classical quantities like the Minkowski dimension and content, whose definition we extend to account for singular continuous invariant measures. We also introduce the concept of extremely rare event, quantified by non-standard Minkowski constants and we study its consequences to extreme value statistics. Limit laws are derived from formal calculations and are verified by numerical experiments. Dedicated to the memory of Joseph Ford, on the twentieth anniversary of his departure.
A dependence modelling study of extreme rainfall in Madeira Island
NASA Astrophysics Data System (ADS)
Gouveia-Reis, Délia; Guerreiro Lopes, Luiz; Mendonça, Sandra
2016-08-01
The dependence between variables plays a central role in multivariate extremes. In this paper, spatial dependence of Madeira Island's rainfall data is addressed within an extreme value copula approach through an analysis of maximum annual data. The impact of altitude, slope orientation, distance between rain gauge stations and distance from the stations to the sea are investigated for two different periods of time. The results obtained highlight the influence of the island's complex topography on the spatial distribution of extreme rainfall in Madeira Island.
Probabilistic Models For Earthquakes With Large Return Periods In Himalaya Region
NASA Astrophysics Data System (ADS)
Chaudhary, Chhavi; Sharma, Mukat Lal
2017-12-01
Determination of the frequency of large earthquakes is of paramount importance for seismic risk assessment, as large events contribute a significant fraction of the total deformation, and these long-return-period events with low probability of occurrence are not easily captured by classical distributions. Generally, with a small catalogue, these larger events follow a different distribution function from the smaller and intermediate events. It is thus especially important to use statistical methods that analyse as closely as possible the range of extreme values, or the tail of the distribution, in addition to its main body. The generalised Pareto distribution family is widely used for modelling events which cross a specified threshold value. The Pareto, Truncated Pareto, and Tapered Pareto are special cases of the generalised Pareto family. In this work, the probability of earthquake occurrence has been estimated using the Pareto, Truncated Pareto, and Tapered Pareto distributions. As a case study we consider the Himalayas, whose ongoing orogeny generates large earthquakes and which is one of the most seismically active zones of the world. The whole Himalayan region has been divided into five seismic source zones according to seismotectonics and the clustering of events. Estimated probabilities of earthquake occurrence have also been compared with the modified Gutenberg-Richter distribution and the characteristic recurrence distribution. The statistical analysis reveals that the Tapered Pareto distribution describes seismicity in these seismic source zones better than the other distributions considered in the present study.
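The peaks-over-threshold idea behind the generalized Pareto family can be sketched as follows, with synthetic exponentially distributed magnitudes standing in for a catalogue (scipy's MLE fit on a plain generalized Pareto, not the paper's truncated/tapered variants):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Synthetic Gutenberg-Richter-like magnitudes: exponential tail with b ~ 1.
mags = rng.exponential(scale=1.0 / np.log(10), size=20_000) + 4.0

u = 5.5                       # threshold magnitude
exceed = mags[mags > u] - u   # exceedances over the threshold

# Fit a generalized Pareto distribution to the exceedances (location fixed at 0).
shape, loc, scale = stats.genpareto.fit(exceed, floc=0)

# Conditional probability of exceeding magnitude 7, given an event above u.
p_m7 = stats.genpareto.sf(7.0 - u, shape, loc=loc, scale=scale)
print(shape, scale, p_m7)
```

For a pure exponential tail the fitted shape parameter is near zero; truncated and tapered variants modify this tail to bound or damp the largest events.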
Overview of the biology of extreme events
NASA Astrophysics Data System (ADS)
Gutschick, V. P.; Bassirirad, H.
2008-12-01
Extreme events have, variously, meteorological origins, as in heat waves or precipitation extremes; biological origins, as in pest and disease eruptions; or tectonic, earth-orbital, or impact-body origins. Despite growing recognition that these events are changing in frequency and intensity, a universal model of ecological responses to them has been slow to emerge. Extreme events, negative and positive, contrast with normal events in their effects on the physiology, ecology, and evolution of organisms, and hence on water, carbon, and nutrient cycles. They structure biogeographic ranges and biomes, almost surely more than the mean values often used to define biogeography. They are challenging to study, for obvious reasons of field-readiness, but also because they are defined by sequences of driving variables such as temperature, not point events. As sequences, their statistics (return times, for example) are challenging to develop, as is accounting for the involvement of multiple environmental variables. These statistics are not captured well by climate models. They are expected to change with climate and land-use change, but our predictive capacity is currently limited. A number of tools for the description and analysis of extreme events are available, if not widely applied to date. Extremes for organisms are defined by their fitness effects on those organisms, and are specific to genotypes, making them major agents of natural selection. There is evidence that the effects of extreme events may be concentrated in an extended recovery phase. We review selected events covering ranges of time and magnitude, from Snowball Earth to leaf functional loss in weather events. A number of events, such as the 2003 European heat wave, show effects on water and carbon cycles over large regions. Rising CO2 is the recent extreme of note, for its climatic effects and consequences for growing seasons, transpiration, etc., but also directly through its action as a substrate of photosynthesis.
Effects on water and N cycles are already marked. Adaptive responses of plants are very irregularly distributed among species and genotypes, most adaptive responses having been lost over 20 My of minimal or virtually accidental genetic selection for correlated traits. Offsets of plant activity from those of pollinators and pests may amplify direct physiological effects on plants. Another extreme of interest is the insect-mediated mass dieoff of conifers across western North America tied to a rare combination of drought and year-long high temperatures.
Geochemistry of extremely alkaline (pH>12) ground water in slag-fill aquifers.
Roadcap, George S; Kelly, Walton R; Bethke, Craig M
2005-01-01
Extremely alkaline ground water has been found underneath many shuttered steel mills and slag dumps and has been an impediment to the cleanup and economic redevelopment of these sites because little is known about the geochemistry. A large number of these sites occur in the Lake Calumet region of Chicago, Illinois, where large-scale infilling of the wetlands with steel slag has created an aquifer with pH values as high as 12.8. To understand the geochemistry of the alkaline ground water system, we analyzed samples of ground water and the associated slag and weathering products from four sites. We also considered several potential remediation schemes to lower the pH and toxicity of the water. The principal cause of the alkaline conditions is the weathering of calcium silicates within the slag. The resulting ground water at most of the sites is dominated by Ca2+ and OH- in equilibrium with Ca(OH)2. Where the alkaline ground water discharges in springs, atmospheric CO2 dissolves into the water and thick layers of calcite form. Iron, manganese, and other metals in the metallic portion of the slag have corroded to form more stable low-temperature oxides and sulfides and have not accumulated in large concentrations in the ground water. Calcite precipitated at the springs is rich in a number of heavy metals, suggesting that metals can move through the system as particulate matter. Air sparging appears to be an effective remediation strategy for reducing the toxicity of discharging alkaline water.
The Detection of Diffuse Extended Structure in 3C 273: Implications for Jet Power
NASA Astrophysics Data System (ADS)
Punsly, Brian; Kharb, Preeti
2016-12-01
We present deep Very Large Array imaging of 3C 273 in order to determine the diffuse, large-scale radio structure of this famous radio-loud quasar. Diffuse extended structure (radio lobes) is detected for the first time in these observations as a consequence of the high dynamic range of the 327.5 and 1365 MHz images. This emission is used to estimate a time-averaged jet power, 7.2 × 10^43 erg s^-1 < Q̄ < 3.7 × 10^44 erg s^-1. Brightness temperature arguments indicate consistent values of the time variability Doppler factor and the compactness Doppler factor for the inner jet, δ ≳ 10. Thus, the large apparent broadband bolometric luminosity of the jet, ~3 × 10^46 erg s^-1, corresponds to a modest intrinsic luminosity ≳10^42 erg s^-1, or ~1% of Q̄. In summary, we find that 3C 273 is actually a “typical” radio-loud quasar, contrary to suggestions in the literature. The modest Q̄ is near the peak of the luminosity distribution for radio-loud quasars and is consistent with the current rate of dissipation emitted from millimeter wavelengths to gamma rays. The extreme core-jet morphology is an illusion arising from a near pole-on line of sight to a highly relativistic jet that produces a Doppler-enhanced glow, which previously swamped the lobe emission. 3C 273 apparently has the intrinsic kpc-scale morphology of a classical double radio source, distorted by extreme Doppler aberration.
Knapp, Alan K; Hoover, David L; Wilcox, Kevin R; Avolio, Meghan L; Koerner, Sally E; La Pierre, Kimberly J; Loik, Michael E; Luo, Yiqi; Sala, Osvaldo E; Smith, Melinda D
2015-02-03
Climate change is intensifying the hydrologic cycle and is expected to increase the frequency of extreme wet and dry years. Beyond precipitation amount, extreme wet and dry years may differ in other ways, such as the number of precipitation events, event size, and the time between events. We assessed 1614 long-term (100 year) precipitation records from around the world to identify key attributes of precipitation regimes, besides amount, that distinguish statistically extreme wet from extreme dry years. In general, in regions where mean annual precipitation (MAP) exceeded 1000 mm, precipitation amounts in extreme wet and dry years differed from average years by ~40% and 30%, respectively. The magnitude of these deviations increased to >60% for dry years and to >150% for wet years in arid regions (MAP < 500 mm). Extreme wet years were primarily distinguished from average and extreme dry years by the presence of multiple extreme (large) daily precipitation events (events >99th percentile of all events); these occurred twice as often in extreme wet years as in average years. In contrast, these large precipitation events were rare in extreme dry years. Less important for distinguishing extreme wet from dry years were mean event size and frequency, or the number of dry days between events. However, extreme dry years were distinguished from average years by an increase in the number of dry days between events. These precipitation regime attributes consistently differed between extreme wet and dry years across 12 major terrestrial ecoregions from around the world, from deserts to the tropics. Thus, we recommend that climate change experiments and model simulations incorporate these differences in key precipitation regime attributes, as well as amount, into treatments. This will allow experiments to more realistically simulate extreme precipitation years and more accurately assess the ecological consequences. © 2015 John Wiley & Sons Ltd.
A New Approach to Extreme Value Estimation Applicable to a Wide Variety of Random Variables
NASA Technical Reports Server (NTRS)
Holland, Frederic A., Jr.
1997-01-01
Designing reliable structures requires an estimate of the maximum and minimum values (i.e., strength and load) that may be encountered in service. Yet designs based on very extreme values (to ensure safety) can result in extra material usage and, hence, uneconomic systems. In aerospace applications, severe over-design cannot be tolerated, making it almost mandatory to design closer to the assumed limits of the design random variables. The issue then is predicting extreme values that are practical, i.e. neither overly conservative nor unconservative. Obtaining design values by employing safety factors is well known to often result in overly conservative designs. Safety factor values have historically been selected rather arbitrarily, often lacking a sound rational basis. The question of how safe a design needs to be has led design theorists to probabilistic and statistical methods. The so-called three-sigma approach is one such method and has been described as the first step in utilizing information about the data dispersion. However, this method assumes that the random variable is dispersed symmetrically about the mean and is essentially limited to normally distributed random variables. Use of this method can therefore result in unsafe or overly conservative design allowables if the common assumption of normality is incorrect.
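The weakness of the three-sigma method for non-normal variables is easy to demonstrate: for a skewed variable, mean + 3·sigma does not correspond to the normal-theory exceedance probability of about 0.135%. A sketch with a lognormal stand-in for a load variable:

```python
import numpy as np

rng = np.random.default_rng(4)
# Skewed random "load" (lognormal), a stand-in for a non-normal design variable.
load = rng.lognormal(mean=0.0, sigma=0.8, size=1_000_000)

three_sigma = load.mean() + 3 * load.std()   # the three-sigma design value
p_exceed = np.mean(load > three_sigma)       # actual exceedance probability

print(p_exceed)  # far above the normal-theory value of ~0.00135
```

Here the true exceedance probability is roughly an order of magnitude larger than the normal-theory 0.00135, i.e. the three-sigma allowable is unconservative for this right-skewed variable.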
Gusts and shear within hurricane eyewalls can exceed offshore wind turbine design standards
NASA Astrophysics Data System (ADS)
Worsnop, Rochelle P.; Lundquist, Julie K.; Bryan, George H.; Damiani, Rick; Musial, Walt
2017-06-01
Offshore wind energy development is underway in the U.S., with proposed sites located in hurricane-prone regions. Turbine design criteria outlined by the International Electrotechnical Commission do not encompass the extreme wind speeds and directional shifts of hurricanes stronger than category 2. We examine a hurricane's turbulent eyewall using large-eddy simulations with Cloud Model 1. Gusts and mean wind speeds near the eyewall of a category 5 hurricane exceed the current Class I turbine design threshold of 50 m s^-1 mean wind and 70 m s^-1 gusts. The largest gust factors occur at the eye-eyewall interface. Further, shifts in wind direction suggest that turbines must rotate or yaw faster than current practice. Although current design standards omit mention of wind direction change across the rotor layer, large values (15-50°) suggest that veer should be considered.
NASA Astrophysics Data System (ADS)
Sun, Liang; McKay, Matthew R.
2014-08-01
This paper studies the sum-rate performance of a low-complexity quantized CSI-based Tomlinson-Harashima (TH) precoding scheme for downlink multiuser MIMO transmission, employing greedy user selection. The asymptotic distribution of the output signal-to-interference-plus-noise ratio of each selected user and the asymptotic sum rate as the number of users K grows large are derived using extreme value theory. For fixed finite signal-to-noise ratios and a finite number of transmit antennas n_T, we prove that as K grows large, the proposed approach achieves the optimal sum-rate scaling of the MIMO broadcast channel. We also prove that, if we ignore the precoding loss, the average sum rate of this approach converges to the average sum capacity of the MIMO broadcast channel. Our results provide insights into the effect of multiuser interference caused by quantized CSI on the multiuser diversity gain.
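The extreme-value mechanism behind the sum-rate scaling can be illustrated with a toy simulation (not the paper's SINR model): the maximum of K i.i.d. exponential-tailed variables concentrates around log K, the Gumbel-type growth that underlies multiuser diversity.

```python
import numpy as np

rng = np.random.default_rng(0)
gamma = 0.5772  # Euler-Mascheroni constant

# The mean of the max of K i.i.d. Exp(1) variables is the harmonic number
# H_K ~ log(K) + gamma: the "best user" improves only logarithmically in K.
mean_max = {}
for K in (64, 1024):
    maxima = rng.exponential(size=(4000, K)).max(axis=1)
    mean_max[K] = maxima.mean()
    print(K, round(mean_max[K], 2), round(np.log(K) + gamma, 2))
```

Selecting the user with the largest of K such variables, and taking a log of the selected SINR to get a rate, is what produces the log log K growth of the sum rate.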
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rose, P.; Powers, W.; Hritzay, D.
1959-06-01
The development of an arc wind tunnel capable of stagnation pressures in excess of twenty atmospheres and using as much as fifteen megawatts of electrical power is described. The calibration of this facility shows that it is capable of reproducing the aerodynamic environment encountered by vehicles flying at velocities as great as satellite velocity. Its use as a missile re-entry material test facility is described. The large power capacity of this facility allows one to make material tests on specimens of a size sufficient to be useful for material development, yet at realistic energy and Reynolds number values. By the addition of a high-capacity vacuum system, this facility can be used to produce the low-density, high-Mach-number environment needed for simulating satellite re-entry, as well as hypersonic flight at extreme altitudes.
NASA Astrophysics Data System (ADS)
Rieder, H. E.; Staehelin, J.; Maeder, J. A.; Ribatet, M.; Stübi, R.; Weihs, P.; Holawe, F.; Peter, T.; Davison, A. C.
2009-04-01
Over the last few decades negative trends in stratospheric ozone have been studied because of the direct link between decreasing stratospheric ozone and increasing surface UV-radiation. Recently a discussion on ozone recovery has begun. Long-term measurements of total ozone extending back earlier than 1958 are limited and only available from a few stations in the northern hemisphere. The world's longest total ozone record is available from Arosa, Switzerland (Staehelin et al., 1998a,b). At this site total ozone measurements have been made from late 1926 through the present day. Within this study (Rieder et al., 2009) new tools from extreme value theory (e.g. Coles, 2001; Ribatet, 2007) are applied to select mathematically well-defined thresholds for extreme low and extreme high total ozone. A heavy-tail-focused approach is used by fitting the Generalized Pareto Distribution (GPD) to the Arosa time series. Asymptotic arguments (Pickands, 1975) justify the use of the GPD for modeling exceedances over a sufficiently high (or below a sufficiently low) threshold (Coles, 2001). More precisely, the GPD is the limiting distribution of normalized excesses over a threshold, as the threshold approaches the endpoint of the distribution. In practice, GPD parameters are fitted to exceedances by maximum likelihood or other methods, such as probability-weighted moments. A preliminary step consists in defining an appropriate threshold for which the asymptotic GPD approximation holds. Suitable tools for threshold selection, such as the MRL-plot (mean residual life plot) and TC-plot (stability plot) from the POT package (Ribatet, 2007), are presented. The frequency distribution of extremes in low (termed ELOs) and high (termed EHOs) total ozone and their influence on the long-term changes in total ozone are analyzed. Further, it is shown that from the GPD model the distribution of so-called ozone mini holes (e.g. 
Bojkov and Balis, 2001) can be precisely estimated and that the "extremes concept" provides new information on the data distribution and variability within the Arosa record, as well as on the influence of ELOs and EHOs on the long-term trends of the ozone time series. References: Bojkov, R. D., and Balis, D. S.: Characteristics of episodes with extremely low ozone values in the northern middle latitudes 1975-2000, Ann. Geophys., 19, 797-807, 2001. Coles, S.: An Introduction to Statistical Modeling of Extreme Values, Springer Series in Statistics, ISBN:1852334592, Springer, Berlin, 2001. Pickands, J.: Statistical inference using extreme order statistics, Ann. Stat., 3, 1, 119-131, 1975. Ribatet, M.: POT: Modelling peaks over a threshold, R News, 7, 34-36, 2007. Rieder, H. E., Staehelin, J., Maeder, J. A., Stübi, R., Weihs, P., Holawe, F., and Ribatet, M.: From ozone mini holes and mini highs towards extreme value theory: New insights from extreme events and non-stationarity, submitted to J. Geophys. Res., 2009. Staehelin, J., Kegel, R., and Harris, N. R.: Trend analysis of the homogenized total ozone series of Arosa (Switzerland), 1929-1996, J. Geophys. Res., 103(D7), 8389-8400, doi:10.1029/97JD03650, 1998a. Staehelin, J., Renaud, A., Bader, J., McPeters, R., Viatte, P., Hoegger, B., Bugnion, V., Giroud, M., and Schill, H.: Total ozone series at Arosa (Switzerland): Homogenization and data comparison, J. Geophys. Res., 103(D5), 5827-5842, doi:10.1029/97JD02402, 1998b.
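A minimal peaks-over-threshold sketch in the spirit of the study, on synthetic data rather than the Arosa record, and using SciPy's genpareto in place of the R POT package cited above:

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(1)
data = rng.standard_normal(200_000)   # stand-in series, not total ozone

u = np.quantile(data, 0.99)           # a high threshold (99th percentile)
excesses = data[data > u] - u         # exceedances over the threshold

# Fit the GPD to the excesses by maximum likelihood; floc=0 pins the
# GPD support to start at the threshold, as the POT setup requires.
shape, loc, scale = genpareto.fit(excesses, floc=0)
print(round(shape, 2), round(scale, 2))
```

In a real analysis the threshold would be chosen from diagnostics such as the mean residual life plot, exactly as the abstract describes; here the 99th percentile is simply assumed.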
NASA Astrophysics Data System (ADS)
García-Cueto, O. Rafael; Cavazos, M. Tereza; de Grau, Pamela; Santillán-Soto, Néstor
2014-04-01
The generalized extreme value distribution is applied in this article to model the statistical behavior of the maximum and minimum temperature distribution tails in four cities of Baja California in northwestern Mexico, using data from 1950-2010. The approach used annual block maxima. Temporal trends were included as covariates in the location parameter (μ), which resulted in significant improvements to the proposed models, particularly for the extreme maximum temperature values in the cities of Mexicali, Tijuana, and Tecate, and the extreme minimum temperature values in Mexicali and Ensenada. These models were used to estimate future probabilities over the next 100 years (2015-2110) for different time periods, and they were compared with changes in the extreme (P90th and P10th) percentiles of maximum and minimum temperature scenarios for a set of six general circulation models under low (RCP4.5) and high (RCP8.5) radiative forcings. By the end of the twenty-first century, the scenarios of the changes in extreme maximum summer temperature are of the same order in both the statistical model and the high radiative scenario (increases of 4-5 °C); the low radiative scenario is more conservative (increases of 2-3 °C). The winter scenario shows that minimum temperatures could be less severe; the temperature increases suggested by the probabilistic model are greater than those projected for the end of the century by the set of global models under the RCP4.5 and RCP8.5 scenarios. The likely impacts on the region are discussed.
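A stationary version of the block-maxima approach can be sketched as follows (synthetic annual maxima, no trend covariate; note that SciPy parameterizes the GEV shape as c = -ξ):

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(2)
# 61 synthetic "years" (1950-2010) of daily values; keep one block maximum
# per year, as in the annual block-maxima approach.
annual_max = rng.standard_normal((61, 365)).max(axis=1)

c, mu, sigma = genextreme.fit(annual_max)          # c = -xi in SciPy
t100 = genextreme.isf(1.0 / 100.0, c, mu, sigma)   # 100-year return level
print(round(mu, 2), round(sigma, 2), round(t100, 2))
```

The trend covariate in the paper amounts to replacing the constant μ with μ(t) = μ0 + μ1·t, which plain scipy.stats does not fit directly.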
The effect of local parameters on gas turbine emissions
NASA Technical Reports Server (NTRS)
Kauffman, C. W.; Correa, S. M.; Orozco, N. J.
1980-01-01
Gas turbine engine inlet parameters reflect changes in local atmospheric conditions, and the engine's pollutant emissions reflect these changes. In attempting to model the effect of changing ambient conditions on emissions, it was found that emissions exhibit an extreme sensitivity to some details of the combustion process, such as the local fuel-air ratio and the size of the drops in the fuel spray. Fuel-air ratios have been mapped under nonburning conditions using a single JT8D-17 combustion can at simulated idle conditions, and significant variations in the local values have been found. Modelling of the combustor employs a combination of perfectly stirred and plug flow reactors, including a finite-rate vaporization treatment of the fuel spray. Results show that a small increase in the mean drop size can lead to a large increase in hydrocarbon emissions, and that decreasing the value of the CO-OH rate constant can lead to large increases in carbon monoxide emissions. These emissions may also be affected by the spray characteristics, with larger drops retarding the combustion process. Hydrocarbon, carbon monoxide, and oxides of nitrogen emissions calculated using the model accurately reflect measured emission variations caused by changing engine inlet conditions.
Tomographic local 2D analyses of the WISExSuperCOSMOS all-sky galaxy catalogue
NASA Astrophysics Data System (ADS)
Novaes, C. P.; Bernui, A.; Xavier, H. S.; Marques, G. A.
2018-05-01
The recent progress in obtaining larger and deeper galaxy catalogues is of fundamental importance for cosmological studies, especially to robustly measure the large-scale density fluctuations in the Universe. The present work uses the Minkowski Functionals (MF) to probe the galaxy density field from the WISExSuperCOSMOS (WSC) all-sky catalogue by performing tomographic local analyses in five redshift shells (of thickness δz = 0.05) in the total range of 0.10 < z < 0.35. Here, for the first time, the MF are applied to 2D projections of the galaxy number count (GNC) fields with the purpose of looking for regions in the WSC catalogue with unexpected features compared to ΛCDM mock realisations. Our methodology reveals 1-3 regions of the GNC maps in each redshift shell with an uncommon behaviour (extreme regions), i.e., p-value < 1.4%. Indeed, the resulting MF curves show signatures that suggest the uncommon behaviour to be associated with the presence of over- or under-densities there, but contamination due to residual foregrounds is not discarded. Additionally, even though our analyses indicate a good agreement between data and simulations, we identify one highly extreme region, seemingly associated with a large clustered distribution of galaxies. Our results confirm the usefulness of the MF for analysing GNC maps from photometric galaxy datasets.
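The first Minkowski functional of a 2D map is just the excursion-set area fraction, which for a Gaussian field has a closed form; a white-noise toy map reproduces it (illustrative only; the study applies the full set of MFs to GNC maps):

```python
import math
import numpy as np

rng = np.random.default_rng(3)
field = rng.standard_normal((512, 512))   # Gaussian stand-in for a GNC map

# V0(nu): fraction of the map above threshold nu. For a zero-mean,
# unit-variance Gaussian field, V0(nu) = erfc(nu / sqrt(2)) / 2.
for nu in (0.0, 1.0, 2.0):
    v0 = (field > nu).mean()
    v0_gauss = 0.5 * math.erfc(nu / math.sqrt(2.0))
    print(nu, round(v0, 4), round(v0_gauss, 4))
```

Departures of the measured MF curves from the mock-based expectation are what flag the "extreme regions" in the abstract.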
Modeling jointly low, moderate, and heavy rainfall intensities without a threshold selection
NASA Astrophysics Data System (ADS)
Naveau, Philippe; Huser, Raphael; Ribereau, Pierre; Hannart, Alexis
2016-04-01
In statistics, extreme events are often defined as excesses above a given large threshold. This definition allows hydrologists and flood planners to apply Extreme-Value Theory (EVT) to their time series of interest. Even in the stationary univariate context, this approach has at least two main drawbacks. First, working with excesses implies that a lot of observations (those below the chosen threshold) are completely disregarded. The range of precipitation is artificially chopped into two pieces, namely large intensities and the rest, which necessarily imposes different statistical models for each piece. Second, this strategy raises a nontrivial and very practical difficulty: how to choose the optimal threshold which correctly discriminates between low and heavy rainfall intensities. To address these issues, we propose a statistical model in which EVT results apply not only to heavy, but also to low precipitation amounts (zeros excluded). Our model is in compliance with EVT on both ends of the spectrum and allows a smooth transition between the two tails, while keeping a low number of parameters. In terms of inference, we have implemented and tested two classical methods of estimation: likelihood maximization and probability-weighted moments. Last but not least, there is no need to choose a threshold to define low and high excesses. The performance and flexibility of this approach are illustrated on simulated data and on hourly precipitation recorded in Lyon, France.
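One simple member of the extended-GPD family described above composes the GPD with a power transform, giving a GPD upper tail and a power-law lower tail with no threshold; a sketch under that assumption:

```python
import numpy as np

def gpd_cdf(z, xi):
    """Standard GPD CDF H_xi(z) for z >= 0."""
    if abs(xi) < 1e-9:                       # exponential (Gumbel-domain) limit
        return 1.0 - np.exp(-z)
    return 1.0 - np.maximum(1.0 + xi * z, 0.0) ** (-1.0 / xi)

def egpd_cdf(x, kappa, sigma, xi):
    """Simplest extended-GPD family: F(x) = H_xi(x/sigma)**kappa.
    The upper tail is GPD and the lower tail behaves like x**kappa,
    so low and heavy intensities are modeled without any threshold."""
    return gpd_cdf(np.asarray(x, dtype=float) / sigma, xi) ** kappa

x = np.array([0.1, 1.0, 10.0, 100.0])
vals = egpd_cdf(x, kappa=0.8, sigma=2.0, xi=0.2)
print(np.round(vals, 4))
```

The three parameters (κ, σ, ξ) can then be estimated by maximum likelihood or probability-weighted moments, as the abstract describes.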
The Seasonal Predictability of Extreme Wind Events in the Southwest United States
NASA Astrophysics Data System (ADS)
Seastrand, Simona Renee
Extreme wind events are a common phenomenon in the Southwest United States. Entities such as the United States Air Force (USAF) find the Southwest appealing for many reasons, primarily for an expansive, unpopulated, and electronically unpolluted space for large-scale training and testing. However, wind events can create hazards for the USAF: surface wind gusts can impact the take-off and landing of all aircraft, can tip the airframes of large wing-surface aircraft during maneuvers close to the ground, and can even impact weapons systems. This dissertation comprises three sections intended to further our knowledge and understanding of wind events in the Southwest. The first section builds a climatology of wind events for seven locations in the Southwest during the twelve 3-month seasons of the year, and further examines the wind events in relation to terrain and the large-scale flow of the atmosphere. The second section builds upon the first by taking the wind events and generating mid-level composites for each of the twelve 3-month seasons. In the third section, teleconnections identified as consistent with the large-scale circulation in the second section were used as predictor variables to build a Poisson regression model for each of the twelve 3-month seasons. The purpose of this research is to increase our understanding of the climatology of extreme wind events, increase our understanding of how the large-scale circulation influences extreme wind events, and create a model to enhance predictability of extreme wind events in the Southwest. Knowledge from this work will help protect personnel and property associated with not only the USAF, but all those in the Southwest.
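The third section's Poisson regression can be sketched on synthetic data (invented counts and standardized teleconnection-like predictors, not the dissertation's), fitting by Newton-Raphson:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic seasonal records: two standardized teleconnection-like indices
# (hypothetical stand-ins for the dissertation's predictors) and counts.
n = 300
X = np.column_stack([np.ones(n), rng.standard_normal(n), rng.standard_normal(n)])
beta_true = np.array([1.0, 0.5, -0.3])
y = rng.poisson(np.exp(X @ beta_true))   # event counts with log-linear mean

# Newton-Raphson for the Poisson log-likelihood, log mu = X @ beta.
beta = np.zeros(3)
for _ in range(25):
    mu = np.exp(X @ beta)
    grad = X.T @ (y - mu)                # score
    hess = X.T @ (X * mu[:, None])       # Fisher information
    beta = beta + np.linalg.solve(hess, grad)

print(np.round(beta, 2))
```

With a fitted model, exp(X_new @ beta) gives the expected number of extreme wind events for a season with given index values.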
NASA Astrophysics Data System (ADS)
Yu, Lejiang; Zhong, Shiyuan; Pei, Lisi; Bian, Xindi; Heilman, Warren E.
2016-04-01
The mean global climate has warmed as a result of the increasing emission of greenhouse gases induced by human activities. This warming is considered the main reason for the increasing number of extreme precipitation events in the US. While much attention has been given to extreme precipitation events occurring over several days, which are usually responsible for severe flooding over a large region, little is known about how extreme precipitation events that cause flash flooding and occur at sub-daily time scales have changed over time. Here we use the observed hourly precipitation from the North American Land Data Assimilation System Phase 2 forcing datasets to determine trends in the frequency of extreme precipitation events of short (1 h, 3 h, 6 h, 12 h and 24 h) duration for the period 1979-2013. The results indicate an increasing trend in the central and eastern US. Over most of the western US, especially the Southwest and the Intermountain West, the trends are generally negative. These trends can be largely explained by the interdecadal variability of the Pacific Decadal Oscillation and Atlantic Multidecadal Oscillation (AMO), with the AMO making a greater contribution to the trends in both warm and cold seasons.
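The trend computation described can be sketched on synthetic hourly data (invented distributions with a deliberately imposed upward drift in wet-hour frequency, not the NLDAS-2 forcing data):

```python
import numpy as np

rng = np.random.default_rng(5)
years = np.arange(1979, 2014)             # 1979-2013, as in the study
hours_per_year = 8760

# Count hours per year above a fixed intensity threshold.
counts = []
for i in range(years.size):
    p_wet = 0.05 * (1 + 0.03 * i)         # imposed drift in wet-hour frequency
    wet = rng.random(hours_per_year) < p_wet
    intensity = rng.exponential(5.0, hours_per_year) * wet
    counts.append(int((intensity > 20.0).sum()))

slope, intercept = np.polyfit(years, counts, 1)
print(round(slope, 3))                    # trend in exceedance hours per year
```

The study's longer durations (3 h, 6 h, 12 h, 24 h) would simply aggregate the hourly series with a running sum before thresholding.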
Increasing climate whiplash in 21st century California
NASA Astrophysics Data System (ADS)
Swain, D. L.; Langenbrunner, B.; Neelin, J. D.; Hall, A. D.
2017-12-01
Temperate "Mediterranean" climate regimes across the globe are particularly susceptible to wide swings between drought and flood—of which California's rapid transition from record multi-year dryness between 2012-2016 to extreme wetness during 2016-2017 provides a dramatic example. The wide-ranging human and environmental impacts of this recent "climate whiplash" event in a highly-populated, economically critical, and biodiverse region highlight the importance of understanding weather and climate extremes at both ends of the hydroclimatic spectrum. Previous studies have examined the potential contribution of anthropogenic warming to recent California extremes, but findings to date have been mixed and primarily drought-focused. Here, we use specific historical California flood and drought events as thresholds for quantifying long-term changes in precipitation extremes using a large ensemble of multi-decadal climate model simulations (CESM-LENS). We find that greenhouse gas emissions are already responsible for a detectable increase in both wet and dry extremes across portions of California, and that increasing 21st century "climate whiplash" will likely yield large increases in the frequency of both rapid "dry-to-wet" transitions and severe flood events over a wide range of timescales. This projected intensification of California's hydrological cycle would seriously challenge the region's existing water storage, conveyance, and flood control infrastructure—even absent large changes in mean precipitation.
NASA Astrophysics Data System (ADS)
Ludwig, R.
2017-12-01
There is as yet no confirmed knowledge of whether and how climate change contributes to the magnitude and frequency of hydrological extreme events, or of how regional water management could adapt to the corresponding risks. The ClimEx project (2015-2019) investigates the effects of climate change on meteorological and hydrological extreme events and their implications for water management in Bavaria and Québec. High Performance Computing is employed to enable the complex simulations in a hydro-climatological model processing chain, resulting in a unique high-resolution and transient (1950-2100) dataset of climatological and meteorological forcing and hydrological response: (1) The climate module has developed a large ensemble of high-resolution (12 km) data of the CRCM5 RCM for Central Europe and North-Eastern North America, downscaled from 50 members of the CanESM2 GCM. The dataset is complemented by all available data from the Euro-CORDEX project to allow assessment of both natural climate variability and climate change. The large ensemble, with several thousand model years, provides the potential to catch rare extreme events and thus improves the process understanding of extreme events with return periods of 1000+ years. (2) The hydrology module comprises process-based and spatially explicit model setups (e.g. WaSiM) for all major catchments in Bavaria and Southern Québec at high temporal (3 h) and spatial (500 m) resolution. The simulations form the basis for in-depth analysis of hydrological extreme events based on the inputs from the large climate model dataset. The specific data situation makes it possible to establish a new method of `virtual perfect prediction', which assesses climate change impacts on flood risk and water resources management by identifying patterns in the data that reveal preferential triggers of hydrological extreme events. 
The presentation will highlight first results from the analysis of the large scale ClimEx model ensemble, showing the current and future ratio of natural variability and climate change impacts on meteorological extreme events. Selected data from the ensemble is used to drive a hydrological model experiment to illustrate the capacity to better determine the recurrence periods of hydrological extreme events under conditions of climate change.
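With thousands of ensemble model years, return levels can be read off empirically from plotting positions instead of extrapolating a fitted distribution; a sketch with synthetic Gumbel annual maxima:

```python
import numpy as np

rng = np.random.default_rng(6)
# Stand-in for a large single-model ensemble: 7500 synthetic "annual maxima"
# (the ClimEx ensemble provides on the order of 7500 model years).
annual_max = rng.gumbel(loc=30.0, scale=5.0, size=7500)

# Weibull plotting position: the k-th largest of n values has empirical
# return period T = (n + 1) / k, so no distribution needs to be fitted.
x = np.sort(annual_max)[::-1]
n = x.size
levels = {}
for T in (100, 1000):
    k = int(round((n + 1) / T))
    levels[T] = x[k - 1]
    print(T, round(levels[T], 1))
```

With only 50-100 years of observations the same plotting position cannot reach T = 1000, which is precisely the motivation for the large ensemble.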
Natural Hazards characterisation in industrial practice
NASA Astrophysics Data System (ADS)
Bernardara, Pietro
2017-04-01
The definition of rare hydroclimatic extremes (up to a 10^-4 annual probability of occurrence) is of the utmost importance for the design of high-value industrial infrastructure, such as grids, power plants and offshore platforms. Underestimation as well as overestimation of the risk may lead to huge costs (e.g. expensive mid-life works or overdesign), which may even prevent the project from happening. Nevertheless, the uncertainties associated with extrapolation towards these rare frequencies are huge and manifold. They are mainly due to the scarcity of observations, the lack of quality of extreme value records, and the arbitrary choice of the models used for extrapolation. This often puts design engineers in uncomfortable situations when they must choose the design values to use. Fortunately, recent progress in earth observation techniques, information technology, historical data collection, and weather and ocean modelling is making huge datasets available. A careful use of big datasets of observations and modelled data is leading towards a better understanding of the physics of the underlying phenomena, the complex interactions between them, and thus of extreme event frequency extrapolations. This will move engineering practice from single-site, small-sample applications of statistical analysis to a more spatially coherent, physically driven extrapolation of extreme values. A few examples from EDF industrial practice are given to illustrate this progress and its potential impact on design approaches.
NASA Astrophysics Data System (ADS)
Masud, M. B.; Khaliq, M. N.; Wheater, H. S.
2017-09-01
The effects of climate change on April-October short- and long-duration precipitation extremes over the Canadian Prairie Provinces were evaluated using a multi-Regional Climate Model (RCM) ensemble available through the North American Regional Climate Change Assessment Program. Simulations considered include those performed with six RCMs driven by the National Centre for Environmental Prediction (NCEP) reanalysis II product for the 1981-2000 period and those driven by four Atmosphere-Ocean General Circulation Models (AOGCMs) for the current 1971-2000 and future 2041-2070 periods (i.e. a total of 11 current-to-future period simulation pairs). A regional frequency analysis approach was used to develop 2-, 5-, 10-, 25-, and 50-year return values of precipitation extremes from NCEP- and AOGCM-driven current and future period simulations, which respectively were used to study the performance of RCMs and projected changes for selected return values at regional, grid-cell and local scales. Performance errors due to internal dynamics and physics of the RCMs, studied for the 1981-2000 period, reveal considerable variation in the performance of the RCMs. However, the performance errors were found to be much smaller for RCM ensemble averages than for individual RCMs. Projected future changes in regional return values were found to be mostly larger for short-duration (e.g. 15- and 30-min) precipitation extremes and longer return periods (e.g. 50-year) than for longer-duration (e.g. 24- and 48-h) extremes and short return periods (e.g. 2-year). Overall, projected changes in precipitation extremes were larger for southeastern regions, followed by southern and northern regions, and smaller for southwestern and western regions of the study area. The changes to return values were also found to be statistically significant for the majority of the RCM-AOGCM simulation pairs. 
These projections might be useful as a key input for the future planning of urban drainage infrastructure and development of strategic climate change adaptation measures.
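Return values like those above come from inverting the fitted distribution; for the GEV, the T-year return level has a closed form (the parameters below are invented for illustration, not fitted to the RCM simulations):

```python
import math

def gev_return_level(T, mu, sigma, xi):
    """T-year return level: the GEV quantile with exceedance prob 1/T."""
    y = -math.log(1.0 - 1.0 / T)        # -log of the non-exceedance prob
    if abs(xi) < 1e-9:                  # Gumbel limit as xi -> 0
        return mu - sigma * math.log(y)
    return mu + (sigma / xi) * (y ** (-xi) - 1.0)

# Illustrative parameters for a precipitation-like variable (mm):
for T in (2, 10, 50):
    print(T, round(gev_return_level(T, mu=25.0, sigma=8.0, xi=0.1), 1))
```

A positive shape ξ (heavy tail) makes the long-period return levels grow quickly with T, which is why 50-year values respond more strongly to parameter changes than 2-year values.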
European Extremely Large Telescope: progress report
NASA Astrophysics Data System (ADS)
Tamai, R.; Spyromilio, J.
2014-07-01
The European Extremely Large Telescope is a project of the European Southern Observatory to build and operate a 40-m class optical near-infrared telescope. The telescope design effort is largely concluded and construction contracts are being placed with industry and academic/research institutes for the various components. The siting of the telescope in Northern Chile close to the Paranal site allows for an integrated operation of the facility providing significant economies. The progress of the project in various areas is presented in this paper and references to other papers at this SPIE meeting are made.
NASA Astrophysics Data System (ADS)
Song, Xuezhen; Dong, Baoli; Kong, Xiuqi; Wang, Chao; Zhang, Nan; Lin, Weiying
2018-01-01
Hypochlorite is one of the important reactive oxygen species (ROS) and plays critical roles in many biologically vital processes. Herein, we present a unique ratiometric fluorescent probe (CBP) with an extremely large emission shift for detecting hypochlorite in living cells. Utilizing a positively charged α,β-unsaturated carbonyl group as the reaction site, the probe CBP itself exhibited near-infrared (NIR) fluorescence at 662 nm, and displayed strong blue fluorescence at 456 nm upon reaction with hypochlorite. Notably, the extremely large emission shift of 206 nm could enable the precise measurement of the fluorescence peak intensities and ratios. CBP showed high sensitivity, excellent selectivity, desirable performance at physiological pH, and low cytotoxicity. The bioimaging experiments demonstrate the biological application of CBP for the ratiometric imaging of hypochlorite in living cells.
NASA Astrophysics Data System (ADS)
Korytárová, J.; Vaňková, L.
2017-10-01
This paper builds on the authors' previous research into the evaluation of the economic efficiency of transport infrastructure projects using the economic efficiency ratios NPV, IRR and BCR. Values of these indicators and subsequent outputs of the sensitivity analysis show extremely favourable values in some cases. The authors analysed these indicators down to the level of the input variables and examined which inputs have a larger share in these extreme values. The NCF for the calculation of the above-mentioned ratios is created by benefits that arise as the difference between the zero and investment options of the project (savings in travel and operating costs, savings in travel time costs, reduction in accident costs and savings in exogenous costs) as well as total agency costs. Savings in travel time costs, which contribute to the overall utility of projects by more than 70%, appear to be the most important benefit over a long-term horizon; this is the reason why this benefit is emphasized. The outcome of the article shows how the particular basic variables contribute to the overall robustness of the economic efficiency of these projects.
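The three indicators can be sketched for a stylized project with invented cash flows (NPV and BCR by direct discounting, IRR by bisection); none of the figures come from the paper:

```python
# Stylized transport-project appraisal (all figures invented for illustration).
invest = [100.0, 50.0] + [0.0] * 28     # agency costs, years 0-29
benefits = [0.0, 0.0] + [12.0] * 28     # annual benefits from year 2 onward
ncf = [b - c for b, c in zip(benefits, invest)]

def npv(rate, flows):
    """Net present value of a cash-flow list indexed by year."""
    return sum(f / (1.0 + rate) ** t for t, f in enumerate(flows))

def bcr(rate):
    """Benefit-cost ratio: PV of benefits over PV of costs."""
    return npv(rate, benefits) / npv(rate, invest)

def irr(flows, lo=0.0, hi=1.0, iters=100):
    """Internal rate of return by bisection (assumes NPV changes sign)."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if npv(mid, flows) > 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

print(round(npv(0.04, ncf), 1), round(bcr(0.04), 2), round(irr(ncf), 3))
```

A BCR above 1 and an IRR above the discount rate are two views of the same NPV > 0 condition, which is why sensitivity of the dominant benefit (travel time savings) moves all three indicators together.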
NASA Astrophysics Data System (ADS)
Dai, Yimian; Wu, Yiquan; Song, Yu; Guo, Jun
2017-03-01
To further enhance small targets and suppress heavy clutter simultaneously, a robust non-negative infrared patch-image model via partial sum minimization of singular values is proposed. First, the intrinsic reason behind the undesirable performance of the state-of-the-art infrared patch-image (IPI) model when facing extremely complex backgrounds is analyzed. We point out that it lies in the mismatch between the IPI model's implicit assumption of a large number of observations and the reality of deficient observations of strong edges. To fix this problem, instead of the nuclear norm, we adopt the partial sum of singular values to constrain the low-rank background patch-image, which provides a more accurate background estimation and almost eliminates all the salient residuals in the decomposed target image. In addition, considering the fact that an infrared small target is always brighter than its adjacent background, we propose an additional non-negative constraint on the sparse target patch-image, which not only removes more undesirable components but also accelerates the convergence rate. Finally, an algorithm based on the inexact augmented Lagrange multiplier method is developed to solve the proposed model. A large number of experiments demonstrate that the proposed model offers a significant improvement over nine competitive methods in terms of both clutter-suppression performance and convergence rate.
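The partial-sum regularizer is simple to state: sum the singular values beyond the largest N, so the dominant background directions go unpenalized; a toy numpy sketch (not the full IPI decomposition or the ALM solver):

```python
import numpy as np

rng = np.random.default_rng(7)

def partial_sum_sv(M, N):
    """Partial sum of singular values: the sum of all but the largest N.
    Used in place of the nuclear norm so the dominant background
    directions are not penalized."""
    s = np.linalg.svd(M, compute_uv=False)
    return s[N:].sum()

# Rank-2 "background" patch-image plus a few bright "small targets".
U = rng.standard_normal((60, 2))
V = rng.standard_normal((2, 80))
background = U @ V
targets = np.zeros_like(background)
targets[10, 20] = targets[40, 60] = 8.0

print(round(partial_sum_sv(background, 2), 3))            # ~0: exactly rank 2
print(round(partial_sum_sv(background + targets, 2), 2))  # targets add residual
```

In the full model this quantity replaces the nuclear norm in the low-rank term, so the ALM iterations shrink only the trailing singular values while leaving the strong background structure intact.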
The multiple facets of Peto's paradox: a life-history model for the evolution of cancer suppression.
Brown, Joel S; Cunningham, Jessica J; Gatenby, Robert A
2015-07-19
Large animals should have higher lifetime probabilities of cancer than small animals because each cell division carries an attendant risk of mutating towards a tumour lineage. However, this is not observed--a (Peto's) paradox that suggests large and/or long-lived species have evolved effective cancer suppression mechanisms. Using the Euler-Lotka population model, we demonstrate the evolutionary value of cancer suppression as determined by the 'cost' (decreased fecundity) of suppression versus the 'cost' of cancer (reduced survivorship). Body size per se will not select for sufficient cancer suppression to explain the paradox. Rather, cancer suppression should be most extreme when the probability of non-cancer death decreases with age (e.g. alligators), maturation is delayed, fecundity rates are low and fecundity increases with age. Thus, the value of cancer suppression is predicted to be lowest in the vole (short lifespan, high fecundity) and highest in the naked mole rat (long lived with late female sexual maturity). The life history of pre-industrial humans likely selected for quite low levels of cancer suppression. In modern humans, who live much longer, this level results in unusually high lifetime cancer risks. The model predicts a lifetime risk of 49% compared with the current empirical value of 43%.
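The Euler-Lotka identity sum_x l_x m_x exp(-r x) = 1 implicitly defines the population growth rate r; a sketch solving it by bisection for an invented life table (not the paper's parameterization):

```python
import math

# Toy life table (invented): survivorship l_x and fecundity m_x by age x.
ages = [1, 2, 3, 4, 5]
l = [0.8, 0.6, 0.4, 0.2, 0.1]   # probability of surviving to age x
m = [0.0, 1.0, 1.5, 1.5, 1.0]   # expected offspring at age x

def euler_lotka(r):
    """sum l_x * m_x * exp(-r*x) - 1; its root in r is the growth rate."""
    return sum(lx * mx * math.exp(-r * x) for x, lx, mx in zip(ages, l, m)) - 1.0

# Bisection: euler_lotka is strictly decreasing in r.
lo, hi = -1.0, 1.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if euler_lotka(mid) > 0:
        lo = mid
    else:
        hi = mid
r = 0.5 * (lo + hi)
print(round(r, 4))
```

Cancer suppression enters such a model by trading fecundity (lower m_x) against survivorship (higher l_x at late ages); its evolutionary value is the resulting change in r.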
NASA Technical Reports Server (NTRS)
Ashouri, Hamed; Sorooshian, Soroosh; Hsu, Kuo-Lin; Bosilovich, Michael G.; Lee, Jaechoul; Wehner, Michael F.; Collow, Allison
2016-01-01
This study evaluates the performance of NASA's Modern-Era Retrospective Analysis for Research and Applications (MERRA) precipitation product in reproducing the trend and distribution of extreme precipitation events. Utilizing the extreme value theory, time-invariant and time-variant extreme value distributions are developed to model the trends and changes in the patterns of extreme precipitation events over the contiguous United States during 1979-2010. The Climate Prediction Center (CPC) U.S. Unified gridded observation data are used as the observational dataset. The CPC analysis shows that the eastern and western parts of the United States are experiencing positive and negative trends in annual maxima, respectively. The continental-scale patterns of change found in MERRA seem to reasonably mirror the observed patterns of change found in CPC. This was not expected, given the difficulty in constraining precipitation in reanalysis products. MERRA tends to overestimate the frequency at which the 99th percentile of precipitation is exceeded because this threshold tends to be lower in MERRA, making it easier to be exceeded. This feature is dominant during the summer months. MERRA tends to reproduce spatial patterns of the scale and location parameters of the generalized extreme value and generalized Pareto distributions. However, MERRA underestimates these parameters, particularly over the Gulf Coast states, leading to lower magnitudes in extreme precipitation events. Two issues in MERRA are identified: 1) MERRA shows a spurious negative trend in Nebraska and Kansas, which is most likely related to changes in the satellite observing system over time that have apparently affected the water cycle in the central United States, and 2) the patterns of positive trend over the Gulf Coast states and along the East Coast seem to be correlated with the tropical cyclones in these regions. 
The analysis of the trends in the seasonal precipitation extremes indicates that the hurricane and winter seasons are contributing the most to these trend patterns in the southeastern United States. In addition, the increasing annual trend simulated by MERRA in the Gulf Coast region is due to an incorrect trend in winter precipitation extremes.
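A minimal sketch of the kind of block-maxima GEV fit described in the abstract above. The data here are synthetic stand-ins (not the study's CPC or MERRA data), and the parameter values are illustrative assumptions only:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic stand-in for 32 years (1979-2010) of annual daily-precipitation maxima [mm]
annual_maxima = rng.gumbel(loc=60.0, scale=12.0, size=32)

# scipy parameterizes the GEV as genextreme(c, loc, scale), where c is the
# negative of the usual shape parameter xi
shape, loc, scale = stats.genextreme.fit(annual_maxima)

# 100-year return level: the value exceeded with probability 1/100 in any year
return_level_100 = stats.genextreme.ppf(1 - 1 / 100, shape, loc=loc, scale=scale)
print(f"location={loc:.1f} mm, scale={scale:.1f} mm, shape(c)={shape:.2f}")
print(f"100-year return level ~ {return_level_100:.1f} mm/day")
```

The same pattern extends to the time-variant case by letting `loc` or `scale` depend on a covariate (e.g. year), which is how trends in extremes are typically modeled.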
Western US high June 2015 temperatures and their relation to global warming and soil moisture
NASA Astrophysics Data System (ADS)
Philip, Sjoukje Y.; Kew, Sarah F.; Hauser, Mathias; Guillod, Benoit P.; Teuling, Adriaan J.; Whan, Kirien; Uhe, Peter; Oldenborgh, Geert Jan van
2018-04-01
The Western US states Washington (WA), Oregon (OR) and California (CA) experienced extremely high temperatures in June 2015. The temperature anomalies were so extreme that they cannot be explained by global warming alone. We investigate the hypothesis that soil moisture also played an important role. We use a land surface model and a large ensemble from the weather@home modelling effort to investigate the coupling between soil moisture and temperature in a warming world. Both models show that May was anomalously dry, satisfying a prerequisite for the extreme heat wave, and they indicate that WA and OR are in a wet-to-dry transitional soil moisture regime. We use two different land surface-atmosphere coupling metrics to show that there was strong coupling between temperature, latent heat flux and the effect of soil moisture deficits on the energy balance in June 2015 in WA and OR. June temperature anomalies conditioned on wet/dry conditions show that both mean and extreme temperatures become hotter for dry soils, especially in WA and OR. Fitting a Gaussian model to temperatures with soil moisture as a covariate shows that the June 2015 temperature values fit well within the extrapolated empirical temperature/drought relationship. The high temperature anomalies in WA and OR are thus to be expected, given the dry soil moisture conditions and the fact that those regions are in the transition from a wet to a dry regime. CA is already in the dry regime, so taking soil moisture into account there is less important.
Extremal states of positive partial transpose in a system of three qubits
NASA Astrophysics Data System (ADS)
Steensgaard Garberg, Øyvind; Irgens, Børge; Myrheim, Jan
2013-03-01
We have studied mixed states in the system of three qubits with the property that all their partial transposes are positive; these are called PPT states. We classify a PPT state by the ranks of the state itself and its three single partial transposes. In random numerical searches, we find entangled PPT states with a large variety of rank combinations. For ranks equal to five or higher, we find both extremal and nonextremal PPT states of nearly every rank combination, with the restriction that the square sum of the four ranks of an extremal PPT state can be at most 193. We have studied especially the rank-four entangled PPT states, which are found to have rank four for every partial transpose. These states are all extremal because of the previously known result that every PPT state of rank three or less is separable. We find two distinct classes of rank-(4,4,4,4) entangled PPT states, identified by a real-valued quadratic expression invariant under local SL(2,C) transformations, mathematically equivalent to Lorentz transformations. This quadratic Lorentz invariant is nonzero for one class of states (type I in our terminology) and zero for the other class (type II). The previously known states based on unextendible product bases are a nongeneric subclass of the type-I states. We present analytical constructions of states of both types, general enough to reproduce all the rank-(4,4,4,4) PPT states we have found numerically. We cannot exclude the possibility that there exist nongeneric rank-four PPT states that we do not find in our random numerical searches.
Extreme Water Levels in Bangladesh: Past Trends, Future Projections and their Impact on Mortality
NASA Astrophysics Data System (ADS)
Thiele-Eich, I.; Burkart, K.; Hopson, T. M.; Simmer, C.
2014-12-01
Climate change is expected to have an impact on meteorological and therefore hydrological extremes, thereby possibly altering the vulnerability of exposed populations. Our study focuses on Bangladesh, which is particularly vulnerable to changes in extremes due to both the large population at risk and geographical characteristics such as the low-rising slope of the country through which the outflow of the combined catchments of the Ganges, Brahmaputra and Meghna rivers (GBM, ~1.75 million km2) is channeled. Time series of daily discharge and water level data for the past 100 years were analyzed with respect to trends in frequency, magnitude and duration, focusing on rare but particularly high-risk events using extreme-value theory. Mortality data are available for a five-year period (2003-2007), with a distributed lag non-linear model used to examine possible connections between extreme water levels and mortality. Then, using output from the Community Climate System Model CCSM4, projections were made regarding future flooding due to changes in precipitation intensity and frequency, while also accounting for the backwater effect of sea-level rise. For this, the upper catchment precipitation as well as monthly mean thermosteric sea-level rise at the river mouth outflow were taken from the four CCSM4 1° 20th Century ensemble members as well as from six CCSM4 1° ensemble members for the RCP scenarios RCP 2.6, 4.5, 6.0 and 8.5. Results show that while, e.g., the mean water level did not rise significantly during the past 100 years, a change in extreme water levels can be detected. In addition, annual minimum water levels have decreased, which is of particular importance as there is a significant connection to an increase in mortality for low water levels. While mortality does not seem to increase significantly due to extreme floods, our results indicate that return levels projected for the future shift progressively, with the effect being strongest for RCP 8.5.
Further measures to strengthen the resilience of the exposed population are therefore required to ensure that climate change effects do not overwhelm the population's coping capacities.
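For rare, high-risk events like those analyzed above, a common extreme-value approach is peaks-over-threshold with a generalized Pareto fit. The sketch below is illustrative only: the water-level series is synthetic, the threshold choice and record length are assumptions, and it is not the study's own method:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Synthetic stand-in for ~100 years of daily water levels [m]
daily_levels = rng.gamma(shape=4.0, scale=1.5, size=365 * 100)

# Declare the 99th percentile as the threshold and fit the exceedances
threshold = np.quantile(daily_levels, 0.99)
excesses = daily_levels[daily_levels > threshold] - threshold
c, _, gp_scale = stats.genpareto.fit(excesses, floc=0.0)

# T-year return level given the exceedance rate lam (exceedances per year)
lam = excesses.size / 100.0
T = 100.0
return_level = threshold + stats.genpareto.ppf(1 - 1 / (lam * T), c, loc=0.0, scale=gp_scale)
print(f"threshold = {threshold:.2f} m, {T:.0f}-year level ~ {return_level:.2f} m")
```

Trends like those the study reports would show up as a shift of `return_level` when the fit is repeated over moving windows of the record.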
Zhang, Mi; Wen, Xue Fa; Zhang, Lei Ming; Wang, Hui Min; Guo, Yi Wen; Yu, Gui Rui
2018-02-01
Extreme high temperature is one of the important weather extremes that affect the forest ecosystem carbon cycle. In this study, applying CO2 flux and routine meteorological data measured during 2003-2012, we examined the impacts of extreme high temperature and extreme high temperature events on the net carbon uptake of a subtropical coniferous plantation in Qianyanzhou. Combining this with wavelet analysis, we analyzed environmental controls on net carbon uptake at different temporal scales when extreme high temperature and extreme high temperature events occurred. The results showed that mean daily cumulative NEE decreased by 51% on days with daily maximum air temperature between 35 °C and 40 °C, compared with days in the 30-34 °C range. The effects of extreme high temperature and extreme high temperature events on monthly and annual NEE were related to the strength and duration of the event. In 2003, when a strong extreme high temperature event occurred, the sum of monthly cumulative NEE in July and August was only -11.64 g C·m⁻² over the two months, a decrease of 90% compared with the multi-year average; at the same time, the relative variation of annual NEE reached -6.7%. In July and August, when extreme high temperature and extreme high temperature events occurred, air temperature (Ta) and vapor pressure deficit (VPD) were the dominant controls on the daily variation of NEE; the coherency between NEE and Ta and between NEE and VPD was 0.97 and 0.95, respectively. At 8-, 16-, and 32-day periods, Ta, VPD, soil water content at 5 cm depth (SWC), and precipitation (P) controlled NEE, with the coherency between NEE and SWC and between NEE and P higher than 0.8 at the monthly scale.
The results indicated that atmospheric water deficit impacted NEE at short temporal scales when extreme high temperature and extreme high temperature events occurred, while both atmospheric water deficit and soil drought stress impacted NEE at longer temporal scales in this ecosystem.
Chen, Han-Kuang; Pemberton, Richard
2016-01-08
We report a case of a patient who presented with an extremely high serum prostate specific antigen (PSA) level and underwent radical prostatectomy for presumed prostate cancer. Surprisingly, the whole mount prostatectomy specimen showed only small volume, organ-confined prostate adenocarcinoma and a large, benign intraprostatic cyst, which was thought to be responsible for the PSA elevation. 2016 BMJ Publishing Group Ltd.
20 years of mass balances on the Piloto glacier, Las Cuevas river basin, Mendoza, Argentina
NASA Astrophysics Data System (ADS)
Leiva, J. C.; Cabrera, G. A.; Lenzano, L. E.
2007-10-01
Climatic changes of the 20th century have altered the water cycle in the Andean basins of central Argentina. The most visible change is seen in the mountain glaciers, which have lost part of their mass due to decreasing thickness and a substantial recession in the last 100 years. This paper briefly describes the results of glacier mass balance research since 1979 on the Piloto Glacier at the Cajón del Rubio, in the headwaters of Las Cuevas River, presenting new results for the period 1997-2003. Very large interannual variability of the net annual specific balance is evident, due largely to variations in winter snow accumulation, with a maximum net annual value of +151 cm w.e. and a minimum value of -230 cm w.e. Wet El Niño years are normally associated with positive net annual balances, while dry La Niña years generally result in negative balances. Within the 24-year period, 67% of the years show negative net annual specific balances, with a cumulative mass balance loss of -10.50 m water equivalent (w.e.). Apart from exceptions normally related to El Niño events, a general decreasing trend of winter snow accumulation is evident in the record, particularly after 1992, which has a strong effect on the overall negative mass balance values. The glacier contribution to Las Cuevas River runoff is analysed based on the Punta de Vacas River gauge station for a hypothetical year without snow precipitation (YWSP), when the snowmelt component is zero. Extremely dry years similar to a YWSP occurred in 1968-1969, 1969-1970 and 1996-1997. The Punta de Vacas gauge station is located 62 km downstream from Piloto Glacier, and the basin contains 3.0% uncovered glacier ice and 3.7% debris-covered ice. The total glacier contribution to Las Cuevas River discharge is calculated as 82 ± 8% during extremely dry years.
If glacier wastage continues at the present trend as observed during the last 2 decades, it will severely affect the water resources in the arid central Andes of Argentina.
NASA Astrophysics Data System (ADS)
Kamenetsky, V. S.; Norman, M. D.; Garcia, M. O.
2002-12-01
Melt inclusions carry potentially unique information about magmatic processes and the compositional evolution of erupted lavas. Major element compositions of olivine-hosted melt inclusions in submarine tholeiitic picrites from the southwest rift zone of Mauna Loa volcano have been studied to examine the compositional variability of primitive magmas feeding the world's largest volcano. Approximately 600 naturally quenched inclusions were examined from 8 samples with 3-25 vol% olivine phenocrysts and 9-22 wt% MgO. Olivine compositions ranged from Fo91 to Fo82. The inclusions show a continuous variation in FeO contents from near-magmatic values (9 to 11 wt%) in the most evolved olivines to extremely low values (3.5 to 7.0 wt%) in the most primitive olivines. This appears to reflect a complex magmatic history for these crystals involving extensive re-equilibration of melts trapped by early-formed phenocrysts with their host olivine. Extreme compositional variability also characterizes incompatible elements that would not be affected by equilibration with the host olivine. Inclusions trapped in relatively primitive olivines (Fo88-91) show a large range of K2O contents (0.1 to 2.1 wt%), whereas inclusions in more evolved olivines converge on whole rock compositions with 0.3 to 0.4 wt% K2O. Similarly, TiO2/K2O, Na2O/K2O, and K2O/P2O5 ratios of inclusions in primitive olivines span a much larger range than do inclusions hosted by more evolved olivines, with TiO2/K2O ratios extending from enriched to depleted compositions (1.2 to 24.7) in primitive olivines, and converging on whole rock compositions (TiO2/K2O = 6-9) in more evolved host olivine. This points toward extreme compositional variability in melts feeding Mauna Loa, and effective mixing of these melt parcels in the shallower summit reservoir to produce the restricted range of whole rock compositions sampled by erupted lavas.
Whole rock compositions therefore provide an integrated view of melting and high-level mixing processes, whereas melt inclusions provide more detailed information about source characteristics.
Extreme storm surge and wind wave climate scenario simulations at the Venetian littoral
NASA Astrophysics Data System (ADS)
Lionello, P.; Galati, M. B.; Elvini, E.
Scenario climate projections for extreme marine storms producing storm surges and wind waves are very important for the northern flat coast of the Adriatic Sea, where the area at risk includes a unique cultural and environmental heritage, and important economic activities. This study uses a shallow water model and a spectral wave model for computing the storm surge and the wind wave field, respectively, from the sea level pressure and wind fields that have been computed by the RegCM regional climate model. Simulations cover the period 1961-1990 for the present climate (control simulations) and the period 2071-2100 for the A2 and B2 scenarios. Generalized Extreme Value analysis is used for estimating values for the 10- and 100-year return times. These modeling tools are shown to be adequate for a reliable estimation of the climate change signal without further downscaling. However, this study has mainly a methodological value, because issues such as interdecadal variability and intermodel variability cannot be addressed, since the analysis is based on single-model 30-year long simulations. The control simulation looks reasonably accurate for extreme value analysis, though it overestimates/underestimates the frequency of high/low surge and wind wave events with respect to observations. Scenario simulations suggest a higher frequency of intense storms for the B2 scenario, but not for the A2. Likely, these differences are not the effect of climate change but of multidecadal climate variability. Extreme storms are stronger in future scenarios, but the differences are not statistically significant. Therefore this study does not provide convincing evidence for more stormy conditions in future scenarios.
[Quantitative Evaluation of Metal Artifacts on CT Images on the Basis of Statistics of Extremes].
Kitaguchi, Shigetoshi; Imai, Kuniharu; Ueda, Suguru; Hashimoto, Naomi; Hattori, Shouta; Saika, Takahiro; Ono, Yoshifumi
2016-05-01
It is well known that metal artifacts have a harmful effect on the image quality of computed tomography (CT) images. However, their physical properties remain poorly understood. In this study, we investigated the relationship between metal artifacts and tube currents using statistics of extremes. A commercially available CT dose index phantom, 160 mm in diameter, was prepared, and a brass rod 13 mm in diameter was placed at the centerline of the phantom. This phantom was used as a target object to evaluate metal artifacts and was scanned using an area detector CT scanner with various tube currents under a constant tube voltage of 120 kV. Sixty parallel line segments with a length of 100 pixels were placed to cross the metal artifacts on CT images, and the largest difference between two adjacent CT values in each of the 60 CT value profiles was employed as a feature variable for measuring metal artifacts; these feature variables were analyzed on the basis of extreme value theory. The CT value variation induced by metal artifacts was statistically characterized by the Gumbel distribution, one of the extreme value distributions; namely, metal artifacts have the same statistical characteristics as streak artifacts. Therefore, the Gumbel evaluation method makes it possible to analyze not only streak artifacts but also metal artifacts. Furthermore, the location parameter of the Gumbel distribution was shown to be inversely proportional to the square root of the tube current. This result suggests that metal artifacts have the same dose dependence as image noise.
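The Gumbel evaluation described above, fitting the largest adjacent-pixel CT-value differences, can be sketched as follows. All numbers here are synthetic assumptions, not the study's measurements:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Synthetic stand-in: largest adjacent-pixel CT-value jump on each of the
# 60 profiles crossing the artifact region [HU]
max_jumps = rng.gumbel(loc=180.0, scale=25.0, size=60)

# Fit the Gumbel (right-skewed extreme value) distribution
loc, scale = stats.gumbel_r.fit(max_jumps)
print(f"Gumbel location ~ {loc:.0f} HU, scale ~ {scale:.0f} HU")

# The reported finding implies location(mA) proportional to 1/sqrt(mA); given a
# fitted location at a reference current, the level at another current follows:
mA_ref, mA_new = 100.0, 400.0   # hypothetical tube currents
loc_predicted = loc * np.sqrt(mA_ref / mA_new)
```

Quadrupling the tube current would thus be expected to halve the Gumbel location parameter, mirroring the familiar noise-dose relationship.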
NASA Astrophysics Data System (ADS)
Felton, A. J.; Smith, M. D.
2016-12-01
Heightened climatic variability due to atmospheric warming is forecast to increase the frequency and severity of climate extremes. In particular, changes to interannual variability in precipitation, characterized by increases in extreme wet and dry years, are likely to impact virtually all terrestrial ecosystem processes. However, to date experimental approaches have yet to explicitly test how ecosystem processes respond to multiple levels of climatic extremity, limiting our understanding of how ecosystems will respond to forecast increases in the magnitude of climate extremes. Here we report the results of a replicated regression experimental approach, in which we imposed 9 and 11 levels of growing season precipitation amount and extremity in mesic grassland during 2015 and 2016, respectively. Each level corresponded to a specific percentile of the long-term record, which produced a large gradient of soil moisture conditions that ranged from extreme wet to extreme dry. In both 2015 and 2016, asymptotic responses to water availability were observed for soil respiration. This asymmetry was driven in part by transitions between soil moisture versus temperature constraints on respiration as conditions became increasingly dry versus increasingly wet. In 2015, aboveground net primary production (ANPP) exhibited asymmetric responses to precipitation that largely mirrored those of soil respiration. In total, our results suggest that in this mesic ecosystem, these two carbon cycle processes were more sensitive to extreme drought than to extreme wet years. Future work will assess ANPP responses for 2016, soil nutrient supply and physiological responses of the dominant plant species. Future efforts are needed to compare our findings across a diverse array of ecosystem types, and in particular how the timing and magnitude of precipitation events may modify the response of ecosystem processes to increasing magnitudes of precipitation extremes.
Jimsphere wind and turbulence exceedance statistic
NASA Technical Reports Server (NTRS)
Adelfang, S. I.; Court, A.
1972-01-01
Exceedance statistics of winds and gusts observed over Cape Kennedy with Jimsphere balloon sensors are described. Gust profiles containing positive and negative departures from smoothed profiles, in the wavelength ranges 100-2500, 100-1900, 100-860, and 100-460 meters, were computed from 1578 profiles with four 41-weight digital high-pass filters. Extreme values of the square root of gust speed are normally distributed. Monthly and annual exceedance probability distributions of normalized rms gust speeds in three altitude bands (2-7, 6-11, and 9-14 km) are log-normal. The rms gust speeds are largest in the 100-2500 m wavelength band between 9 and 14 km in late winter and early spring. A study of monthly and annual exceedance probabilities and the number of occurrences per kilometer of level crossings with positive slope indicates significant variability with season, altitude, and filter configuration. A decile sampling scheme is tested and an optimum approach is suggested for drawing a relatively small random sample that represents the characteristic extreme wind speeds and shears of a large parent population of Jimsphere wind profiles.
LYMAN, EMILY L.; LUTHAR, SUNIYA S.
2015-01-01
This study involved two academically-gifted samples of 11th and 12th grade youth at the socioeconomic status (SES) extremes; one from an exclusive private, affluent school, and the other from a magnet school with low-income students. Negative and positive adjustment outcomes were examined in relation to multiple dimensions of perfectionism including perceived parental pressures to be perfect, personal perfectionistic self-presentation, and envy of peers. The low-income students showed some areas of relative vulnerability, but when large group differences were found, it was the affluent youth who were at a disadvantage, with substantially higher substance use and peer envy. Affluent girls seemed particularly vulnerable, with pronounced elevations in perfectionistic tendencies, peer envy, as well as body dissatisfaction. Examination of risk and protective processes showed that relationships with mothers were associated with students’ distress as well as positive adjustment. Additionally, findings showed links between (a) envy of peers and multiple outcomes (among high SES girls in particular), (b) dimensions of perfectionism in relation to internalizing symptoms, and (c) high extrinsic versus intrinsic values in relation to externalizing symptoms. PMID:26345229
NASA Astrophysics Data System (ADS)
Pradas, Marc; Pumir, Alain; Huber, Greg; Wilkinson, Michael
2017-07-01
Chaos is widely understood as being a consequence of sensitive dependence upon initial conditions. This is the result of an instability in phase space, which separates trajectories exponentially. Here, we demonstrate that this criterion should be refined. Despite their overall intrinsic instability, trajectories may be very strongly convergent in phase space over extremely long periods, as revealed by our investigation of a simple chaotic system (a realistic model for small bodies in a turbulent flow). We establish that this strong convergence is a multi-facetted phenomenon, in which the clustering is intense, widespread and balanced by lacunarity of other regions. Power laws, indicative of scale-free features, characterize the distribution of particles in the system. We use large-deviation and extreme-value statistics to explain the effect. Our results show that the interpretation of the ‘butterfly effect’ needs to be carefully qualified. We argue that the combination of mixing and clustering processes makes our specific model relevant to understanding the evolution of simple organisms. Lastly, this notion of convergent chaos, which implies the existence of conditions for which uncertainties are unexpectedly small, may also be relevant to the valuation of insurance and futures contracts.
Inefficient star formation in extremely metal poor galaxies.
Shi, Yong; Armus, Lee; Helou, George; Stierwalt, Sabrina; Gao, Yu; Wang, Junzhi; Zhang, Zhi-Yu; Gu, Qiusheng
2014-10-16
The first galaxies contain stars born out of gas with few or no 'metals' (that is, elements heavier than helium). The lack of metals is expected to inhibit efficient gas cooling and star formation, but this effect has yet to be observed in galaxies with an oxygen abundance (relative to hydrogen) below a tenth of that of the Sun. Extremely metal poor nearby galaxies may be our best local laboratories for studying in detail the conditions that prevailed in low metallicity galaxies at early epochs. Carbon monoxide emission is unreliable as a tracer of gas at low metallicities, and while dust has been used to trace gas in low-metallicity galaxies, low spatial resolution in the far-infrared has typically led to large uncertainties. Here we report spatially resolved infrared observations of two galaxies with oxygen abundances below ten per cent of the solar value, and show that stars formed very inefficiently in seven star-forming clumps in these galaxies. The efficiencies are less than a tenth of those found in normal, metal rich galaxies today, suggesting that star formation may have been very inefficient in the early Universe.
Assessing Regional Scale Variability in Extreme Value Statistics Under Altered Climate Scenarios
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brunsell, Nathaniel; Mechem, David; Ma, Chunsheng
Recent studies have suggested that low-frequency modes of climate variability can significantly influence regional climate. The climatology associated with extreme events has been shown to be particularly sensitive. This has profound implications for droughts, heat waves, and food production. We propose to examine regional climate simulations conducted over the continental United States by applying a recently developed technique which combines wavelet multi-resolution analysis with information theory metrics. This research is motivated by two fundamental questions concerning the spatial and temporal structure of extreme events: 1) what temporal scales of the extreme value distributions are most sensitive to alteration by low-frequency climate forcings, and 2) what is the nature of the spatial structure of variation in these timescales? The primary objective is to assess to what extent information theory metrics can be useful in characterizing the nature of extreme weather phenomena. Specifically, we hypothesize that (1) changes in the nature of extreme events will impact the temporal probability density functions and that information theory metrics will be sensitive to these changes, and (2) via a wavelet multi-resolution analysis, we will be able to characterize the relative contribution of different timescales to the stochastic nature of extreme events. In order to address these hypotheses, we propose a unique combination of an established regional climate modeling approach and advanced statistical techniques to assess the effects of low-frequency modes on climate extremes over North America. The behavior of climate extremes in RCM simulations for the 20th century will be compared with statistics calculated from the United States Historical Climatology Network (USHCN) and simulations from the North American Regional Climate Change Assessment Program (NARCCAP).
This effort will serve to establish the baseline behavior of climate extremes, the validity of an innovative multi-resolution information theory approach, and the ability of the RCM modeling framework to represent the low-frequency modulation of extreme climate events. Once the skill of the modeling and analysis methodology has been established, we will apply the same approach to the AR5 (IPCC Fifth Assessment Report) climate change scenarios in order to assess how climate extremes, and the influence of low-frequency variability on climate extremes, might vary under a changing climate. The research specifically addresses DOE focus area 2, simulation of climate extremes under a changing climate. Specific results will include (1) a better understanding of the spatial and temporal structure of extreme events, (2) a thorough quantification of how extreme values are impacted by low-frequency climate teleconnections, (3) increased knowledge of the ability of current regional climate models to ascertain these influences, and (4) a detailed examination of how the distributions of extreme events are likely to change under different climate change scenarios. In addition, this research will assess the ability of the innovative wavelet information theory approach to characterize extreme events. Any and all of these results will greatly enhance society's ability to understand and mitigate the regional ramifications of future global climate change.
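One way the proposed combination of wavelet multi-resolution analysis and information theory metrics can work is to decompose a series by timescale and compute the Shannon entropy of the energy distribution across scales. The sketch below uses a hand-rolled Haar decomposition on a synthetic series; it is an illustration of the general idea, not the project's actual technique:

```python
import numpy as np

def haar_detail_energies(x):
    """Haar multi-resolution decomposition: energy of the detail
    (high-pass) coefficients at each dyadic timescale, finest first."""
    energies = []
    a = np.asarray(x, dtype=float)
    while a.size >= 2:
        if a.size % 2:          # drop a trailing sample if length is odd
            a = a[:-1]
        d = (a[0::2] - a[1::2]) / np.sqrt(2)  # detail coefficients
        a = (a[0::2] + a[1::2]) / np.sqrt(2)  # approximation, carried down
        energies.append(np.sum(d**2))
    return np.array(energies)

rng = np.random.default_rng(3)
series = np.cumsum(rng.normal(size=1024))  # synthetic climate-like record

e = haar_detail_energies(series)
p = e / e.sum()                              # energy share per timescale
entropy = -np.sum(p * np.log2(p + 1e-12))    # Shannon entropy of that share
print(f"scales={len(e)}, wavelet energy entropy ~ {entropy:.2f} bits")
```

A change in this entropy between climate scenarios would indicate a redistribution of variability across timescales, which is the kind of signal the hypotheses above target.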
Wang, Jinghong; Lo, Siuming; Wang, Qingsong; Sun, Jinhua; Mu, Honglin
2013-08-01
Crowd density is a key factor that influences the moving characteristics of a large group of people during a large-scale evacuation. In this article, the macro features of crowd flow and subsequent rescue strategies were considered, and a series of characteristic crowd densities that affect large-scale people movement, as well as the maximum bearing density when the crowd is extremely congested, were analyzed. On the basis of characteristic crowd densities, queuing theory was applied to simulate crowd movement. Accordingly, the moving characteristics of the crowd and the effects of typical crowd density, which is viewed as the representation of the crowd's arrival intensity in front of the evacuation passageways, on rescue strategies were studied. Furthermore, a "risk axle of crowd density" is proposed to determine the efficiency of rescue strategies in a large-scale evacuation, i.e., whether the rescue strategies are able to effectively maintain or improve evacuation efficiency. Finally, through some rational hypotheses for the value of evacuation risk, a three-dimensional distribution of the evacuation risk is established to illustrate the risk axle of crowd density. This work aims to provide a macro-level, but original, analysis of the risk of large-scale crowd evacuation from the perspective of the efficiency of rescue strategies. © 2012 Society for Risk Analysis.
Nakajima, Arata; Aoki, Yasuchika; Sonobe, Masato; Takahashi, Hiroshi; Saito, Masahiko; Terayama, Keiichiro; Nakagawa, Koichi
2016-07-01
Radiographic progression of damage to the small joints in patients with rheumatoid arthritis (RA) is well known; however, it has not been studied fully in the large joints. In this study, we looked at the prevalence of radiographic progression of large joint damage in patients with RA treated with biological disease-modifying anti-rheumatic drugs (bDMARDs). A total of 273 large joints in the upper and lower extremities of 67 patients with RA treated with bDMARDs were investigated. Radiographs for tender and/or swollen large joints were taken at least twice during the study period (mean 18.6 months), and the progression of damage was evaluated. Progressive damage was found in 20.9% of patients and 6.2% of joints. A multivariate analysis revealed that the Larsen grade (LG) alone was a risk factor for progressive damage. The LG cutoff value was determined to be 2.5 (sensitivity: 0.529, specificity: 0.805). The only factor to predict progressive damage was the LG of the joints with symptoms, and the damage must be stopped within LG II. Regular radiographic examinations for large joints should be performed in addition to routine examinations for small joints, such as the hand and foot.
NASA Astrophysics Data System (ADS)
Fink, G.; Koch, M.
2010-12-01
An important aspect of water resources and hydrological engineering is the assessment of hydrological risk due to the occurrence of extreme events, e.g. droughts or floods. When dealing with the latter - as is the focus here - the classical methods of flood frequency analysis (FFA) are usually used for the proper dimensioning of a hydraulic structure, with the purpose of bringing the flood risk down to an acceptable level. FFA is based on extreme value statistics. Despite the progress of methods in this scientific branch, the selection and fitting of an appropriate distribution function still remains a challenge, particularly when certain underlying assumptions of the theory are not met in real applications. This is, for example, the case when the stationarity condition for a random flood time series is no longer satisfied, as could be the situation when long-term hydrological impacts of future climate change are to be considered. The objective here is to verify the applicability of classical (stationary) FFA to predicted flood time series in the Fulda catchment in central Germany, as they may occur in the wake of climate change during the 21st century. These discharge time series at the outlet of the Fulda basin have been simulated with a distributed hydrological model (SWAT) forced by predicted climate variables of a regional climate model for Germany (REMO). From the simulated future daily time series, annual maximum (extreme) values are computed and analyzed for the purpose of risk evaluation. Although the 21st-century estimated extreme flood series of the Fulda river turn out to be only mildly non-stationary, seemingly alleviating the need for further action and concern at first sight, a more detailed analysis of the risk, as quantified, for example, by the return period, shows non-negligible differences in the calculated risk levels.
This could be verified by employing a new method, the so-called flood series maximum analysis (FSMA) method, which consists of the stochastic simulation of numerous trajectories of a stochastic process with a given GEV distribution over a certain length of time (longer than the desired return period). Then the maximum value of each trajectory is computed, and all of these maxima are used to determine the empirical distribution of the maximum series. Through graphical inversion of this distribution function, the size of the design flood for a given risk (quantile) and a given life duration can be inferred. The results of numerous simulations show that for stationary flood series, the new FSMA method results, expectedly, in nearly identical risk values as the classical FFA approach. However, once the flood time series becomes slightly non-stationary - for reasons as discussed - and regardless of whether the trend is increasing or decreasing, large differences in the computed risk values for a given design flood occur. In other words, for the same risk, the new FSMA method would lead to different values of the design flood for a hydraulic structure than the classical FFA method. This, in turn, could lead to some cost savings in the realization of a hydraulic project.
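The FSMA procedure described above can be sketched in a few lines; the GEV parameters, lifetime, and risk level below are invented for illustration, and only the stationary case is shown.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(42)

def fsma_design_flood(shape, loc, scale, life_years, risk, n_traj=10_000):
    """Flood series maximum analysis (FSMA), as sketched in the abstract:
    simulate many trajectories of annual maxima from a GEV distribution,
    take the maximum of each trajectory, and read the design flood off
    the empirical distribution of those maxima."""
    # One trajectory = `life_years` annual maxima; one maximum per trajectory.
    sims = genextreme.rvs(shape, loc=loc, scale=scale,
                          size=(n_traj, life_years), random_state=rng)
    traj_max = sims.max(axis=1)
    # Design flood for acceptance risk `risk` over the structure's lifetime:
    # the (1 - risk) quantile of the maxima distribution.
    return float(np.quantile(traj_max, 1.0 - risk))

# Invented GEV parameters for an annual-maximum discharge series (m^3/s).
q_design = fsma_design_flood(shape=-0.1, loc=500.0, scale=120.0,
                             life_years=100, risk=0.1)
```

Accepting a larger lifetime risk yields a smaller design flood, which is the trade-off the graphical inversion exposes.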
NASA Astrophysics Data System (ADS)
Wu, Chunhung
2016-04-01
Few studies have discussed the applicability of statistical landslide susceptibility (LS) models to extreme rainfall-induced landslide events. This research focuses on the comparison and applicability of LS models based on four methods - landslide ratio-based logistic regression (LRBLR), frequency ratio (FR), weight of evidence (WOE), and instability index (II) - in an extreme rainfall-induced landslide case. The landslide inventory of the Chishan river watershed, Southwestern Taiwan, after 2009 Typhoon Morakot is the main material in this research. The Chishan river watershed is a tributary of the Kaoping river watershed, a landslide- and erosion-prone watershed with an annual average suspended load of 3.6×10^7 MT/yr (ranking 11th in the world). Typhoon Morakot struck Southern Taiwan from Aug. 6-10, 2009, and dumped nearly 2,000 mm of rainfall on the Chishan river watershed. The 24-hour, 48-hour, and 72-hour accumulated rainfalls in the Chishan river watershed exceeded the 200-year return period accumulated rainfall. A total of 2,389 landslide polygons in the Chishan river watershed were extracted from SPOT 5 images after 2009 Typhoon Morakot. The total landslide area is around 33.5 km², equal to a landslide ratio of 4.1%. The main landslide types, based on Varnes' (1978) classification, are rotational and translational slides. The two characteristics of this extreme rainfall-induced landslide event are a dense landslide distribution and a large share of downslope landslide areas owing to headward erosion and bank erosion during the flooding processes. The downslope landslide area in the Chishan river watershed after 2009 Typhoon Morakot is 3.2 times larger than the upslope landslide area. The prediction accuracies of LS models based on the LRBLR, FR, WOE, and II methods have been proven to be over 70%.
The model performance and applicability of the four models in a landslide-prone watershed with a dense distribution of rainfall-induced landslides are therefore of particular interest. Eight landslide-related factors - elevation, slope, aspect, geology, accumulated rainfall during 2009 Typhoon Morakot, land use, distance to faults, and distance to rivers - were considered in this research. The research builds LS maps based on the four methods and compares their differences. The average LS value from each method is 0.27 for LRBLR, 0.368 for FR, 0.553 for WOE, and 0.498 for II. A correlation analysis was conducted to identify similarities between the four LS maps. The correlation coefficients are 0.913, 0.829, 0.930, 0.756, 0.729, and 0.652 for LRBLR vs. FR, LRBLR vs. WOE, FR vs. WOE, LRBLR vs. II, FR vs. II, and WOE vs. II, respectively. The research compares the model performance of the four LS maps by calculating the AUC value (area under the ROC curve) and the ACR value (average correct-predicted ratio). The AUC values of the LS maps based on the LRBLR, FR, WOE, and II methods are 0.819, 0.819, 0.822, and 0.785, respectively; the corresponding ACR values are 75.1%, 73.7%, 68.4%, and 64.2%. The results indicate that the model performance of the LRBLR method in an extreme rainfall-induced landslide event is better than that of the other three methods.
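The AUC comparison above can be reproduced in principle with a rank-based computation; the susceptibility values and landslide labels below are toy data, not the Chishan inventory.

```python
import numpy as np
from scipy.stats import rankdata

def auc_score(susceptibility, landslide):
    """Area under the ROC curve via the Mann-Whitney U statistic: the
    probability that a randomly chosen landslide cell has a higher
    susceptibility value than a randomly chosen stable cell."""
    s = np.asarray(susceptibility, dtype=float)
    y = np.asarray(landslide, dtype=bool)
    n_pos, n_neg = int(y.sum()), int((~y).sum())
    ranks = rankdata(s)                 # midranks handle tied scores
    u = ranks[y].sum() - n_pos * (n_pos + 1) / 2.0
    return u / (n_pos * n_neg)

# Toy data: susceptibility is mostly, but not perfectly, higher on landslide cells.
s = np.array([0.9, 0.8, 0.7, 0.4, 0.3, 0.2])
y = np.array([1, 1, 0, 1, 0, 0])
auc = auc_score(s, y)                   # 8/9 for this toy example
```

An AUC of 0.5 means the map is no better than chance; values around 0.8, as reported above, indicate good discrimination.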
Influence of climate variability versus change at multi-decadal time scales on hydrological extremes
NASA Astrophysics Data System (ADS)
Willems, Patrick
2014-05-01
Recent studies have shown that rainfall and hydrological extremes do not occur randomly in time, but are subject to multidecadal oscillations. In addition to these oscillations, there are temporal trends due to climate change. Design statistics, such as intensity-duration-frequency (IDF) for extreme rainfall or flow-duration-frequency (QDF) relationships, are affected by both types of temporal changes (short term and long term). This presentation discusses these changes, how they influence water engineering design and decision making, and how this influence can be assessed and taken into account in practice. The multidecadal oscillations in rainfall and hydrological extremes were studied based on a technique for the identification and analysis of changes in extreme quantiles. The statistical significance of the oscillations was evaluated by means of a non-parametric bootstrapping method. Oscillations in large-scale atmospheric circulation were identified as the main drivers of the temporal oscillations in rainfall and hydrological extremes. They also explain why spatial phase shifts (e.g. north-south variations in Europe) exist between the oscillation highs and lows. In addition to the multidecadal climate oscillations, several stations show trends during the most recent decades, which may be attributed to climate change as a result of anthropogenic global warming. Such attribution to anthropogenic global warming is, however, uncertain. It can be attempted based on simulation results with climate models, but it is shown that the climate model results are too uncertain to enable a clear attribution. Water engineering design statistics, such as extreme rainfall IDF or peak or low flow QDF statistics, are obviously influenced by these temporal variations (oscillations, trends).
It is shown in the paper, based on the Brussels 10-minute rainfall data, that rainfall design values may be biased by about 20% when based on short rainfall series of 10 to 15 years in length, and still by about 8% for series of 25 years. Methods for bias correction are demonstrated. The definition of "bias" depends on a number of factors, which needs further debate in the hydrological and water engineering community. References: Willems, P. (2013). 'Multidecadal oscillatory behaviour of rainfall extremes in Europe', Climatic Change, 120(4), 931-944. Willems, P. (2013). 'Adjustment of extreme rainfall statistics accounting for multidecadal climate oscillations', Journal of Hydrology, 490, 126-133. Willems, P., Olsson, J., Arnbjerg-Nielsen, K., Beecham, S., Pathirana, A., Bülow Gregersen, I., Madsen, H., Nguyen, V-T-V. (2012). 'Impacts of climate change on rainfall extremes and urban drainage', IWA Publishing, 252 p. Paperback ISBN 9781780401256; Ebook ISBN 9781780401263.
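The sensitivity of design values to record length, of the kind quantified above for the Brussels data, can be illustrated by refitting a GEV distribution to synthetic records of different lengths. The "true" GEV parameters below are invented, and this sketch shows estimation scatter rather than the paper's bias-correction method.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)
# Assumed "true" GEV for annual-maximum 10-minute rainfall depth (mm); invented values.
c_true, loc_true, scale_true = -0.1, 20.0, 5.0
t20_true = genextreme.ppf(1 - 1 / 20, c_true, loc=loc_true, scale=scale_true)

def t20_relative_spread(record_years, n_rep=100):
    """Refit a GEV to many synthetic records of a given length and return the
    relative scatter of the estimated 20-year design rainfall."""
    estimates = []
    for _ in range(n_rep):
        sample = genextreme.rvs(c_true, loc=loc_true, scale=scale_true,
                                size=record_years, random_state=rng)
        c_hat, loc_hat, scale_hat = genextreme.fit(sample)
        estimates.append(genextreme.ppf(1 - 1 / 20, c_hat,
                                        loc=loc_hat, scale=scale_hat))
    return float(np.std(estimates) / t20_true)

# Short records scatter far more than long ones, mirroring the errors the
# abstract reports for 10-15 year series versus longer ones.
spread_15, spread_100 = t20_relative_spread(15), t20_relative_spread(100)
```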
Pb isotopes of Gorgona Island (Colombia): isotopic variations correlated with magma type
NASA Astrophysics Data System (ADS)
Dupré, B.; Echeverría, L. M.
1984-02-01
Lead isotopic results obtained on komatiites and basalts from Gorgona Island provide evidence of large isotopic variations within a restricted area (8 × 2.5 km). The variations are correlated with differences in volcanic rock type. The highest isotopic ratios (²⁰⁶Pb/²⁰⁴Pb ≈ 19.75) correspond to the tholeiites which make up most of the island. The lowest ratios (18.3) correspond to the komatiites of the west coast of the island. Other rock types (komatiites of the east coast, K-tholeiites, picrites and tuffs) have isotopic characteristics intermediate between these two extreme values. These results are explained by the existence of two distinct mantle source regions, and by mixing or contamination between them.
Solar rotation effects on the thermospheres of Mars and Earth.
Forbes, Jeffrey M; Bruinsma, Sean; Lemoine, Frank G
2006-06-02
The responses of Earth's and Mars' thermospheres to the quasi-periodic (27-day) variation of solar flux due to solar rotation were measured contemporaneously, revealing that this response is twice as large for Earth as for Mars. Per typical 20-unit change in 10.7-centimeter radio flux (used as a proxy for extreme ultraviolet flux) reaching each planet, we found temperature changes of 42.0 +/- 8.0 kelvin and 19.2 +/- 3.6 kelvin for Earth and Mars, respectively. Existing data for Venus indicate values of 3.6 +/- 0.6 kelvin. Our observational result constrains comparative planetary thermosphere simulations and may help resolve existing uncertainties in thermal balance processes, particularly CO2 cooling.
Sun, Xu; Dai, Daoxin; Thylén, Lars; Wosinski, Lech
2015-10-05
A Mach-Zehnder interferometer (MZI) liquid sensor employing an ultra-compact double-slot hybrid plasmonic (DSHP) waveguide as the active sensing arm is developed. Numerical results show that an extremely large optical confinement factor in the tested analytes (as high as 88%) can be obtained with a DSHP waveguide of optimized geometrical parameters, larger than for both conventional SOI waveguides and plasmonic slot waveguides of the same width. For an MZI sensor with a 40 μm long DSHP active sensing area, the sensitivity can reach values as high as 1061 nm/RIU (refractive index unit). The total loss, excluding the coupling loss of the grating coupler, is around 4.5 dB.
Using ensembles in water management: forecasting dry and wet episodes
NASA Astrophysics Data System (ADS)
van het Schip-Haverkamp, Tessa; van den Berg, Wim; van de Beek, Remco
2015-04-01
Extreme weather situations such as droughts and extensive precipitation are becoming more frequent, which makes it more important to obtain accurate weather forecasts for the short and long term. Ensembles can provide a solution in the form of scenario forecasts. MeteoGroup uses ensembles in a new forecasting technique that presents a number of weather scenarios for a dynamical water management project, called Water-Rijk, in which water storage and water retention play a large role. The Water-Rijk is part of Park Lingezegen, located between Arnhem and Nijmegen in the Netherlands. In collaboration with the University of Wageningen, Alterra and Eijkelkamp, a forecasting system is being developed for this area which can provide water boards with a number of weather and hydrology scenarios in order to assist in the decision whether or not water retention or water storage is necessary in the near future. In order to forecast drought and extensive precipitation, the difference 'precipitation - evaporation' is used as a measure of drought in the weather forecasts. In case of an upcoming drought this difference will take larger negative values; in case of a wet episode, it will be positive. The Makkink potential evaporation is used, which gives the most accurate potential evaporation values during the summer, when evaporation plays an important role in the availability of surface water. Scenarios are determined by reducing the large number of forecasts in the ensemble to a number of averaged members, each with its own likelihood of occurrence. For the Water-Rijk project five scenario forecasts are calculated: extreme dry, dry, normal, wet and extreme wet. These scenarios are constructed for two forecasting periods, each using its own ensemble technique: up to 48 hours ahead and up to 15 days ahead.
The 48-hour forecast uses an ensemble constructed from forecasts of multiple high-resolution regional models: UKMO's Euro4 model, the ECMWF model, WRF and Hirlam. Using multiple model runs and additional post-processing, an ensemble can be created from non-ensemble models. The 15-day forecast uses the ECMWF Ensemble Prediction System forecast, from which scenarios can be deduced directly. A combination of the ensembles from the two forecasting periods is used in order to have the highest possible resolution for the first 48 hours, followed by the lower-resolution long-term forecast.
Predicting the cosmological constant with the scale-factor cutoff measure
DOE Office of Scientific and Technical Information (OSTI.GOV)
De Simone, Andrea; Guth, Alan H.; Salem, Michael P.
2008-09-15
It is well known that anthropic selection from a landscape with a flat prior distribution of the cosmological constant Λ gives a reasonable fit to observation. However, a realistic model of the multiverse has a physical volume that diverges with time, and the predicted distribution of Λ depends on how the spacetime volume is regulated. A very promising method of regulation uses a scale-factor cutoff, which avoids a number of serious problems that arise in other approaches. In particular, the scale-factor cutoff avoids the 'youngness problem' (high probability of living in a much younger universe) and the 'Q and G catastrophes' (high probability for the primordial density contrast Q and gravitational constant G to have extremely large or small values). We apply the scale-factor cutoff measure to the probability distribution of Λ, considering both positive and negative values. The results are in good agreement with observation. In particular, the scale-factor cutoff strongly suppresses the probability for values of Λ that are more than about 10 times the observed value. We also discuss qualitatively the prediction for the density parameter Ω, indicating that with this measure there is a possibility of detectable negative curvature.
NASA Astrophysics Data System (ADS)
Pegram, Geoff; Bardossy, Andras; Sinclair, Scott
2017-04-01
The use of radar measurements for the space-time estimation of precipitation has for many decades been a central topic in hydro-meteorology. In this presentation we are interested specifically in daily and sub-daily extreme values of precipitation at gauged or ungauged locations, which are important for design. The purpose of the presentation is to develop a methodology combining daily precipitation observations and radar measurements to estimate sub-daily extremes at point locations. Radar data corrected using precipitation-reflectivity relationships lead to biased estimates of extremes. Different possibilities of correcting systematic errors using the daily observations are investigated. Observed daily gauge amounts are interpolated to unsampled points and subsequently disaggregated using the sub-daily values obtained by the radar. Different corrections based on the spatial variability and the sub-daily entropy of scaled rainfall distributions are used to provide unbiased corrections of short-duration extremes. In addition, a statistical procedure not based on a day-by-day matching correction is tested. In this last procedure, since we are only interested in rare extremes, low to medium values of rainfall depth are neglected, leaving 12 ranked daily maxima per year, whose sum typically comprises about 50% of the annual rainfall total. The sum of these 12 daily maxima is first interpolated using a Kriging procedure. Subsequently this sum is disaggregated to daily values using a nearest-neighbour procedure. The daily sums are then disaggregated using the relative values of the 12 biggest radar-based days in each year. Of course, the timings of radar and gauge maxima can be different, so the new method presented here uses radar for disaggregating daily gauge totals down to 15-minute intervals in order to extract the maxima of sub-hourly through to daily rainfall.
The methodologies were tested in South Africa, where an S-band radar operated relatively continuously at Bethlehem from 1998 to 2003, whose scan at 1.5 km above ground [CAPPI] overlapped a dense [10 km spacing] set of 45 pluviometers recording in the same 6-year period. This valuable set of data was obtained from each of 37 selected radar pixels [1 km square in plan] which contained a pluviometer and were not masked out by the radar footprint. The pluviometer data were also aggregated to daily totals for the same purpose. The extremes obtained using the disaggregation methods were compared to the observed extremes in a cross-validation procedure. The unusual and novel goal was not to reproduce the precipitation matching in space and time, but to obtain frequency distributions of the point extremes, which we found to be stable. Published as: Bárdossy, A., and G. G. S. Pegram (2017), Journal of Hydrology, 544, 397-406.
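The core rescaling step of such a disaggregation, using the radar's sub-daily pattern to split a daily gauge total, can be sketched as follows; the 15-minute grid and toy values are assumptions for illustration, not the paper's exact procedure.

```python
import numpy as np

def disaggregate_daily(gauge_daily_mm, radar_15min_mm):
    """Split a daily gauge total into 15-min values using the radar's sub-daily
    pattern: the radar supplies the temporal fractions, the gauge supplies the
    (assumed unbiased) daily mass, so radar bias in amounts cancels out."""
    radar = np.asarray(radar_15min_mm, dtype=float)   # 96 intervals per day
    total = radar.sum()
    if total <= 0.0:                                  # radar saw nothing: spread evenly
        return np.full(radar.shape, gauge_daily_mm / radar.size)
    return gauge_daily_mm * radar / total             # rescale to the gauge total

# Toy day: radar saw a short afternoon burst; the gauge measured 12.4 mm.
radar = np.zeros(96)
radar[56:60] = [0.5, 3.0, 2.0, 0.5]                   # 14:00-15:00 burst
series_15min = disaggregate_daily(12.4, radar)        # sums exactly to the gauge total
```

Sub-hourly through daily maxima can then be read off the rescaled series by aggregating over moving windows.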
Ethical research as the target of animal extremism: an international problem.
Conn, P Michael; Rantin, F T
2010-02-01
Animal extremism has been increasing worldwide; frequently researchers are the targets of actions by groups with extreme animal rights agendas. Sometimes this targeting is violent and may involve assaults on family members or destruction of property. In this article, we summarize recent events and suggest steps that researchers can take to educate the public on the value of animal research both for people and animals.
Estimating the extreme low-temperature event using nonparametric methods
NASA Astrophysics Data System (ADS)
D'Silva, Anisha
This thesis presents a new method of estimating the one-in-N low temperature threshold using a non-parametric statistical method called kernel density estimation, applied to daily average wind-adjusted temperatures. We apply our One-in-N Algorithm to local gas distribution companies (LDCs), as they have to forecast the daily natural gas needs of their consumers. In winter, demand for natural gas is high. Extreme low temperature events are not directly related to an LDC's gas demand forecasting, but knowledge of extreme low temperatures is important to ensure that an LDC has enough capacity to meet customer demands when extreme low temperatures are experienced. We present a detailed explanation of our One-in-N Algorithm and compare it to methods using the generalized extreme value distribution, the normal distribution, and the variance-weighted composite distribution. We show that our One-in-N Algorithm estimates the one-in-N low temperature threshold more accurately than these methods according to the root mean square error (RMSE) measure at a 5% level of significance. The One-in-N Algorithm is tested by counting the number of times the daily average wind-adjusted temperature is less than or equal to the one-in-N low temperature threshold.
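A minimal sketch of a KDE-based one-in-N threshold, in the spirit of the approach above: fit a kernel density to daily temperatures and invert its CDF at a small lower-tail probability. The probability mapping (one exceedance per N winters of 90 days) and the synthetic data are assumptions, not the thesis's exact One-in-N Algorithm.

```python
import numpy as np
from scipy.stats import gaussian_kde
from scipy.optimize import brentq

def one_in_n_threshold(daily_temps, n_years, days_per_winter=90):
    """Fit a Gaussian KDE to daily average temperatures and numerically invert
    its CDF at the lower-tail probability of one exceedance in n_years winters
    of days_per_winter days each (this mapping is an assumption of the sketch)."""
    kde = gaussian_kde(daily_temps)
    p = 1.0 / (n_years * days_per_winter)
    cdf = lambda t: kde.integrate_box_1d(-np.inf, t)
    lo = float(np.min(daily_temps)) - 30.0    # bracket well below the coldest datum
    hi = float(np.max(daily_temps)) + 30.0
    return brentq(lambda t: cdf(t) - p, lo, hi)

rng = np.random.default_rng(1)
temps = rng.normal(-5.0, 8.0, size=3000)      # synthetic winter temperatures (deg C)
t_1_in_30 = one_in_n_threshold(temps, n_years=30)
```

Unlike a fitted GEV or normal distribution, the KDE makes no parametric assumption about the tail shape, which is the motivation for the non-parametric approach.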
A rational decision rule with extreme events.
Basili, Marcello
2006-12-01
Risks induced by extreme events are characterized by small or ambiguous probabilities, catastrophic losses, or windfall gains. Through a new functional that mimics the restricted Bayes-Hurwicz criterion within the Choquet expected utility approach, it is possible to represent the decision maker's behavior when facing both risky (large and reliable probability) and extreme (small or ambiguous probability) events. A new formalization of the precautionary principle (PP) is presented, and a new functional, which encompasses both extreme outcomes and the expectation of all possible results for every act, is proposed.