A Fiducial Approach to Extremes and Multiple Comparisons
ERIC Educational Resources Information Center
Wandler, Damian V.
2010-01-01
Generalized fiducial inference is a powerful tool for many difficult problems. Based on an extension of R. A. Fisher's work, we used generalized fiducial inference for two extreme value problems and a multiple comparison procedure. The first extreme value problem deals with the generalized Pareto distribution. The generalized Pareto…
Extreme values and the level-crossing problem: An application to the Feller process
NASA Astrophysics Data System (ADS)
Masoliver, Jaume
2014-04-01
We review the question of the extreme values attained by a random process. We relate it to level crossings to one boundary (first-passage problems) as well as to two boundaries (escape problems). The extremes studied are the maximum, the minimum, the maximum absolute value, and the range or span. We specialize in diffusion processes and present detailed results for the Wiener and Feller processes.
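As a loose numerical companion to the four extremes named in the review (the maximum, the minimum, the maximum absolute value, and the range), one can estimate them by Monte Carlo for a Gaussian random walk approximating the Wiener process. The step size, horizon, and seed below are illustrative assumptions, not values from the paper.

```python
import random

def walk_extremes(n_steps, dt=1e-3, seed=0):
    """Simulate one Wiener-like path (Gaussian random walk started at 0)
    and return its maximum, minimum, maximum absolute value, and range."""
    rng = random.Random(seed)
    x = 0.0
    hi = lo = 0.0
    for _ in range(n_steps):
        x += rng.gauss(0.0, dt ** 0.5)
        hi = max(hi, x)
        lo = min(lo, x)
    return hi, lo, max(hi, -lo), hi - lo
```

Since the path starts at 0, the maximum is nonnegative and the minimum nonpositive, and the span is simply their difference; the analytic first-passage results of the paper are what such simulations would be checked against.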
Applied extreme-value statistics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kinnison, R.R.
1983-05-01
The statistical theory of extreme values is a well established part of theoretical statistics. Unfortunately, it is seldom part of applied statistics and is infrequently a part of statistical curricula except in advanced studies programs. This has resulted in the impression that it is difficult to understand and not of practical value. In recent environmental and pollution literature, several short articles have appeared with the purpose of documenting all that is necessary for the practical application of extreme value theory to field problems (for example, Roberts, 1979). These articles are so concise that only a statistician can recognise all the subtleties and assumptions necessary for the correct use of the material presented. The intent of this text is to expand upon several recent articles, and to provide the necessary statistical background so that the non-statistician scientist can recognize an extreme value problem when it occurs in his work, be confident in handling simple extreme value problems himself, and know when the problem is statistically beyond his capabilities and requires consultation.
The application of the statistical theory of extreme values to gust-load problems
NASA Technical Reports Server (NTRS)
Press, Harry
1950-01-01
An analysis is presented which indicates that the statistical theory of extreme values is applicable to the problems of predicting the frequency of encountering the larger gust loads and gust velocities for both specific test conditions and commercial transport operations. The extreme-value theory provides an analytic form for the distributions of maximum values of gust load and velocity. Methods of fitting the distribution are given along with a method of estimating the reliability of the predictions. The theory of extreme values is applied to available load data from commercial transport operations. The results indicate that the estimates of the frequency of encountering the larger loads are more consistent with the data and more reliable than those obtained in previous analyses. (author)
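The Gumbel (Fisher-Tippett Type I) distribution is the classical analytic form for maxima of the kind the report fits to gust loads. A minimal method-of-moments fit is sketched below; the report's own fitting and reliability-estimation procedures are not reproduced here, and the sample is assumed iid.

```python
import math

EULER_GAMMA = 0.5772156649015329

def fit_gumbel_moments(maxima):
    """Method-of-moments fit of a Gumbel distribution to a sample of maxima.
    Returns (location mu, scale beta)."""
    n = len(maxima)
    mean = sum(maxima) / n
    var = sum((x - mean) ** 2 for x in maxima) / (n - 1)
    beta = math.sqrt(6.0 * var) / math.pi     # Gumbel std = pi * beta / sqrt(6)
    mu = mean - EULER_GAMMA * beta            # Gumbel mean = mu + gamma * beta
    return mu, beta

def gumbel_exceedance(x, mu, beta):
    """P(max > x) under the fitted Gumbel model."""
    return 1.0 - math.exp(-math.exp(-(x - mu) / beta))
```

The exceedance probability is what translates into "frequency of encountering the larger loads" once multiplied by the number of flight hours represented by each maximum.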
Nonparametric Regression Subject to a Given Number of Local Extreme Value
2001-07-01
Fragmentary text from a DTIC compilation report (ADP013708 through ADP013761). The paper, by Ali Majidi and Laurie Davies, specifies the locations of the local extremes used by the smoothing algorithm, makes the smoothing problem precise, and presents a figure comparing results obtained as the solution of a quadratic programming problem (QP3).
Neighboring extremals of dynamic optimization problems with path equality constraints
NASA Technical Reports Server (NTRS)
Lee, A. Y.
1988-01-01
Neighboring extremals of dynamic optimization problems with path equality constraints and with an unknown parameter vector are considered in this paper. With some simplifications, the problem is reduced to solving a linear, time-varying two-point boundary-value problem with integral path equality constraints. A modified backward sweep method is used to solve this problem. Two example problems are solved to illustrate the validity and usefulness of the solution technique.
Extreme value problems without calculus: a good link with geometry and elementary maths
NASA Astrophysics Data System (ADS)
Ganci, Salvatore
2016-11-01
Some classical examples of problem solving, where an extreme value condition is required, are here considered and/or revisited. The search for non-calculus solutions appears pedagogically useful and intriguing, as shown through a rich literature. A teacher who teaches both maths and physics (as happens in Italian high schools) can find in these kinds of problems a mind-stimulating exercise to set beside the standard solution obtained by differential calculus. A good link between the geometric and analytical explanations is thus established.
Takasaki, Hiroshi; Okuyama, Kousuke; Rosedale, Richard
2017-02-01
Mechanical Diagnosis and Therapy (MDT) is used in the treatment of extremity problems. Classifying clinical problems is one method of providing effective treatment to a target population. Classification reliability is a key factor to determine the precise clinical problem and to direct an appropriate intervention. To explore inter-examiner reliability of the MDT classification for extremity problems in three reliability designs: 1) vignette reliability using surveys with patient vignettes, 2) concurrent reliability, where multiple assessors decide a classification by observing someone's assessment, 3) successive reliability, where multiple assessors independently assess the same patient at different times. Systematic review with data synthesis in a quantitative format. Agreement of MDT subgroups was examined using the Kappa value, with the operational definition of acceptable reliability set at ≥ 0.6. The level of evidence was determined considering the methodological quality of the studies. Six studies were included and all studies met the criteria for high quality. Kappa values for the vignette reliability design (five studies) were ≥ 0.7. There was data from two cohorts in one study for the concurrent reliability design and the Kappa values ranged from 0.45 to 1.0. Kappa values for the successive reliability design (data from three cohorts in one study) were < 0.6. The current review found strong evidence of acceptable inter-examiner reliability of MDT classification for extremity problems in the vignette reliability design, limited evidence of acceptable reliability in the concurrent reliability design and unacceptable reliability in the successive reliability design. Copyright © 2017 Elsevier Ltd. All rights reserved.
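The Kappa agreement statistic behind the review's acceptability criterion (≥ 0.6) can be computed from an inter-rater confusion table. The sketch below uses invented counts, not the review's data.

```python
def cohens_kappa(table):
    """Cohen's kappa for a square inter-rater confusion table, where
    table[i][j] = number of cases rater A classified as i and rater B as j."""
    n = sum(sum(row) for row in table)
    k = len(table)
    p_obs = sum(table[i][i] for i in range(k)) / n          # observed agreement
    row = [sum(table[i]) for i in range(k)]
    col = [sum(table[i][j] for i in range(k)) for j in range(k)]
    p_exp = sum(row[i] * col[i] for i in range(k)) / (n * n)  # chance agreement
    return (p_obs - p_exp) / (1.0 - p_exp)
```

For example, a two-class table [[20, 5], [10, 15]] gives observed agreement 0.7 against chance agreement 0.5, hence kappa 0.4, which would fall below the review's 0.6 threshold.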
Probabilistic forecasting of extreme weather events based on extreme value theory
NASA Astrophysics Data System (ADS)
Van De Vyver, Hans; Van Schaeybroeck, Bert
2016-04-01
Extreme events in weather and climate such as high wind gusts, heavy precipitation or extreme temperatures are commonly associated with high impacts on both environment and society. Forecasting extreme weather events is difficult, and very high-resolution models are needed to describe extreme weather phenomena explicitly. A prediction system for such events should therefore preferably be probabilistic in nature. Probabilistic forecasts and state estimations are nowadays common in the numerical weather prediction community. In this work, we develop a new probabilistic framework based on extreme value theory that aims to provide early warnings up to several days in advance. We consider pairs (X,Y) of extreme events, where X represents a deterministic forecast and Y the observation variable (for instance wind speed). More specifically, two problems are addressed: (1) Given a high forecast X=x_0, what is the probability that Y>y? In other words, provide inference on the conditional probability Pr{Y>y|X=x_0}. (2) Given a probabilistic model for Problem 1, what is the impact on the verification analysis of extreme events? These problems can be solved with bivariate extremes (Coles, 2001) and the verification analysis of Ferro (2007). We apply the Ramos and Ledford (2009) parametric model for bivariate tail estimation of the pair (X,Y). The model accommodates different types of extremal dependence and asymmetry within a parsimonious representation. Results are presented using the ensemble reforecast system of the European Centre for Medium-Range Weather Forecasts (Hagedorn, 2008). Coles, S. (2001) An Introduction to Statistical Modeling of Extreme Values. Springer-Verlag. Ferro, C.A.T. (2007) A probability model for verifying deterministic forecasts of extreme events. Wea. Forecasting 22, 1089-1100. Hagedorn, R. (2008) Using the ECMWF reforecast dataset to calibrate EPS forecasts. ECMWF Newsletter 117, 8-13. Ramos, A., Ledford, A. (2009) A new class of models for bivariate joint tails. J. R. Statist. Soc. B 71, 219-241.
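Problem 1 above, inference on Pr{Y>y|X=x_0}, can be approximated crudely by empirical binning of forecast-observation pairs. This is only a sketch of the quantity being modeled, not the Ramos-Ledford bivariate tail model; the window half-width is an assumed tuning parameter.

```python
def cond_exceedance(pairs, x0, y, half_width):
    """Empirical estimate of P(Y > y | X near x0) from (forecast, observation)
    pairs, using a window of +/- half_width around the forecast value x0.
    Returns None when no forecast falls in the window."""
    near = [obs for fc, obs in pairs if abs(fc - x0) <= half_width]
    if not near:
        return None
    return sum(obs > y for obs in near) / len(near)
```

Such a binned estimate collapses for the rare, extreme forecasts of interest (empty windows), which is precisely why the abstract turns to a parametric bivariate tail model instead.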
Extreme value analysis in biometrics.
Hüsler, Jürg
2009-04-01
We review some approaches of extreme value analysis in the context of biometrical applications. The classical extreme value analysis is based on iid random variables. Two different general methods are applied, which will be discussed together with biometrical examples. Different estimation, testing, goodness-of-fit procedures for applications are discussed. Furthermore, some non-classical situations are considered where the data are possibly dependent, where a non-stationary behavior is observed in the data or where the observations are not univariate. A few open problems are also stated.
Extremes in ecology: Avoiding the misleading effects of sampling variation in summary analyses
Link, W.A.; Sauer, J.R.
1996-01-01
Surveys such as the North American Breeding Bird Survey (BBS) produce large collections of parameter estimates. One's natural inclination when confronted with lists of parameter estimates is to look for the extreme values: in the BBS, these correspond to the species that appear to have the greatest changes in population size through time. Unfortunately, extreme estimates are liable to correspond to the most poorly estimated parameters. Consequently, the most extreme parameters may not match up with the most extreme parameter estimates. The ranking of parameter values on the basis of their estimates is a difficult statistical problem. We use data from the BBS and simulations to illustrate the potentially misleading effects of sampling variation on rankings of parameters. We describe empirical Bayes and constrained empirical Bayes procedures which provide partial solutions to the problem of ranking in the presence of sampling variation.
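The empirical Bayes remedy shrinks each estimate toward the grand mean in proportion to its sampling variance, so poorly estimated parameters stop dominating the extremes of the ranking. The following is a textbook-style normal-normal sketch, not the authors' BBS procedure.

```python
def eb_shrink(estimates, sampling_vars):
    """Shrink each estimate toward the grand mean by a factor that grows
    with its sampling variance (normal-normal empirical Bayes sketch)."""
    n = len(estimates)
    grand = sum(estimates) / n
    total_var = sum((e - grand) ** 2 for e in estimates) / (n - 1)
    mean_sv = sum(sampling_vars) / n
    tau2 = max(total_var - mean_sv, 0.0)   # estimated between-parameter variance
    out = []
    for e, sv in zip(estimates, sampling_vars):
        b = sv / (sv + tau2) if sv + tau2 > 0 else 0.0   # shrinkage factor
        out.append(grand + (1.0 - b) * (e - grand))
    return out
```

A noisy, apparently extreme estimate is pulled strongly toward the mean while precisely estimated ones barely move, which reorders the extremes more honestly than the raw estimates do.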
Hay preservation with propionic acid
USDA-ARS?s Scientific Manuscript database
Most hay producers are quite familiar with the problems associated with baling moist hays. Normally, these problems include spontaneous heating, increased evidence of mold, losses of dry matter (DM) during storage, poorer nutritive value, and (in extreme cases) spontaneous combustion. Numerous fact...
Preservation of hay with propionic acid
USDA-ARS?s Scientific Manuscript database
Most hay producers are quite familiar with the problems associated with baling moist hays. Normally, these problems include spontaneous heating, increased evidence of mold, losses of dry matter (DM) during storage, poorer nutritive value, and (in extreme cases) spontaneous combustion. Numerous fact...
Quintela-del-Río, Alejandro; Francisco-Fernández, Mario
2011-02-01
The study of extreme values and prediction of ozone data is an important topic of research when dealing with environmental problems. Classical extreme value theory is usually used in air-pollution studies. It consists in fitting a parametric generalised extreme value (GEV) distribution to a data set of extreme values, and using the estimated distribution to compute return levels and other quantities of interest. Here, we propose to estimate these values using nonparametric functional data methods. Functional data analysis is a relatively new statistical methodology that generally deals with data consisting of curves or multi-dimensional variables. In this paper, we use this technique, jointly with nonparametric curve estimation, to provide alternatives to the usual parametric statistical tools. The nonparametric estimators are applied to real samples of maximum ozone values obtained from several monitoring stations belonging to the Automatic Urban and Rural Network (AURN) in the UK. The results show that nonparametric estimators work satisfactorily, outperforming the behaviour of classical parametric estimators. Functional data analysis is also used to predict stratospheric ozone concentrations. We show an application, using the data set of mean monthly ozone concentrations in Arosa, Switzerland, and the results are compared with those obtained by classical time series (ARIMA) analysis. Copyright © 2010 Elsevier Ltd. All rights reserved.
Ethical research as the target of animal extremism: an international problem.
Conn, P Michael; Rantin, F T
2010-02-01
Animal extremism has been increasing worldwide; frequently researchers are the targets of actions by groups with extreme animal rights agendas. Sometimes this targeting is violent and may involve assaults on family members or destruction of property. In this article, we summarize recent events and suggest steps that researchers can take to educate the public on the value of animal research both for people and animals.
Persistence Mapping Using EUV Solar Imager Data
NASA Technical Reports Server (NTRS)
Thompson, B. J.; Young, C. A.
2016-01-01
We describe a simple image processing technique that is useful for the visualization and depiction of gradually evolving or intermittent structures in solar physics extreme-ultraviolet imagery. The technique is an application of image segmentation, which we call "Persistence Mapping," to isolate extreme values in a data set, and is particularly useful for the problem of capturing phenomena that are evolving in both space and time. While integration or "time-lapse" imaging uses the full sample (of size N ), Persistence Mapping rejects (N - 1)/N of the data set and identifies the most relevant 1/N values using the following rule: if a pixel reaches an extreme value, it retains that value until that value is exceeded. The simplest examples isolate minima and maxima, but any quantile or statistic can be used. This paper demonstrates how the technique has been used to extract the dynamics in long-term evolution of comet tails, erupting material, and EUV dimming regions.
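The retention rule quoted above is, for the maximum case, simply a running elementwise extreme across frames. A minimal sketch, assuming frames are given as lists of lists of pixel values:

```python
def persistence_map(frames, op=max):
    """Running per-pixel extreme over a sequence of frames: each pixel keeps
    the most extreme value seen so far (max by default; pass op=min to
    isolate minima, as for EUV dimming regions)."""
    out = [row[:] for row in frames[0]]
    for frame in frames[1:]:
        for i, row in enumerate(frame):
            for j, v in enumerate(row):
                out[i][j] = op(out[i][j], v)
    return out
```

As the paper notes, integration keeps all N samples per pixel while this rule keeps a single extreme value per pixel; generalizing op to a running quantile recovers the "any quantile or statistic" variant.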
Quantifying uncertainties in wind energy assessment
NASA Astrophysics Data System (ADS)
Patlakas, Platon; Galanis, George; Kallos, George
2015-04-01
The constant rise of wind energy production and its penetration into global energy markets during the last decades has led to the selection of new sites that present various types of problems. Such problems arise due to the variability and the uncertainty of wind speed. The study of the lower and upper tails of the wind speed distribution may support the quantification of these uncertainties. Such approaches, focused on extreme wind conditions or on periods below the energy production threshold, are necessary for a better management of operations. Towards this direction, different methodologies are presented for the credible evaluation of potential non-frequent/extreme values of these environmental conditions. The approaches used take into consideration the structural design of the wind turbines according to their lifespan, the turbine failures, the time needed for repairs, as well as the energy production distribution. In this work, a multi-parametric approach for studying extreme wind speed values is discussed based on tools of Extreme Value Theory. In particular, the study is focused on extreme wind speed return periods and the persistence of no energy production, based on a 10-year hindcast dataset from a weather modeling system. More specifically, two methods (Annual Maxima and Peaks Over Threshold) were used for the estimation of extreme wind speeds and their recurrence intervals. Additionally, two methodologies (intensity given duration and duration given intensity, both based on the Annual Maxima method) were applied to calculate the duration of extreme events, combined with their intensity as well as the event frequency. The obtained results show that the proposed approaches converge, at least on the main findings, for each case. It is also remarkable that, despite the moderate wind speed climate of the area, several consecutive days of no energy production are observed.
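One of the two methods named above, Peaks Over Threshold, can be sketched in a few lines if the threshold excesses are assumed exponential (a GPD with zero shape); the threshold, exceedance rate, and data below are illustrative assumptions, not the study's.

```python
import math

def pot_return_level(speeds, years, threshold, T):
    """Peaks-over-threshold return level assuming exponential excesses
    (a GPD with shape 0): x_T = u + mean_excess * ln(rate * T),
    where rate is the number of threshold exceedances per year."""
    excesses = [s - threshold for s in speeds if s > threshold]
    rate = len(excesses) / years           # exceedances per year
    sigma = sum(excesses) / len(excesses)  # mean excess = exponential scale
    return threshold + sigma * math.log(rate * T)
```

The T-year return level grows logarithmically in T under this zero-shape assumption; a fitted positive GPD shape would make it grow faster, which is why the shape parameter matters for turbine design lifespans.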
Analysis of the dependence of extreme rainfalls
NASA Astrophysics Data System (ADS)
Padoan, Simone; Ancey, Christophe; Parlange, Marc
2010-05-01
The aim of spatial analysis is to quantitatively describe the behavior of environmental phenomena such as precipitation levels, wind speed or daily temperatures. A number of generic approaches to spatial modeling have been developed [1], but these are not necessarily ideal for handling extremal aspects given their focus on mean process levels. The areal modelling of the extremes of a natural process observed at points in space is important in environmental statistics; for example, understanding extremal spatial rainfall is crucial in flood protection. In light of recent concerns over climate change, the use of robust mathematical and statistical methods for such analyses has grown in importance. Multivariate extreme value models and the class of max-stable processes [2] have a similar asymptotic motivation to the univariate Generalized Extreme Value (GEV) distribution, but provide a general approach to modeling extreme processes that incorporates temporal or spatial dependence. Statistical methods for max-stable processes and data analyses of practical problems are discussed by [3] and [4]. This work illustrates methods for the statistical modelling of spatial extremes and gives examples of their use by means of a real extremal data analysis of Switzerland precipitation levels. [1] Cressie, N. A. C. (1993). Statistics for Spatial Data. Wiley, New York. [2] de Haan, L. and Ferreira, A. (2006). Extreme Value Theory: An Introduction. Springer, USA. [3] Padoan, S. A., Ribatet, M. and Sisson, S. A. (2009). Likelihood-Based Inference for Max-Stable Processes. Journal of the American Statistical Association, Theory & Methods. In press. [4] Davison, A. C. and Gholamrezaee, M. (2009). Geostatistics of extremes. Journal of the Royal Statistical Society, Series B. To appear.
Outliers: A Potential Data Problem.
ERIC Educational Resources Information Center
Douzenis, Cordelia; Rakow, Ernest A.
Outliers, extreme data values relative to others in a sample, may distort statistics that assume interval levels of measurement and normal distribution. The outlier may be a valid value or an error. Several procedures are available for identifying outliers, and each may be applied to errors of prediction from the regression lines for utility in a…
Extreme values and fat tails of multifractal fluctuations
NASA Astrophysics Data System (ADS)
Muzy, J. F.; Bacry, E.; Kozhemyak, A.
2006-06-01
In this paper we discuss the problem of estimating the occurrence probability of extreme events for data drawn from a multifractal process. We also study the heavy (power-law) tail behavior of the probability density function associated with such data. We show that, because of strong correlations, the standard extreme value approach is not valid and classical tail exponent estimators should be interpreted cautiously. Extreme statistics associated with multifractal random processes turn out to be characterized by non-self-averaging properties. Our considerations rely upon an analogy between random multiplicative cascades and the physics of disordered systems, and also on recent mathematical results about the so-called multifractal formalism. Applied to financial time series, our findings allow us to propose a unified framework that accounts for the observed multiscaling properties of return fluctuations, the volatility clustering phenomenon and the observed “inverse cubic law” of the return pdf tails.
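The "classical tail exponent estimators" the authors warn about include the Hill estimator, sketched here for iid power-law data; under the strong correlations of multifractal data, its output is exactly what the abstract says must be interpreted cautiously.

```python
import math

def hill_estimator(data, k):
    """Hill estimate of the tail index alpha from the k largest order
    statistics (valid for heavy, power-law tails of iid positive data)."""
    xs = sorted(data, reverse=True)
    x_k = xs[k]  # the (k+1)-th largest value, used as the threshold
    gamma = sum(math.log(xs[i] / x_k) for i in range(k)) / k
    return 1.0 / gamma  # tail exponent alpha = 1 / gamma
```

On iid Pareto-tailed data the estimate is consistent; on strongly correlated multifractal samples the same formula can return a misleading effective exponent, which is the paper's point.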
Statistical Modeling of Extreme Values and Evidence of Presence of Dragon King (DK) in Solar Wind
NASA Astrophysics Data System (ADS)
Gomes, T.; Ramos, F.; Rempel, E. L.; Silva, S.; C-L Chian, A.
2017-12-01
The solar wind constitutes a nonlinear dynamical system, presenting intermittent turbulence, multifractality and chaotic dynamics. One characteristic shared by many such complex systems is the presence of extreme events, which play an important role in several geophysical phenomena and whose statistical characterization is a problem of great practical relevance. This work investigates the presence of extreme events in time series of the modulus of the interplanetary magnetic field measured by the Cluster spacecraft on February 2, 2002. One of the main results is that the solar wind near the Earth's bow shock can be modeled by the Generalized Pareto (GP) and Generalized Extreme Value (GEV) distributions. Both models present a statistically significant positive shape parameter, which implies a heavy tail in the probability distribution functions and an unbounded growth in return values as return periods become long. There is evidence that current sheets are mainly responsible for positive values of the shape parameter. It is also shown that magnetic reconnection at the interface between two interplanetary magnetic flux ropes in the solar wind can be considered a Dragon King (DK), a class of extreme events whose formation mechanisms are fundamentally different from others. As long as magnetic reconnection can be classified as a Dragon King, there is the possibility of its identification and even its prediction. Dragon Kings had previously been identified in time series of financial crashes, nuclear power generation accidents, stock markets and so on. It is believed that they are associated with the occurrence of extreme events in dynamical systems at phase transitions, bifurcations, crises or tipping points.
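A method-of-moments fit of the GP distribution shows how a positive shape parameter is read off threshold excesses; this is a generic illustration, not the Cluster data analysis, and moment estimators are only valid when the shape is below 1/2.

```python
def gpd_fit_moments(excesses):
    """Method-of-moments fit of the Generalized Pareto distribution to
    threshold excesses; returns (shape xi, scale sigma). A positive xi
    signals a heavy tail, as reported for the solar-wind data."""
    n = len(excesses)
    mean = sum(excesses) / n
    var = sum((x - mean) ** 2 for x in excesses) / (n - 1)
    r = mean * mean / var
    xi = 0.5 * (1.0 - r)           # xi = 0 recovers the exponential case
    sigma = 0.5 * mean * (r + 1.0)
    return xi, sigma
```

Exponential-like excesses yield xi near zero, while heavy-tailed excesses yield the statistically significant positive xi the abstract reports.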
NASA Astrophysics Data System (ADS)
Zheng, Qin; Yang, Zubin; Sha, Jianxin; Yan, Jun
2017-02-01
In predictability problem research, the conditional nonlinear optimal perturbation (CNOP) describes the initial perturbation that satisfies a certain constraint condition and causes the largest prediction error at the prediction time. The CNOP has been successfully applied in estimation of the lower bound of maximum predictable time (LBMPT). Generally, CNOPs are calculated by a gradient descent algorithm based on the adjoint model, which is called ADJ-CNOP. This study, through the two-dimensional Ikeda model, investigates the impacts of the nonlinearity on ADJ-CNOP and the corresponding precision problems when using ADJ-CNOP to estimate the LBMPT. Our conclusions are that (1) when the initial perturbation is large or the prediction time is long, the strong nonlinearity of the dynamical model in the prediction variable will lead to failure of the ADJ-CNOP method, and (2) when the objective function has multiple extreme values, ADJ-CNOP has a large probability of producing local CNOPs, hence making a false estimation of the LBMPT. Furthermore, the particle swarm optimization (PSO) algorithm, one kind of intelligent algorithm, is introduced to solve this problem. The method using PSO to compute CNOP is called PSO-CNOP. The results of numerical experiments show that even with a large initial perturbation and long prediction time, or when the objective function has multiple extreme values, PSO-CNOP can always obtain the global CNOP. Since the PSO algorithm is a heuristic search algorithm based on the population, it can overcome the impact of nonlinearity and the disturbance from multiple extremes of the objective function. In addition, to check the estimation accuracy of the LBMPT presented by PSO-CNOP and ADJ-CNOP, we partition the constraint domain of initial perturbations into sufficiently fine grid meshes and take the LBMPT obtained by the filtering method as a benchmark. 
The results show that the estimation presented by PSO-CNOP is closer to the true value than that of ADJ-CNOP as the forecast time increases.
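A minimal PSO on a one-dimensional objective with two extreme values illustrates why a population-based search can reach the global optimum where a gradient method may stall in a local one. This is a generic sketch, not the PSO-CNOP implementation; the inertia and acceleration coefficients are common defaults, assumed here.

```python
import random

def pso_minimize(f, lo, hi, n_particles=30, iters=200, seed=1):
    """Minimal particle swarm optimization of f on the interval [lo, hi]."""
    rng = random.Random(seed)
    xs = [rng.uniform(lo, hi) for _ in range(n_particles)]
    vs = [0.0] * n_particles
    pbest = xs[:]                          # personal best positions
    pval = [f(x) for x in xs]
    g = pbest[pval.index(min(pval))]       # global best position
    for _ in range(iters):
        for i in range(n_particles):
            r1, r2 = rng.random(), rng.random()
            vs[i] = 0.7 * vs[i] + 1.5 * r1 * (pbest[i] - xs[i]) + 1.5 * r2 * (g - xs[i])
            xs[i] = min(max(xs[i] + vs[i], lo), hi)   # keep within constraints
            v = f(xs[i])
            if v < pval[i]:
                pbest[i], pval[i] = xs[i], v
        g = pbest[pval.index(min(pval))]
    return g, f(g)
```

On f(x) = x^4 - 2x^2 + 0.5x, which has a local minimum near x = 1 and a deeper global minimum near x = -1, the swarm reliably settles in the global basin, mirroring how PSO-CNOP avoids local CNOPs.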
An Automatic Orthonormalization Method for Solving Stiff Boundary-Value Problems
NASA Astrophysics Data System (ADS)
Davey, A.
1983-08-01
A new initial-value method is described, based on a remark by Drury, for solving stiff linear differential two-point eigenvalue and boundary-value problems. The method is extremely reliable, it is especially suitable for high-order differential systems, and it is capable of accommodating realms of stiffness which other methods cannot reach. The key idea behind the method is to decompose the stiff differential operator into two non-stiff operators, one of which is nonlinear. The nonlinear one is specially chosen so that it advances an orthonormal frame; indeed the method is essentially a kind of automatic orthonormalization. The second is auxiliary but it is needed to determine the required function. The usefulness of the method is demonstrated by calculating some eigenfunctions for an Orr-Sommerfeld problem when the Reynolds number is as large as 10^6.
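The discrete analogue of "advancing an orthonormal frame" is Gram-Schmidt orthonormalization of the current solution basis, which prevents stiff growth from collapsing all basis vectors onto the dominant solution. The sketch below is the discrete (modified Gram-Schmidt) form only; the paper's method does this continuously through a nonlinear ODE, which is not reproduced here.

```python
import math

def mgs(vectors):
    """Modified Gram-Schmidt: orthonormalize a list of vectors (lists of
    floats). In shooting methods for stiff BVPs, applying this to the
    integrated solution basis keeps the basis numerically independent."""
    basis = [v[:] for v in vectors]
    for i in range(len(basis)):
        norm = math.sqrt(sum(c * c for c in basis[i]))
        basis[i] = [c / norm for c in basis[i]]
        for j in range(i + 1, len(basis)):
            d = sum(a * b for a, b in zip(basis[i], basis[j]))
            basis[j] = [b - d * a for a, b in zip(basis[i], basis[j])]
    return basis
```
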
A Yang-Mills field on the extremal Reissner-Nordström black hole
NASA Astrophysics Data System (ADS)
Bizoń, Piotr; Kahl, Michał
2016-09-01
We consider a spherically symmetric (magnetic) SU(2) Yang-Mills field propagating on the exterior of the extremal Reissner-Nordström black hole. Taking advantage of the conformal symmetry, we reduce the problem to the study of the Yang-Mills equation in a geodesically complete spacetime with two asymptotically flat ends. We prove the existence of infinitely many static solutions (two of which are found in closed form) and determine the spectrum of their linear perturbations and quasinormal modes. Finally, using the hyperboloidal approach to the initial value problem, we describe the process of relaxation to the static endstates of evolution, both stable (for generic initial data) and unstable (for codimension-one initial data).
Persistence analysis of extreme CO, NO2 and O3 concentrations in ambient air of Delhi
NASA Astrophysics Data System (ADS)
Chelani, Asha B.
2012-05-01
Persistence analysis of air pollutant concentrations and the corresponding exceedance time series is carried out to examine their temporal evolution. For this purpose, air pollutant concentrations, namely CO, NO2 and O3, observed during 2000-2009 at a traffic site in Delhi are analyzed using detrended fluctuation analysis. Two types of extreme values are analyzed: concentrations exceeding a threshold set by the national pollution control agency, and the time intervals between two exceedances. The time series of all three pollutants are observed to possess the persistence property, whereas the extreme value time series of only the primary pollutant concentrations are found to be persistent. Two scaling regions are observed to be significant in the extreme time series of CO and NO2, mainly attributed to the implementation of CNG in vehicles. The presence of persistence in the three pollutant concentration time series is linked to the property of self-organized criticality. The observed persistence in the time intervals between two exceedances is a matter of concern, as persistent high concentrations can trigger health problems.
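Detrended fluctuation analysis, as used in the study, estimates a scaling exponent alpha from the slope of log F(s) against log s, with alpha > 0.5 indicating persistence. A compact first-order DFA sketch follows; the window scales and test data are illustrative assumptions, not the Delhi dataset.

```python
import math

def dfa_alpha(series, scales):
    """First-order detrended fluctuation analysis: integrate the demeaned
    series, linearly detrend it in non-overlapping windows of each size s,
    and return the slope of log F(s) vs log s (the exponent alpha)."""
    mean = sum(series) / len(series)
    profile, c = [], 0.0
    for x in series:
        c += x - mean
        profile.append(c)                       # integrated (profile) series
    logs, logF = [], []
    for s in scales:
        n_win = len(profile) // s
        sq = 0.0
        t = list(range(s))
        tm = sum(t) / s
        denom = sum((ti - tm) ** 2 for ti in t)
        for w in range(n_win):
            seg = profile[w * s:(w + 1) * s]
            ym = sum(seg) / s
            slope = sum((ti - tm) * (yi - ym) for ti, yi in zip(t, seg)) / denom
            sq += sum((yi - (ym + slope * (ti - tm))) ** 2
                      for ti, yi in zip(t, seg))  # residuals after detrending
        logs.append(math.log(s))
        logF.append(math.log(math.sqrt(sq / (n_win * s))))
    lm = sum(logs) / len(logs)
    fm = sum(logF) / len(logF)
    return (sum((a - lm) * (b - fm) for a, b in zip(logs, logF))
            / sum((a - lm) ** 2 for a in logs))
```

White noise gives alpha near 0.5 while a random walk (strongly persistent increments in the profile sense) gives alpha near 1.5; the study's two scaling regions correspond to fitting this slope separately over two ranges of s.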
Classification-Assisted Memetic Algorithms for Equality-Constrained Optimization Problems
NASA Astrophysics Data System (ADS)
Handoko, Stephanus Daniel; Kwoh, Chee Keong; Ong, Yew Soon
Regression has successfully been incorporated into memetic algorithms (MA) to build surrogate models for the objective or constraint landscape of optimization problems. This helps to alleviate the need for expensive fitness function evaluations by performing local refinements on the approximated landscape. Classification can alternatively be used to assist the MA in choosing which individuals should undergo refinement. Support-vector-assisted MAs were recently proposed to alleviate the need for function evaluations in inequality-constrained optimization problems by distinguishing regions of feasible solutions from those of infeasible ones, based on some past solutions, so that search effort can be focused on potential regions only. For problems having equality constraints, however, the feasible space would obviously be extremely small. It is thus extremely difficult for the global search component of the MA to produce feasible solutions, and the classification of feasible and infeasible space becomes ineffective. In this paper, a novel strategy to overcome this limitation is proposed, particularly for problems having one and only one equality constraint. The raw constraint value of an individual, instead of its feasibility class, is utilized in this work.
Quadratic forms involving Green's and Robin functions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dubinin, Vladimir N
2009-10-31
General inequalities for quadratic forms with coefficients depending on the values of Green's and Robin functions are obtained. These inequalities cover also the reduced moduli of strips and half-strips. Some applications of the results obtained to extremal partitioning problems and related questions of geometric function theory are discussed. Bibliography: 29 titles.
Invited Article: Visualisation of extreme value events in optical communications
NASA Astrophysics Data System (ADS)
Derevyanko, Stanislav; Redyuk, Alexey; Vergeles, Sergey; Turitsyn, Sergei
2018-06-01
Fluctuations of a temporal signal propagating along long-haul transoceanic scale fiber links can be visualised in the spatio-temporal domain, drawing a visual analogy with ocean waves. Substantial overlapping of information symbols or use of multi-frequency signals leads to strong statistical deviations of local peak power from an average signal power level. We consider long-haul optical communication systems from this unusual angle, treating them as physical systems with a huge number of random statistical events, including extreme value fluctuations that potentially might affect the quality of data transmission. We apply the well-established concepts of adaptive wavefront shaping used in imaging through turbid media to detect the detrimental phase-modulated sequences in optical communications that can cause extreme power outages (rare optical waves of ultra-high amplitude) during propagation down the ultra-long fiber line. We illustrate the concept by a theoretical analysis of rare events of high-intensity fluctuations—optical freak waves, taking as an example an increasingly popular optical frequency division multiplexing data format where the problem of high peak to average power ratio is the most acute. We also show how such short-lived extreme value spikes in the optical data streams are affected by nonlinearity and demonstrate the negative impact of such events on the system performance.
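The "high peak to average power ratio" problem mentioned for multicarrier formats can be quantified directly from signal samples. The sketch below pairs a PAPR calculation with a naive multicarrier symbol generator; the function names, oversampling factor, and all-ones worst-case symbols are assumptions for illustration, not the paper's setup.

```python
import math

def papr_db(samples):
    """Peak-to-average power ratio of a complex baseband signal, in dB."""
    powers = [abs(s) ** 2 for s in samples]
    return 10.0 * math.log10(max(powers) / (sum(powers) / len(powers)))

def ofdm_symbol(symbols, oversample=4):
    """Naive multicarrier (OFDM-style) time-domain symbol: sum of complex
    subcarriers weighted by the given data symbols, sampled over one period."""
    n = len(symbols)
    m = n * oversample
    return [sum(symbols[k] * complex(math.cos(2 * math.pi * k * t / m),
                                     math.sin(2 * math.pi * k * t / m))
                for k in range(n))
            for t in range(m)]
```

When all N subcarriers align in phase (all-ones symbols), the peak power is N^2 against an average of N, so the PAPR is 10·log10(N) dB: the rare constructive alignments of many carriers are precisely the extreme value events the paper studies.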
Nonlinear Large-Deflection Boundary-Value Problems of Rectangular Plates
1948-03-01
The abstract of this report (NACA TN No. 1425) is garbled by text extraction; the recoverable content defines nondimensional forms of the extreme-fiber bending and shearing stresses, the membrane strains in the middle surface, and the median-fiber stresses and strains of the plate.
Studying Weather and Climate Extremes in a Non-stationary Framework
NASA Astrophysics Data System (ADS)
Wu, Z.
2010-12-01
The study of weather and climate extremes often uses the theory of extreme values. Such a detection method has a major problem: to obtain the probability distribution of extremes, one has to implicitly assume that the Earth's climate is stationary over a long period within which the climatology is defined. While such detection makes some sense in a purely statistical view of stationary processes, it can lead to misleading statistical properties of weather and climate extremes caused by long-term climate variability and change, and may also cause enormous difficulty in attributing and predicting these extremes. To alleviate this problem, here we report a novel framework for studying weather and climate extremes in a non-stationary setting. In this new framework, weather and climate extremes are defined as timescale-dependent quantities derived from the anomalies with respect to non-stationary climatologies of different timescales. With this non-stationary framework, the non-stationary and nonlinear nature of the climate system is taken into account; and the attribution and prediction of weather and climate extremes can then be separated into 1) the change of the statistical properties of the weather and climate extremes themselves and 2) the background climate variability and change. The new non-stationary framework will use the ensemble empirical mode decomposition (EEMD) method, which is a recent major improvement of the Hilbert-Huang Transform for time-frequency analysis. Using this tool, we will adaptively decompose various weather and climate data from observations and climate models in terms of the components of the various natural timescales contained in the data. With such decompositions, the non-stationary statistical properties (both spatial and temporal) of weather and climate anomalies and of their corresponding climatologies will be analyzed and documented.
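The core idea of judging extremes against a non-stationary climatology can be sketched in a few lines. This toy example uses a simple running mean as the evolving climatology rather than the EEMD decomposition the abstract describes, and the synthetic series parameters are invented:

```python
import numpy as np

def moving_average(x, window):
    """Centered moving average as a crude non-stationary climatology."""
    kernel = np.ones(window) / window
    return np.convolve(x, kernel, mode="same")

rng = np.random.default_rng(1)
t = np.arange(100 * 12)  # 100 years of monthly data
# Synthetic series: slow warming trend + seasonal cycle + noise.
series = 0.002 * t + np.sin(2 * np.pi * t / 12) + rng.normal(0, 0.5, t.size)

# Anomalies relative to a 30-year (360-month) running climatology, so an
# "extreme" is judged against the evolving background state rather than a
# single fixed long-term mean.
climatology = moving_average(series, 360)
anomalies = series - climatology
threshold = np.percentile(anomalies, 99)
extremes = anomalies > threshold
print(extremes.sum(), "extreme months out of", t.size)
```

With a fixed (stationary) climatology, the trend alone would concentrate all the flagged "extremes" at the end of the record; subtracting the evolving climatology removes that artifact.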
Using Biweight M-Estimates in the Two-Sample Problem. 1. Symmetric Populations
1982-01-01
to a Student’s t distribution, across a broad range of α-levels. To be conservative, we might wish to approximate "t" by a Student’s t on nine-tenths...(n-10). While the robustness of classical procedures for extreme α-levels has not been investigated, a comparison with the values in Lee and...D’Agostino (1976) indicates that this procedure is highly robust of validity at α = .05; presumably this robustness extends to the extreme α-levels as well
Optimization of Cubic Polynomial Functions without Calculus
ERIC Educational Resources Information Center
Taylor, Ronald D., Jr.; Hansen, Ryan
2008-01-01
In algebra and precalculus courses, students are often asked to find extreme values of polynomial functions in the context of solving an applied problem; but without the notion of derivative, something is lost. Either the functions are reduced to quadratics, since students know the formula for the vertex of a parabola, or solutions are…
Therapist qualities preferred by sexual-minority individuals.
Burckell, Lisa A; Goldfried, Marvin R
2006-01-01
Psychotherapy research concerning lesbian, gay, and bisexual (LGB) individuals has focused on matching clients on gender and sexual orientation, yet has not considered how factors such as therapeutic skill, presenting problem, and cohort membership may influence preference for therapists. This study was designed to identify those therapist qualities that sexual-minority individuals prefer and to determine how the presenting problem influences therapist choice. Forty-two nonheterosexual adults between 18 and 29 years old ranked 63 therapist characteristics from "Extremely Uncharacteristic" to "Extremely Characteristic" when seeking treatment for a problem in which their sexual orientation was salient and one in which it was not. The analyses of both conditions yielded clusters of items reflecting therapist characteristics that participants considered unfavorable, neutral, beneficial, and essential. Participants valued therapists who had LGB-specific knowledge as well as general therapeutic skills, whereas they indicated that they would avoid therapists who held heterocentric views. Application of these findings to clinical practice and future directions are discussed. (PsycINFO Database Record (c) 2010 APA, all rights reserved).
Estimating meme fitness in adaptive memetic algorithms for combinatorial problems.
Smith, J E
2012-01-01
Among the most promising and active research areas in heuristic optimisation is the field of adaptive memetic algorithms (AMAs). These gain much of their reported robustness by adapting the probability with which each of a set of local improvement operators is applied, according to an estimate of their current value to the search process. This paper addresses the issue of how the current value should be estimated. Assuming the estimate occurs over several applications of a meme, we consider whether the extreme or mean improvements should be used, and whether this aggregation should be global, or local to some part of the solution space. To investigate these issues, we use the well-established COMA framework that coevolves the specification of a population of memes (representing different local search algorithms) alongside a population of candidate solutions to the problem at hand. Two very different memetic algorithms are considered: the first using adaptive operator pursuit to adjust the probabilities of applying a fixed set of memes, and a second which applies genetic operators to dynamically adapt and create memes and their functional definitions. For the latter, especially on combinatorial problems, credit assignment mechanisms based on historical records, or on notions of landscape locality, will have limited application, and it is necessary to estimate the value of a meme via some form of sampling. The results on a set of binary encoded combinatorial problems show that both methods are very effective, and that for some problems it is necessary to use thousands of variables in order to tease apart the differences between different reward schemes. However, for both memetic algorithms, a significant pattern emerges that reward based on mean improvement is better than that based on extreme improvement. This contradicts recent findings from adapting the parameters of operators involved in global evolutionary search. 
The results also show that local reward schemes outperform global reward schemes in combinatorial spaces, unlike in continuous spaces. An analysis of evolving meme behaviour is used to explain these findings.
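The adaptive-operator-pursuit mechanism described above can be sketched minimally. The update rule shown is a standard adaptive pursuit scheme, and the reward values are assumptions for illustration; in the paper's COMA framework the rewards would be mean (or extreme) fitness improvements estimated by sampling meme applications:

```python
def pursuit_update(probs, rewards, beta=0.1, p_min=0.05):
    """Adaptive pursuit: shift selection probability toward the meme with
    the highest estimated reward, keeping a floor p_min so that no
    operator is ever starved of trials."""
    best = max(range(len(probs)), key=lambda i: rewards[i])
    p_max = 1.0 - p_min * (len(probs) - 1)
    return [p + beta * ((p_max if i == best else p_min) - p)
            for i, p in enumerate(probs)]

# Hypothetical reward estimates for three memes, aggregated as mean
# improvements (the scheme the paper finds to work best).
mean_rewards = [0.4, 0.9, 0.2]
probs = [1 / 3, 1 / 3, 1 / 3]
for _ in range(50):
    probs = pursuit_update(probs, mean_rewards)
print([round(p, 2) for p in probs])
```

The probabilities converge toward the floor for the weaker memes and toward the ceiling for the best one; swapping in extreme (best-single-application) rewards changes only how `mean_rewards` is estimated, not the update rule.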
NASA Technical Reports Server (NTRS)
Mosher, Richard A.; Bier, Milan; Righetti, Pier Giorgio
1986-01-01
Computer simulations of the concentration profiles of simple biprotic ampholytes with Delta pKs 1, 2, and 3, on immobilized pH gradients (IPG) at extreme pH values (pH 3-4 and pH 10-11) show markedly skewed steady-state profiles with increasing kurtosis at higher Delta pK values. Across neutrality, all the peaks are symmetric irrespective of their Delta pK values, but they show very high contribution to the conductivity of the background gel and significant alteration of the local buffering capacity. The problems of skewness, due to the exponential conductivity profiles at low and high pHs, and of gel burning due to a strong electroosmotic flow generated by the net charges in the gel matrix, also at low and high pHs, are solved by incorporating in the IPG gel a strong viscosity gradient. This is generated by a gradient of linear polyacrylamide which is trapped in the gel by the polymerization process.
NASA Astrophysics Data System (ADS)
Degtyar, V. G.; Kalashnikov, S. T.; Mokin, Yu. A.
2017-10-01
The paper considers problems of analyzing the aerodynamic properties (ADP) of reentry vehicles (RV) treated as blunted rotary bodies with small random surface distortions. The interplay between mathematical simulation of surface distortions, selection of tools for predicting the ADPs of shaped bodies, evaluation of different types of ADP variations, and their adaptation for dynamic problems is analyzed. The possibilities of deterministic and probabilistic approaches to the evaluation of ADP variations are considered. The practical value of the probabilistic approach is demonstrated. Examples of extremal deterministic evaluations of ADP variations for a sphere and a sharp cone are given.
The value of flexibility in conservation financing.
Lennox, Gareth D; Fargione, Joseph; Spector, Sacha; Williams, Gwyn; Armsworth, Paul R
2017-06-01
Land-acquisition strategies employed by conservation organizations vary in their flexibility. Conservation-planning theory largely fails to reflect this by presenting models that are either extremely inflexible (parcel acquisitions are irreversible and budgets are fixed) or extremely flexible (previously acquired parcels can readily be sold). This latter approach, the selling of protected areas, is infeasible or problematic in many situations. We considered the value to conservation organizations of increasing the flexibility of their land-acquisition strategies through their approach to financing deals. Specifically, we modeled 2 acquisition-financing methods commonly used by conservation organizations: borrowing and budget carry-over. Using simulated data, we compared results from these models with those from an inflexible fixed-budget model and an extremely flexible selling model in which previous acquisitions could be sold to fund new acquisitions. We then examined 3 case studies of how conservation organizations use borrowing and budget carry-over in practice. Model comparisons showed that borrowing and budget carry-over always returned considerably higher rewards than the fixed-budget model. How they performed relative to the selling model depended on the relative conservation value of past acquisitions. Both the models and case studies showed that incorporating flexibility through borrowing or budget carry-over gives conservation organizations the ability to purchase parcels of higher conservation value than when budgets are fixed, without the problems associated with the selling of protected areas. © 2016 Society for Conservation Biology.
Estimating return periods of extreme values from relatively short time series of winds
NASA Astrophysics Data System (ADS)
Jonasson, Kristjan; Agustsson, Halfdan; Rognvaldsson, Olafur; Arfeuille, Gilles
2013-04-01
An important factor for determining the prospects of individual wind farm sites is the frequency of extreme winds at hub height. Here, extreme winds are defined as the value of the highest 10-minute averaged wind speed with a 50 year return period, i.e. an annual exceedance probability of 2% (Rodrigo, 2010). A frequently applied method to estimate winds in the lowest few hundred meters above ground is to extrapolate observed 10-meter winds logarithmically to higher altitudes. A recent study by Drechsel et al. (2012) showed, however, that this methodology is not as accurate as interpolating simulated results from the global ECMWF numerical weather prediction (NWP) model to the desired height. Observations of persistent low-level jets near Colima in SW Mexico also show that the logarithmic approach can give highly inaccurate results for some regions (Arfeuille et al., 2012). To address these shortcomings of limited, and/or poorly representative, observations and extrapolations of winds, one can use NWP models to dynamically downscale relatively coarse-resolution atmospheric analyses. In the case of limited computing resources one typically has to make a compromise between spatial resolution and the duration of the simulated period, both of which can limit the quality of the wind farm siting. A common method to estimate maximum winds is to fit an extreme value distribution (e.g. Gumbel, GEV, or Pareto) to the maximum values of each year of available data, or to the tail of these values. If data are only available for a short period, e.g. 10 or 15 years, then this gives a rather inaccurate estimate. It is possible to deal with this problem by utilizing monthly or weekly maxima, but this introduces new problems: seasonal variation, autocorrelation of neighboring values, and increased discrepancy between the data and the fitted distribution.
We introduce a new method to estimate return periods of extreme values of winds at hub height from relatively short time series of winds, simulated at a high spatial resolution. REFERENCES Arfeuille, Gilles J. M., A. L. Quintanilla, L. Zizumbo, and F. C. Viesca, 2012. Wind Resource Assessment in a Tropical Region with Complex Terrain using SODAR and a Meteorological Tower Network to Measure Low Level Jets and Boundary Layer Conditions. 15th AMS Conference on Mountain Meteorology, Steamboat Springs, Colorado, USA, August 2012. Available on-line: https://ams.confex.com/ams/15MountMet/webprogram/Manuscript/Paper210184/ARFEUILLLE_etal_15MountMet Conf_Aug2012.pdf Drechsel S., G. J. Mayr, J. W. Messner, and R. Stauffer, 2012: Wind Speeds at Heights Crucial for Wind Energy: Measurements and Verification of Forecasts. J. Appl. Meteor. Climatol., 51, 1602-1617. Rodrigo, J. S., 2010. State-of-the-Art of Wind Resource Assessment. CENER National Renewable Energy Center, Sarriguren, Spain. Available on-line: http://www.waudit-itn.eu/download.php?id=103&parent=79
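The block-maxima approach the abstract criticizes for short records can be sketched with scipy. The annual maxima below are synthetic, and the location/scale values are arbitrary assumptions chosen only to mimic a short wind record:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical annual maxima of 10-minute mean wind speed (m/s) for a
# short 15-year record; real values would come from observations or
# high-resolution NWP simulations.
annual_maxima = rng.gumbel(loc=25.0, scale=3.0, size=15)

# Fit a generalized extreme value (GEV) distribution to the block maxima.
shape, loc, scale = stats.genextreme.fit(annual_maxima)

# The 50-year return level is the 98th percentile of the annual-maximum
# distribution (annual exceedance probability of 2%).
return_level_50yr = stats.genextreme.ppf(1 - 1 / 50, shape, loc, scale)
print(f"50-year wind: {return_level_50yr:.1f} m/s")
```

With only 15 block maxima the fitted shape parameter is poorly constrained, so the 50-year return level carries a wide uncertainty, which is precisely the motivation for the new method the abstract announces.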
Sánchez-Pérez, J F; Marín, F; Morales, J L; Cánovas, M; Alhama, F
2018-01-01
Mathematical models simulating different and representative engineering problems (atomic dry friction, moving-front problems, and elastic and solid mechanics) are presented in the form of a set of non-linear differential equations, coupled or uncoupled. For the different parameter values that influence the solution, the problem is numerically solved by the network method, which provides all the variables of the problem. Although the model is extremely sensitive to these parameters, no assumptions are made regarding linearization of the variables. The design of the models, which are run on standard electrical circuit simulation software, is explained in detail. The network model results are compared with common numerical methods or experimental data published in the scientific literature to show the reliability of the model. PMID:29518121
A fast approach to designing airfoils from given pressure distribution in compressible flows
NASA Technical Reports Server (NTRS)
Daripa, Prabir
1987-01-01
A new inverse method for the aerodynamic design of airfoils is presented for subcritical flows. The pressure distribution in this method can be prescribed as a function of the arc length of the as-yet unknown body. This inverse problem is shown to be mathematically equivalent to solving only one nonlinear boundary value problem subject to known Dirichlet data on the boundary. The solution to this problem determines the airfoil, the freestream Mach number, and the upstream flow direction. The existence of a solution for a given pressure distribution is discussed. The method is easy to implement and extremely efficient. A series of results for which comparisons are made with known airfoils is presented.
NASA Astrophysics Data System (ADS)
Gavrishchaka, V. V.; Ganguli, S. B.
2001-12-01
Reliable forecasting of rare events in a complex dynamical system is a challenging problem that is important for many practical applications. Due to the nature of rare events, the data set available for construction of a statistical and/or machine-learning model is often very limited and incomplete. Therefore, many widely used approaches, including such robust algorithms as neural networks, can easily become inadequate for rare-event prediction. Moreover, in many practical cases models with high-dimensional inputs are required. This limits applications of existing rare-event modeling techniques (e.g., extreme value theory) that focus on univariate cases and are not easily extended to multivariate ones. A support vector machine (SVM) is a machine-learning system that can provide optimal generalization from very limited and incomplete training data and can efficiently handle high-dimensional data. These features may make it possible to use SVMs to model rare events in some applications. We have applied an SVM-based system to the problem of large-amplitude substorm prediction and extreme event forecasting in stock and currency exchange markets. Encouraging preliminary results will be presented and other possible applications of the system will be discussed.
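A hedged sketch of the idea using scikit-learn: the data below are synthetic stand-ins for rare-event records, and the kernel and class-weighting choices are illustrative assumptions, not the configuration used in the abstract's substorm or market applications:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)

# Highly imbalanced synthetic data: 1000 "quiet" intervals vs 30 rare
# "events" shifted in a 5-dimensional feature space.
X_common = rng.normal(0.0, 1.0, size=(1000, 5))
X_rare = rng.normal(2.5, 1.0, size=(30, 5))
X = np.vstack([X_common, X_rare])
y = np.array([0] * 1000 + [1] * 30)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# class_weight="balanced" reweights the rare class so the SVM does not
# simply predict the majority label everywhere.
clf = SVC(kernel="rbf", class_weight="balanced").fit(X_tr, y_tr)
recall_rare = clf.score(X_te[y_te == 1], y_te[y_te == 1])
print(f"recall on rare class: {recall_rare:.2f}")
```

The key point matching the abstract is that the SVM generalizes from only a couple dozen positive examples in a multidimensional input space, a regime where univariate extreme-value fits do not directly apply.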
Uncertainty in determining extreme precipitation thresholds
NASA Astrophysics Data System (ADS)
Liu, Bingjun; Chen, Junfan; Chen, Xiaohong; Lian, Yanqing; Wu, Lili
2013-10-01
Extreme precipitation events are rare and occur mostly on a relatively small and local scale, which makes it difficult to set thresholds for extreme precipitation in a large basin. Based on long-term daily precipitation data from 62 observation stations in the Pearl River Basin, this study has assessed the applicability of the non-parametric, parametric, and detrended fluctuation analysis (DFA) methods in determining the extreme precipitation threshold (EPT) and the certainty of the EPTs from each method. Analyses from this study show the non-parametric absolute critical value method is easy to use but unable to reflect differences in the spatial rainfall distribution. The non-parametric percentile method can account for the spatial distribution of precipitation, but its threshold value is sensitive to the size of the rainfall data series and subject to the selection of a percentile, which makes it difficult to determine reasonable threshold values for a large basin. The parametric method can provide the most apt description of extreme precipitation by fitting extreme precipitation distributions with probability distribution functions; however, the selection of probability distribution functions, the goodness-of-fit tests, and the size of the rainfall data series can greatly affect the fitting accuracy. In contrast to the non-parametric and parametric methods, which are unable to provide EPTs with certainty, the DFA method, although computationally involved, has proven to be the most appropriate method, able to provide a unique set of EPTs for a large basin with uneven spatio-temporal precipitation distribution.
The consistency between the spatial distribution of DFA-based thresholds with the annual average precipitation, the coefficient of variation (CV), and the coefficient of skewness (CS) for the daily precipitation further proves that EPTs determined by the DFA method are more reasonable and applicable for the Pearl River Basin.
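The non-parametric percentile method, which the study finds easy to apply but sensitive to record length and percentile choice, can be sketched as follows. The station names and rainfall distributions are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical daily rainfall (mm) for three stations with different
# climatologies over a 30-year record; the gamma draws include many
# near-zero (effectively dry) days.
n_days = 30 * 365
stations = {
    "coastal": rng.gamma(0.3, 25.0, n_days),
    "inland": rng.gamma(0.3, 12.0, n_days),
    "mountain": rng.gamma(0.3, 40.0, n_days),
}

# Percentile method: the threshold is a high percentile of wet days
# (>= 1 mm), so it adapts to each station's local distribution.
thresholds = {}
for name, rain in stations.items():
    wet = rain[rain >= 1.0]
    thresholds[name] = np.percentile(wet, 99)
    print(f"{name}: EPT = {thresholds[name]:.1f} mm/day")
```

Note how the resulting EPT differs by station, which is the method's strength; its weakness, as the abstract says, is that changing the percentile (99 here) or the record length shifts every threshold.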
A Firefly-Inspired Method for Protein Structure Prediction in Lattice Models
Maher, Brian; Albrecht, Andreas A.; Loomes, Martin; Yang, Xin-She; Steinhöfel, Kathleen
2014-01-01
We introduce a Firefly-inspired algorithmic approach for protein structure prediction over two different lattice models in three-dimensional space. In particular, we consider three-dimensional cubic and three-dimensional face-centred-cubic (FCC) lattices. The underlying energy models are the Hydrophobic-Polar (H-P) model, the Miyazawa–Jernigan (M-J) model and a related matrix model. The implementation of our approach is tested on ten H-P benchmark problems of length 48 and ten M-J benchmark problems of lengths ranging from 48 to 61. The key complexity parameter we investigate is the total number of objective function evaluations required to achieve the optimum energy values for the H-P model, or competitive results in comparison to published values for the M-J model. For H-P instances and cubic lattices, where data for comparison are available, we obtain an average speed-up over eight instances of 2.1, leaving out two extreme values (otherwise, 8.8). For six M-J instances, data for comparison are available for cubic lattices and runs with a population size of 100, where, a priori, the minimum free energy is a termination criterion. The average speed-up over four instances is 1.2 (leaving out two extreme values, otherwise 1.1), which is achieved with a population size of only eight. The present study is a test case with initial results for ad hoc parameter settings, with the aim of justifying future research on larger instances within lattice model settings, eventually leading to the ultimate goal of implementations for off-lattice models. PMID:24970205
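For readers unfamiliar with the metaheuristic, here is a minimal continuous-space sketch of the standard Firefly update rule. All parameter values and the toy objective are assumptions; the paper applies lattice-specific moves and energy models instead of this vector update:

```python
import numpy as np

rng = np.random.default_rng(5)

def firefly_step(pos, brightness, beta0=0.5, gamma=1.0, alpha=0.02):
    """One synchronous Firefly update: each firefly drifts toward every
    brighter one, with attractiveness decaying as exp(-gamma * r^2),
    plus a small random-walk term."""
    new_pos = pos.copy()
    n, dim = pos.shape
    for i in range(n):
        for j in range(n):
            if brightness[j] > brightness[i]:
                step = pos[j] - pos[i]
                new_pos[i] += beta0 * np.exp(-gamma * np.sum(step**2)) * step
        new_pos[i] += alpha * rng.normal(size=dim)
    return new_pos

# Toy continuous objective standing in for a lattice energy model.
def energy(x):
    return np.sum((x - 0.5) ** 2, axis=1)

pos = rng.uniform(0.0, 1.0, size=(15, 2))
best = energy(pos).min()
for _ in range(150):
    pos = firefly_step(pos, -energy(pos))  # brighter = lower energy
    best = min(best, energy(pos).min())
print(f"best energy found: {best:.4f}")
```

The complexity measure the paper tracks, total objective-function evaluations, corresponds here to the number of `energy` calls across all fireflies and iterations.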
Inflight thermodynamic properties
NASA Technical Reports Server (NTRS)
Brown, S. C.; Daniels, G. E.; Johnson, D. L.; Smith, O. E.
1973-01-01
The inflight thermodynamic parameters (temperature, pressure, and density) of the atmosphere are presented. The mean and extreme values of the thermodynamic parameters given here can be used in many aerospace applications, such as: (1) research, planning, and engineering design of remote earth-sensing systems; (2) vehicle design and development; and (3) vehicle trajectory analysis, dealing with vehicle thrust, dynamic pressure, aerodynamic drag, aerodynamic heating, vibration, structural and guidance limitations, and reentry analysis. Atmospheric density plays a very important role in most of the above problems. A subsection on reentry is included, giving atmospheric models to be used for reentry heating, trajectory, and related analyses.
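The central role of density follows directly from the ideal-gas relation tying together the three thermodynamic parameters. A minimal sketch, using standard sea-level values purely for illustration:

```python
# Ideal-gas law: rho = p / (R * T), linking the three inflight
# thermodynamic parameters (pressure, temperature, density).
R_AIR = 287.05  # specific gas constant for dry air, J/(kg*K)

def air_density(pressure_pa: float, temperature_k: float) -> float:
    """Air density in kg/m^3 from pressure (Pa) and temperature (K)."""
    return pressure_pa / (R_AIR * temperature_k)

# Sea-level standard atmosphere: 101325 Pa and 288.15 K give ~1.225 kg/m^3.
rho = air_density(101325.0, 288.15)
print(f"{rho:.3f} kg/m^3")
```

Quantities such as dynamic pressure and aerodynamic drag then scale directly with this density, which is why its mean and extreme values drive the trajectory analyses listed above.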
The Rb problem in massive AGB stars.
NASA Astrophysics Data System (ADS)
Pérez-Mesa, V.; García-Hernández, D. A.; Zamora, O.; Plez, B.; Manchado, A.; Karakas, A. I.; Lugaro, M.
2017-03-01
The asymptotic giant branch (AGB) is formed by low- and intermediate-mass stars (0.8 M_{⊙} < M < 8 M_{⊙}) in their last nuclear-burning phase, when they develop thermal pulses (TP) and suffer extreme mass loss. AGB stars are the main contributor to the enrichment of the interstellar medium (ISM) and thus to the chemical evolution of galaxies. In particular, the more massive AGB stars (M > 4 M_{⊙}) are expected to produce light (e.g., Li, N) and heavy neutron-rich s-process elements (such as Rb, Zr, Ba, Y, etc.), which are not formed in lower mass AGB stars and Supernova explosions. Classical chemical analyses using hydrostatic atmospheres revealed strong Rb overabundances and high [Rb/Zr] ratios in massive AGB stars of our Galaxy and the Magellanic Clouds (MC), confirming for the first time that the ^{22}Ne neutron source dominates the production of s-process elements in these stars. The extremely high Rb abundances and [Rb/Zr] ratios observed in the most massive stars (especially in the low-metallicity MC stars) uncovered a Rb problem; such extreme Rb and [Rb/Zr] values are not predicted by the s-process AGB models, suggesting fundamental problems in our present understanding of their atmospheres. We present more realistic dynamical model atmospheres that consider a gaseous circumstellar envelope with a radial wind and we re-derive the Rb (and Zr) abundances in massive Galactic AGB stars. The new Rb abundances and [Rb/Zr] ratios derived with these dynamical models significantly resolve the problem of the mismatch between the observations and the theoretical predictions of the more massive AGB stars.
Larsen, Kristian; Weidich, Flemming; Leboeuf-Yde, Charlotte
2002-06-01
Shock-absorbing and biomechanic shoe orthoses are frequently used in the prevention and treatment of back and lower extremity problems. One review concludes that the former is clinically effective in relation to prevention, whereas the latter has been tested in only 1 randomized clinical trial, concluding that stress fractures could be prevented. To investigate if biomechanic shoe orthoses can prevent problems in the back and lower extremities and if reducing the number of days off-duty because of back or lower extremity problems is possible. Prospective, randomized, controlled intervention trial. One female and 145 male military conscripts (aged 18 to 24 years), representing 25% of all new conscripts in a Danish regiment. Health data were collected by questionnaires at initiation of the study and 3 months later. Custom-made biomechanic shoe orthoses to be worn in military boots were provided to all in the study group during the 3-month intervention period. No intervention was provided for the control group. Differences between the 2 groups were tested with the chi-square test, and statistical significance was accepted at P <.05. Risk ratio (RR), risk difference (ARR), numbers needed to prevent (NNP), and cost per successfully prevented case were calculated. Outcome variables included self-reported back and/or lower extremity problems; specific problems in the back or knees or shin splints, Achilles tendonitis, sprained ankle, or other problems in the lower extremity; number of subjects with at least 1 day off-duty because of back or lower extremity problems and total number of days off-duty within the first 3 months of military service because of back or lower extremity problems. 
Results were significantly better in an actual-use analysis in the intervention group for the total number of subjects with back or lower extremity problems (RR 0.7, ARR 19%, NNP 5, cost 98 US dollars); the number of subjects with shin splints (RR 0.2, ARR 19%, NNP 5, cost 101 US dollars); and the number of off-duty days because of back or lower extremity problems (RR 0.6, ARR < 1%, NNP 200, cost 3750 US dollars). In an intention-to-treat analysis, a significant difference was found only for the number of subjects with shin splints (RR 0.3, ARR 18%, NNP 6, cost 105 US dollars), whereas a worst-case analysis revealed no significant differences between the study groups. This study shows that it may be possible to prevent certain musculoskeletal problems in the back or lower extremities among military conscripts by using custom-made biomechanic shoe orthoses. However, because care-seeking for lower extremity problems is rare, using this method of prevention in military conscripts would be too costly. We also noted that the choice of statistical approach determined the outcome.
Van den Akker, Alithe L; Prinzie, Peter; Deković, Maja; De Haan, Amaranta D; Asscher, Jessica J; Widiger, Thomas
2013-12-01
This study investigated the development of personality extremity (deviation from the average midpoint of all 5 personality dimensions together) across childhood and adolescence, as well as relations between personality extremity and adjustment problems. For 598 children (mean age at Time 1 = 7.5 years), mothers and fathers reported the Big Five personality dimensions 4 times across 8 years. Children's vector length in a 5-dimensional configuration of the Big Five dimensions represented personality extremity. Mothers, fathers, and teachers reported children's internalizing and externalizing problems at the 1st and final measurements. In a cohort-sequential design, we modeled personality extremity in children and adolescents from ages 6 to 17 years. Growth mixture modeling revealed a similar solution for both mother and father reports: a large group with relatively short vectors that were stable over time (mother reports: 80.3%; father reports: 84.7%) and 2 smaller groups with relatively long vectors (i.e., extreme personality configurations). One group started out relatively extreme and decreased over time (mother reports: 13.2%; father reports: 10.4%), whereas the other group started out only slightly higher than the short-vector group but increased across time (mother reports: 6.5%; father reports: 4.9%). Children who belonged to the increasingly extreme class experienced more internalizing and externalizing problems in late adolescence, controlling for previous levels of adjustment problems and the Big Five personality dimensions. Personality extremity may be important to consider when identifying children at risk for adjustment problems. PsycINFO Database Record (c) 2013 APA, all rights reserved.
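The extremity measure itself is straightforward: the vector length of a Big Five profile after centering each dimension at the midpoint. A minimal numpy sketch with hypothetical 1-7 ratings (the midpoint value here is an assumption for illustration):

```python
import numpy as np

# Hypothetical Big Five profiles on a 1-7 scale.
big_five = np.array([
    [4.1, 3.9, 4.0, 4.2, 3.8],  # near the midpoint -> short vector
    [6.8, 1.4, 6.5, 1.9, 6.7],  # deviates on every dimension -> long vector
])
midpoint = 4.0

# Personality extremity = Euclidean vector length of the centered profile.
extremity = np.linalg.norm(big_five - midpoint, axis=1)
print(extremity.round(2))
```

A child scoring near the midpoint on all five dimensions gets a short vector, while one deviating strongly on several dimensions gets a long vector, regardless of the direction of the deviations.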
NASA Astrophysics Data System (ADS)
da Costa, Diogo Ricardo; Hansen, Matheus; Guarise, Gustavo; Medrano-T, Rene O.; Leonel, Edson D.
2016-04-01
We show that extreme orbits, trajectories that connect local maximum and minimum values of one-dimensional maps, play a major role in the parameter space of dissipative systems, dictating the organization of the windows of periodicity and hence producing sets of shrimp-like structures. Here we solve three fundamental problems regarding the distribution of these sets and give: (i) their precise localization in the parameter space, even for sets of very high periods; (ii) their local and global distributions along cascades; and (iii) the association of these cascades with complicated sets of periodicity. The extreme orbits prove to be a powerful indicator for investigating the organization of windows of periodicity in parameter planes. As applications of the theory, we obtain some results for the circle map and the perturbed logistic map. The formalism presented here can be extended to many other nonlinear and dissipative systems.
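For a one-dimensional map, the extreme orbit is simply the trajectory launched from the map's extremum. A minimal sketch for the logistic map, where the critical point x = 1/2 carries the maximum; the parameter value is chosen near the superstable period-3 window (an illustrative choice, not a value from the paper):

```python
import numpy as np

def extreme_orbit(r, n=300, x0=0.5):
    """Iterate the logistic map x -> r x (1 - x) from the critical point
    x = 1/2, where the map attains its maximum (the 'extreme orbit')."""
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return np.array(xs)

# Near r ~ 3.8319 the critical point is attracted to a stable 3-cycle,
# i.e. the parameter sits inside a period-3 window.
orbit = extreme_orbit(3.8319)
print(orbit[-3:].round(4))  # the orbit settles onto a 3-cycle
```

Within a window of periodicity, the extreme orbit converges to the attracting cycle, which is why tracking it locates the windows (and the shrimp-shaped structures) in parameter space.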
Numerical solution of the electron transport equation
NASA Astrophysics Data System (ADS)
Woods, Mark
The electron transport equation has been solved many times for a variety of reasons. The main difficulty in its numerical solution is that it is a very stiff boundary value problem. The most common numerical methods for solving boundary value problems are symmetric collocation methods and shooting methods. Both of these types of methods can be applied to the electron transport equation only if the boundary conditions are altered with unrealistic assumptions, because they require too many points to be practical. Further, they result in oscillating and negative solutions, which are physically meaningless for the problem at hand. For these reasons, all numerical methods for this problem to date have been somewhat unusual, because they were designed to try to avoid the problem of extreme stiffness. This dissertation shows that there is no need to introduce spurious boundary conditions or invent other numerical methods for the electron transport equation. Rather, methods for very stiff boundary value problems already exist within the numerical analysis literature. We demonstrate one such method, in which the fast and slow modes of the boundary value problem are essentially decoupled. This allows an upwind finite difference method to be applied to each mode as appropriate. It greatly reduces the number of points needed in the mesh, and we demonstrate how this eliminates the need to define new boundary conditions. The method is verified by showing that, under certain restrictive assumptions, the electron transport equation has an exact solution that can be written as an integral. We show that the solution from the upwind method agrees with the quadrature evaluation of the exact solution. This serves to verify that the upwind method is properly solving the electron transport equation. Further, it is demonstrated that the output of the upwind method can be used to compute auroral light emissions.
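The upwind idea can be sketched on a model stiff two-point boundary value problem (an illustrative convection-diffusion surrogate, not the electron transport equation itself): discretizing -eps*u'' + u' = 0 with the first-derivative term differenced against the wind keeps the discrete solution monotone, avoiding the oscillating and negative values the abstract warns about.

```python
def solve_tridiagonal(lower, diag, upper, rhs):
    """Thomas algorithm for a tridiagonal linear system."""
    n = len(diag)
    c, d = upper[:], rhs[:]
    c[0] /= diag[0]
    d[0] /= diag[0]
    for i in range(1, n):
        m = diag[i] - lower[i] * c[i - 1]
        if i < n - 1:
            c[i] /= m
        d[i] = (d[i] - lower[i] * d[i - 1]) / m
    for i in range(n - 2, -1, -1):
        d[i] -= c[i] * d[i + 1]
    return d

def upwind_bvp(eps=0.01, n=200):
    """Upwind finite differences for the model stiff problem
    -eps*u'' + u' = 0, u(0) = 0, u(1) = 1 (boundary layer at x = 1).
    The backward difference on u' follows the 'wind', giving a monotone
    discrete solution free of spurious oscillations."""
    h = 1.0 / n
    m = n - 1                                  # interior unknowns
    lower = [-eps / h ** 2 - 1.0 / h] * m
    diag = [2 * eps / h ** 2 + 1.0 / h] * m
    upper = [-eps / h ** 2] * m
    rhs = [0.0] * m
    rhs[-1] -= upper[-1] * 1.0                 # boundary condition u(1) = 1
    u = solve_tridiagonal(lower, diag, upper, rhs)
    return [0.0] + u + [1.0]
```

Even with only 200 mesh points the computed solution stays within [0, 1] and increases monotonically through the layer, which is the qualitative behavior a centered scheme would lose on a stiff problem.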
Substance abuse and developments in harm reduction.
Cheung, Y W
2000-06-13
A drug is a substance that produces a psychoactive, chemical or medicinal effect on the user. The psychoactive effect of mood-altering drugs is modulated by the user's perception of the risks of drug use, his or her ability to control drug use and the demographic, socioeconomic and cultural context. The ability to control drug use may vary along a continuum from compulsive use at one end to controlled use at the other. The "drug problem" has been socially constructed, and the presence of a moral panic has led to public support for the prohibitionist approach. The legalization approach has severely attacked the dominant prohibitionist approach but has failed to gain much support in society because of its extreme libertarian views. The harm reduction approach, which is based on public health principles, avoids the extremes of value-loaded judgements on drug use and focuses on the reduction of drug-related harm through pragmatic and low-threshold programs. This approach is likely to be important in tackling the drug problem in the 21st century.
Spinella, Marcello; Lester, David; Yang, Bijou
2015-12-01
Compulsive buying behavior is typically viewed as pathological, but recent research has shown that compulsive buying tendencies are associated with attitudes toward money, personal financial behavior, and having materialistic values, suggesting that compulsive buyers are manifesting an extreme form of habits shown by people in general. In a study of 240 community residents, scores on the Compulsive Buying Scale were associated positively with scores on the Material Values Scale and the Canadian Problem Gambling Index, and negatively with scores on the Executive Personal Finance Scale and Ardelt's wisdom scale. These results suggest that, as is the case for many abnormal behaviors, tendencies toward compulsive buying may not be pathological, but are associated with attitudes toward money in general, financial management behavior, and materialistic values.
Min and Max Exponential Extreme Interval Values and Statistics
ERIC Educational Resources Information Center
Jance, Marsha; Thomopoulos, Nick
2009-01-01
The extreme interval values and statistics (expected value, median, mode, standard deviation, and coefficient of variation) for the smallest (min) and largest (max) values of exponentially distributed variables with parameter λ = 1 are examined for different observation (sample) sizes. An extreme interval value g[subscript a] is defined as a…
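Two closed-form facts about exponential extremes can be checked numerically (a sketch; the sample size n = 5 is chosen arbitrarily): the minimum of n iid Exponential(1) variables is itself Exponential(n), so its mean is 1/n, while the mean of the maximum is the harmonic number H_n.

```python
import random

def exact_min_max_means(n):
    """For n iid Exponential(1) draws: the min is Exponential(n), so
    E[min] = 1/n, and E[max] is the harmonic number H_n = sum 1/i."""
    return 1.0 / n, sum(1.0 / i for i in range(1, n + 1))

def simulated_min_max_means(n, trials=20000, seed=42):
    """Monte Carlo check of the two exact means."""
    rng = random.Random(seed)
    mins = maxs = 0.0
    for _ in range(trials):
        sample = [rng.expovariate(1.0) for _ in range(n)]
        mins += min(sample)
        maxs += max(sample)
    return mins / trials, maxs / trials

exact = exact_min_max_means(5)      # (0.2, 137/60 ≈ 2.2833)
sim = simulated_min_max_means(5)    # close to the exact values
```

For n = 5 the exact means are 0.2 and H_5 = 137/60; the simulation reproduces both to within Monte Carlo error.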
Comparative Methodology and Postmodern Relativism
NASA Astrophysics Data System (ADS)
Young, Robert
1997-09-01
The author addresses the problems of conducting comparative studies in education if one adopts a viewpoint of postmodern relativism. While acknowledging the value of postmodernist thought in opening up a new understanding of the educational process, he finds that postmodernism raises difficulties when one attempts to deal with the differences and interactions between cultures. He rejects the extremes of both relativism and universalism and argues that comparative studies should be based on a balance between the two.
Monsoon Forecasting based on Imbalanced Classification Techniques
NASA Astrophysics Data System (ADS)
Ribera, Pedro; Troncoso, Alicia; Asencio-Cortes, Gualberto; Vega, Inmaculada; Gallego, David
2017-04-01
Monsoonal systems are quasiperiodic processes of the climatic system that control seasonal precipitation over different regions of the world. The Western North Pacific Summer Monsoon (WNPSM) is one of those monsoons, and it is known to have a great impact both on the global climate and on the total precipitation of very densely populated areas. The interannual variability of the WNPSM over the last 50-60 years has been related to different climatic indices such as El Niño, El Niño Modoki, the Indian Ocean Dipole or the Pacific Decadal Oscillation. Recently, a new and longer series characterizing the monthly evolution of the WNPSM, the WNP Directional Index (WNPDI), has been developed, extending its previous length from about 50 years to more than 100 years (1900-2007). Imbalanced classification techniques have been applied to the WNPDI in order to check the capability of traditional climate indices to capture and forecast the evolution of the WNPSM. The problem of forecasting has been transformed into a binary classification problem, in which the positive class represents the occurrence of an extreme monsoon event. Given that the number of extreme monsoons is much lower than the number of non-extreme monsoons, the resultant classification problem is highly imbalanced. The complete dataset is composed of 1296 instances, of which only 71 (5.47%) samples correspond to extreme monsoons. Twenty predictor variables based on the cited climatic indices have been proposed. Models based on trees, black-box models such as neural networks, support vector machines and nearest neighbors, and ensemble-based techniques such as random forests have been used to forecast the occurrence of extreme monsoons. It can be concluded that the methodology proposed here reports promising results according to the quality parameters evaluated and predicts extreme monsoons for a temporal horizon of one month with high accuracy.
From a climatological point of view, models based on trees show that the El Niño Modoki index in the months preceding an extreme monsoon acts as its best predictor. In most cases, the value of the Indian Ocean Dipole index acts as a second-order classifier. The El Niño index, more frequently, and the Pacific Decadal Oscillation index, in only one case, also modulate the intensity of the WNPSM.
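The severity of the imbalance reported above (71 extreme monsoons out of 1296 instances) is easy to quantify. This sketch, not from the paper, shows why plain accuracy is a poor quality parameter for such data and computes the common "balanced" class-weight heuristic:

```python
def majority_baseline(n_total, n_positive):
    """Accuracy of always predicting the majority (non-extreme) class,
    plus its recall on the rare extreme-monsoon class."""
    accuracy = (n_total - n_positive) / n_total
    recall_extreme = 0.0          # the rare class is never predicted
    return accuracy, recall_extreme

def balanced_class_weights(n_total, n_positive):
    """Common 'balanced' reweighting heuristic: w_c = n_total / (2 * n_c),
    so misclassifying a rare extreme monsoon costs far more."""
    n_negative = n_total - n_positive
    return {0: n_total / (2 * n_negative), 1: n_total / (2 * n_positive)}

acc, rec = majority_baseline(1296, 71)    # ~94.5% accuracy, zero recall
weights = balanced_class_weights(1296, 71)
```

A trivial classifier that never predicts an extreme monsoon already scores about 94.5% accuracy while detecting nothing, which is why recall-oriented quality parameters and reweighting are needed.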
NASA Astrophysics Data System (ADS)
Alexandre, E.; Cuadra, L.; Nieto-Borge, J. C.; Candil-García, G.; del Pino, M.; Salcedo-Sanz, S.
2015-08-01
Wave parameters computed from time series measured by buoys (significant wave height Hs, mean wave period, etc.) play a key role in coastal engineering and in the design and operation of wave energy converters. Storms or navigation accidents can make measuring buoys break down, leading to missing-data gaps. In this paper we tackle the problem of locally reconstructing Hs at out-of-operation buoys by using wave parameters from nearby buoys, based on the spatial correlation among values at neighboring buoy locations. The novelty of our approach for its potential application to problems in coastal engineering is twofold. On one hand, we propose a genetic algorithm hybridized with an extreme learning machine that selects, among the available wave parameters from the nearby buoys, a subset FnSP with nSP parameters that minimizes the Hs reconstruction error. On the other hand, we evaluate to what extent the selected parameters in subset FnSP are good enough to assist other machine learning (ML) regressors (extreme learning machines, support vector machines and Gaussian process regression) in reconstructing Hs. The results show that all the ML methods explored achieve a good Hs reconstruction in the two different locations studied (Caribbean Sea and West Atlantic).
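The selection step can be caricatured with a toy genetic algorithm over feature bitmasks. The fitness function below is a hypothetical stand-in for the extreme-learning-machine reconstruction error used in the paper, and every name and parameter value here is illustrative only:

```python
import random

def toy_fitness(mask, useful=frozenset({1, 4, 7})):
    """Hypothetical stand-in for the Hs reconstruction error: error falls
    when the (pretend) informative wave parameters are included and rises
    slightly with every extra parameter selected. Lower is better."""
    selected = {i for i, bit in enumerate(mask) if bit}
    missing = len(useful - selected)
    return missing + 0.05 * len(selected)

def select_features(n_features=12, pop_size=30, generations=60, seed=7):
    """Elitist GA over feature bitmasks: keep the best half, breed the rest
    with one-point crossover and a single-bit mutation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_features)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=toy_fitness)
        survivors = pop[: pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n_features)   # one-point crossover
            child = a[:cut] + b[cut:]
            i = rng.randrange(n_features)        # point mutation
            child[i] ^= 1
            children.append(child)
        pop = survivors + children
    return min(pop, key=toy_fitness)

best = select_features()
```

In the paper the fitness would be the cross-validated Hs reconstruction error of an ELM trained on the selected subset; the GA mechanics are otherwise the same.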
NASA Astrophysics Data System (ADS)
Nikitin, V. N.; Chemodanov, V. B.
2018-02-01
The degree of stability of a laser system for surface scanning with nonlinear multiplicative crosstalks is discussed. To determine its stability, the action functional is introduced, which is defined on the set of virtual (achievable) trajectories. The action functional is a measure of the external action that should be applied to a system to move it along a predetermined trial trajectory in the state space. The degree of stability of the system depends on the minimum value of the action functional, which is reached on the extreme trajectory transferring the laser scanning system from equilibrium to the limit of the normal operation range. Numerical methods are proposed for calculating the degree of stability.
The problem of extreme events in paired-watershed studies
James W. Hornbeck
1973-01-01
In paired-watershed studies, the occurrence of an extreme event during the after-treatment period presents a problem: the effects of treatment must be determined by using greatly extrapolated regression statistics. Several steps are presented to help ensure careful handling of extreme events during analysis and reporting of research results.
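The calibration-regression mechanics behind this problem can be sketched as follows (all numbers synthetic; the flagging rule is a simple illustration of when an extreme event forces extrapolation beyond the calibration range):

```python
def fit_line(x, y):
    """Ordinary least squares for the calibration-period regression of
    treated-watershed yield on control-watershed yield."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return my - slope * mx, slope              # intercept, slope

def predict(x_new, x_cal, intercept, slope):
    """Predict the 'no treatment' response; flag extreme events that force
    extrapolation beyond the calibration range, where the regression
    statistics are least trustworthy."""
    extrapolated = not (min(x_cal) <= x_new <= max(x_cal))
    return intercept + slope * x_new, extrapolated

x_cal = [10, 14, 18, 22, 26]                   # control yields, calibration
y_cal = [12, 17, 21, 26, 30]                   # treated yields, calibration
b0, b1 = fit_line(x_cal, y_cal)
pred, flagged = predict(60, x_cal, b0, b1)     # extreme event: extrapolation
```

A real analysis would widen the prediction interval with distance from the calibration mean rather than merely flagging, but the flag marks exactly the situation the abstract describes.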
Assessing the features of extreme smog in China and the differentiated treatment strategy
NASA Astrophysics Data System (ADS)
Deng, Lu; Zhang, Zhengjun
2018-01-01
Extreme smog can have potentially harmful effects on human health, the economy and daily life. However, the average (mean) values do not provide strategically useful information on the hazard analysis and control of extreme smog. This article investigates China's smog extremes by applying extreme value analysis to hourly PM2.5 data from 2014 to 2016 obtained from monitoring stations across China. By fitting a generalized extreme value (GEV) distribution to exceedances over a station-specific extreme smog level at each monitoring location, all study stations are grouped into eight different categories based on the estimated mean and shape parameter values of fitted GEV distributions. The extreme features characterized by the mean of the fitted extreme value distribution, the maximum frequency and the tail index of extreme smog at each location are analysed. These features can provide useful information for central/local government to conduct differentiated treatments in cities within different categories and conduct similar prevention goals and control strategies among those cities belonging to the same category in a range of areas. Furthermore, hazardous hours, breaking probability and the 1-year return level of each station are demonstrated by category, based on which the future control and reduction targets of extreme smog are proposed for the cities of Beijing, Tianjin and Hebei as an example.
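The return-level calculation underlying such an analysis follows directly from the GEV quantile function (the parameter values below are hypothetical, not the paper's fitted values for any station):

```python
import math

def gev_return_level(mu, sigma, xi, T):
    """T-observation return level of a GEV(mu, sigma, xi): the quantile
    exceeded once in T observations on average,
    z = mu + (sigma/xi) * ((-ln(1 - 1/T))**(-xi) - 1) for xi != 0."""
    p = 1.0 - 1.0 / T
    if abs(xi) < 1e-12:                       # Gumbel limit as xi -> 0
        return mu - sigma * math.log(-math.log(p))
    return mu + (sigma / xi) * ((-math.log(p)) ** (-xi) - 1.0)

def gev_cdf(z, mu, sigma, xi):
    """GEV distribution function for xi != 0 (valid where 1 + xi*(z-mu)/sigma > 0)."""
    t = 1.0 + xi * (z - mu) / sigma
    return math.exp(-t ** (-1.0 / xi))

# Self-consistency: the CDF at the T-observation return level is 1 - 1/T.
z = gev_return_level(mu=150.0, sigma=40.0, xi=0.2, T=365)
```

The shape parameter xi is the tail index the abstract refers to: the larger it is, the heavier the tail and the faster return levels grow with the return period.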
Shim, Je-Myung; Kwon, Hae-Yeon; Kim, Ha-Roo; Kim, Bo-In; Jung, Ju-Hyeon
2013-12-01
[Purpose] The aim of this study was to assess the effect of Nordic pole walking on the electromyographic activities of upper extremity and lower extremity muscles. [Subjects and Methods] The subjects were randomly divided into two groups as follows: a group walking without Nordic poles (n=13) and a group walking with Nordic poles (n=13). The EMG data were collected while the subjects walked on a treadmill for 30 minutes, measured from one heel strike to the next. [Results] Both the average values and maximum values of the muscle activity of the upper extremity increased in both the group that used Nordic poles and the group that did not use Nordic poles, and the values showed statistically significant differences. There was an increase in the average value for muscle activity of the latissimus dorsi, but the difference was not statistically significant, although there was a statistically significant increase in its maximum value. The average and maximum values for muscle activity of the lower extremity did not show large differences in either group, and the values did not show any statistically significant differences. [Conclusion] The use of Nordic poles increased muscle activity of the upper extremity compared with regular walking but did not affect the lower extremity.
Captive breeding of pangolins: current status, problems and future prospects.
Hua, Liushuai; Gong, Shiping; Wang, Fumin; Li, Weiye; Ge, Yan; Li, Xiaonan; Hou, Fanghui
2015-01-01
Pangolins are unique placental mammals, with eight species existing in the world, which have adapted to a highly specialized diet of ants and termites and are of significance in the control of forest termite damage. Besides their ecological value, pangolins are extremely important economic animals with value as medicine and food. At present, illegal hunting and habitat destruction have drastically decreased the wild population of pangolins, pushing them to the edge of extinction. Captive breeding is an important way to protect these species, but because of pangolins' specialized behaviors and high dependence on natural ecosystems, there still exist many technical barriers to successful captive breeding programs. In this paper, based on the literature and our practical experience, we review the status of and existing problems in captive breeding of pangolins in four aspects: the naturalistic habitat, dietary husbandry, reproduction, and disease control. Some recommendations are presented for effective captive breeding and protection of pangolins.
The nonequilibrium quantum many-body problem as a paradigm for extreme data science
NASA Astrophysics Data System (ADS)
Freericks, J. K.; Nikolić, B. K.; Frieder, O.
2014-12-01
Generating big data pervades much of physics. But some problems, which we call extreme data problems, are too large to be treated within big data science. The nonequilibrium quantum many-body problem on a lattice is just such a problem, where the Hilbert space grows exponentially with system size and rapidly becomes too large to fit on any computer (and can be effectively thought of as an infinite-sized data set). Nevertheless, much progress has been made with computational methods on this problem, which serve as a paradigm for how one can approach and attack extreme data problems. In addition, viewing these physics problems from a computer-science perspective leads to new approaches that can be tried to solve more accurately and for longer times. We review a number of these different ideas here.
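The exponential growth the authors describe is easy to make concrete: a lattice of spin-1/2 sites has Hilbert-space dimension 2^N, so merely storing one state vector outgrows any machine (a back-of-the-envelope sketch; complex-double amplitudes assumed):

```python
def hilbert_space_bytes(n_sites, local_dim=2, bytes_per_amplitude=16):
    """Memory needed to store a single state vector of a lattice of
    n_sites, each with local_dim levels, using complex-double amplitudes:
    local_dim**n_sites amplitudes in total."""
    return local_dim ** n_sites * bytes_per_amplitude

# Exponential growth quickly exceeds any computer: a spin-1/2 state
# vector needs 16 GiB at 30 sites and a full 2**64 bytes at 60 sites.
for n in (10, 20, 30, 40):
    print(n, "sites:", hilbert_space_bytes(n) / 2 ** 30, "GiB")
```

This is why the state space can "effectively be thought of as an infinite-sized data set": doubling the lattice squares the storage requirement rather than doubling it.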
Extreme values in the Chinese and American stock markets based on detrended fluctuation analysis
NASA Astrophysics Data System (ADS)
Cao, Guangxi; Zhang, Minjia
2015-10-01
This paper focuses on a comparative analysis of extreme values in the Chinese and American stock markets based on the detrended fluctuation analysis (DFA) algorithm, using daily data of the Shanghai composite index and the Dow Jones Industrial Average. The empirical results indicate that the multifractal detrended fluctuation analysis (MF-DFA) method is more objective than the traditional percentile method. The range of extreme values of the Dow Jones Industrial Average is smaller than that of the Shanghai composite index, and the extreme values of the Dow Jones Industrial Average show stronger temporal clustering. The extreme values of both the Chinese and American stock markets are concentrated in 2008, which is consistent with the financial crisis of 2008. Moreover, we investigate whether extreme events affect the cross-correlation between the Chinese and American stock markets using the multifractal detrended cross-correlation analysis algorithm. The results show that extreme events have no effect on the cross-correlation between the Chinese and American stock markets.
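A bare-bones version of the DFA algorithm itself (first-order detrending only, without the multifractal q-moments of MF-DFA; the scales and test series are illustrative) looks like this:

```python
import math
import random

def dfa_exponent(series, scales=(8, 16, 32, 64, 128)):
    """First-order detrended fluctuation analysis: integrate the demeaned
    series, detrend each window of size s with a least-squares line, and
    fit log F(s) vs log s. The slope is the DFA exponent alpha
    (~0.5 for uncorrelated noise, larger for persistent series)."""
    mean = sum(series) / len(series)
    profile, acc = [], 0.0
    for v in series:                      # cumulative sum of deviations
        acc += v - mean
        profile.append(acc)
    log_s, log_f = [], []
    for s in scales:
        n_win = len(profile) // s
        sq = 0.0
        for w in range(n_win):            # detrend each window linearly
            seg = profile[w * s:(w + 1) * s]
            xs = range(s)
            mx, my = (s - 1) / 2.0, sum(seg) / s
            sxx = sum((x - mx) ** 2 for x in xs)
            sxy = sum((x - mx) * (y - my) for x, y in zip(xs, seg))
            b = sxy / sxx
            a = my - b * mx
            sq += sum((y - (a + b * x)) ** 2 for x, y in zip(xs, seg))
        log_s.append(math.log(s))
        log_f.append(0.5 * math.log(sq / (n_win * s)))
    mx = sum(log_s) / len(log_s)          # slope of the log-log fit
    my = sum(log_f) / len(log_f)
    num = sum((x - mx) * (y - my) for x, y in zip(log_s, log_f))
    den = sum((x - mx) ** 2 for x in log_s)
    return num / den

rng = random.Random(0)
alpha = dfa_exponent([rng.gauss(0, 1) for _ in range(4096)])  # near 0.5
```

For white noise the estimated alpha clusters around 0.5; persistent (long-memory) series push it toward 1, which is the scaling behavior MF-DFA generalizes across moments.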
Hermeneutics of differential calculus in eighteenth-century northern Germany.
Blanco, Mónica
2008-01-01
This paper applies comparative textbook analysis to studying the mathematical development of differential calculus in northern German states during the eighteenth century. It begins with describing how the four textbooks analyzed presented the foundations of calculus and continues with assessing the influence each of these foundational approaches exerted on the resolution of problems, such as the determination of tangents and extreme values, and even on the choice of coordinates for both algebraic and transcendental curves.
Min-Max Spaces and Complexity Reduction in Min-Max Expansions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gaubert, Stephane, E-mail: Stephane.Gaubert@inria.fr; McEneaney, William M., E-mail: wmceneaney@ucsd.edu
2012-06-15
Idempotent methods have been found to be extremely helpful in the numerical solution of certain classes of nonlinear control problems. In those methods, one uses the fact that the value function lies in the space of semiconvex functions (in the case of maximizing controllers), and approximates this value using a truncated max-plus basis expansion. In some classes, the value function is actually convex, and then one specifically approximates with suprema (i.e., max-plus sums) of affine functions. Note that the space of convex functions is a max-plus linear space, or moduloid. In extending those concepts to game problems, one finds a different function space, and a different algebra, to be appropriate. Here we consider functions which may be represented using infima (i.e., min-max sums) of max-plus affine functions. It is natural to refer to the class of functions so represented as the min-max linear space (or moduloid) of max-plus hypo-convex functions. We examine this space, the associated notion of duality, and min-max basis expansions. In using these methods for the solution of control problems, and now games, a critical step is complexity reduction. In particular, one needs to find reduced-complexity expansions which approximate the function as well as possible. We obtain a solution to this complexity-reduction problem in the case of min-max expansions.
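The max-plus side of this construction is easy to sketch: a convex function is the supremum of its tangent affine functions, and complexity reduction amounts to discarding affine terms that never attain the supremum. The pruning criterion below (attainment on a sample grid) is a toy illustration, not the paper's method:

```python
def maxplus_eval(lines, x):
    """Evaluate a max-plus expansion: the supremum of affine functions
    a*x + b, which represents a convex function exactly in the limit."""
    return max(a * x + b for a, b in lines)

def tangent_lines(f_prime, f, knots):
    """Tangent lines of a convex f at the given knots:
    slope f'(k), intercept f(k) - f'(k)*k."""
    return [(f_prime(k), f(k) - f_prime(k) * k) for k in knots]

def prune(lines, grid):
    """Toy complexity reduction: keep only affine functions that attain
    the maximum somewhere on the sample grid."""
    keep = set()
    for x in grid:
        vals = [a * x + b for a, b in lines]
        keep.add(vals.index(max(vals)))
    return [lines[i] for i in sorted(keep)]

def f(x):
    return x * x

def f_prime(x):
    return 2 * x

grid = [i / 10.0 for i in range(-20, 21)]
lines = tangent_lines(f_prime, f, [i / 2.0 for i in range(-4, 5)])
reduced = prune(lines, grid)
# The reduced expansion reproduces the same max-plus values on the grid.
```

The min-max expansions of the paper wrap one more layer around this picture (infima of such max-plus affine terms), but the complexity-reduction question is the same: which basis terms can be dropped without changing the represented function.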
Valuing happiness is associated with bipolar disorder.
Ford, Brett Q; Mauss, Iris B; Gruber, June
2015-04-01
Although people who experience happiness tend to have better psychological health, people who value happiness to an extreme tend to have worse psychological health, including more depression. We propose that the extreme valuing of happiness may be a general risk factor for mood disturbances, both depressive and manic. To test this hypothesis, we examined the relationship between the extreme valuing of happiness and risk for, diagnosis of, and illness course for bipolar disorder (BD). Supporting our hypothesis, the extreme valuing of happiness was associated with a measure of increased risk for developing BD (Studies 1 and 2), increased likelihood of past diagnosis of BD (Studies 2 and 3), and worse prospective illness course in BD (Study 3), even when controlling for current mood symptoms (Studies 1-3). These findings indicate that the extreme valuing of happiness is associated with and even predicts BD. Taken together with previous evidence, these findings suggest that the extreme valuing of happiness is a general risk factor for mood disturbances. More broadly, what emotions people strive to feel may play a critical role in psychological health. (c) 2015 APA, all rights reserved.
Solving the wrong hierarchy problem
Blinov, Nikita; Hook, Anson
2016-06-29
Many theories require augmenting the Standard Model with additional scalar fields with large order-one couplings. We present a new solution to the hierarchy problem for these scalar fields. We explore parity- and Z2-symmetric theories where the Standard Model Higgs potential has two vacua. The parity or Z2 copy of the Higgs lives in the minimum far from the origin while our Higgs occupies the minimum near the origin of the potential. This approach results in a theory with multiple light scalar fields but with only a single hierarchy problem, since the bare mass is tied to the Higgs mass by a discrete symmetry. The new scalar does not have a new hierarchy problem associated with it because its expectation value and mass are generated by dimensional transmutation of the scalar quartic coupling. The location of the second Higgs minimum is not a free parameter, but is rather a function of the matter content of the theory. As a result, these theories are extremely predictive. We develop this idea in the context of a solution to the strong CP problem. Lastly, we show this mechanism postdicts the top Yukawa to be within 1σ of the currently measured value and predicts scalar color octets with masses in the range 9-200 TeV.
A user-targeted synthesis of the VALUE perfect predictor experiment
NASA Astrophysics Data System (ADS)
Maraun, Douglas; Widmann, Martin; Gutierrez, Jose; Kotlarski, Sven; Hertig, Elke; Wibig, Joanna; Rössler, Ole; Huth, Radan
2016-04-01
VALUE is an open European network to validate and compare downscaling methods for climate change research. A key deliverable of VALUE is the development of a systematic validation framework to enable the assessment and comparison of both dynamical and statistical downscaling methods. VALUE's main approach to validation is user-focused: starting from a specific user problem, a validation tree guides the selection of relevant validation indices and performance measures. We consider different aspects: (1) marginal aspects such as mean, variance and extremes; (2) temporal aspects such as spell length characteristics; (3) spatial aspects such as the de-correlation length of precipitation extremes; and (4) multi-variate aspects such as the interplay of temperature and precipitation or scale interactions. Several experiments have been designed to isolate specific points in the downscaling procedure where problems may occur. Experiment 1 (perfect predictors): what is the isolated downscaling skill? How do statistical and dynamical methods compare? How do methods perform at different spatial scales? Experiment 2 (global climate model predictors): how good is the overall representation of regional climate, including errors inherited from global climate models? Experiment 3 (pseudo reality): do methods fail in representing regional climate change? Here, we present a user-targeted synthesis of the results of the first VALUE experiment. In this experiment, downscaling methods are driven with ERA-Interim reanalysis data to eliminate global climate model errors, over the period 1979-2008. As reference data we use, depending on the question addressed, (1) observations from 86 meteorological stations distributed across Europe; (2) gridded observations at the corresponding 86 locations; or (3) gridded, spatially extended observations for selected European regions. With more than 40 contributing methods, this study is the most comprehensive downscaling inter-comparison project so far.
The results clearly indicate that for several aspects, the downscaling skill varies considerably between different methods. For specific purposes, some methods can therefore clearly be excluded.
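One of the temporal validation indices mentioned above, spell length, is straightforward to compute from a daily precipitation series (a sketch; the 1 mm wet-day threshold is a common convention, assumed here rather than taken from VALUE):

```python
def spell_lengths(precip, wet_threshold=1.0):
    """Wet/dry spell lengths (a temporal validation aspect): run lengths of
    consecutive days at or above / below the wet-day threshold (mm/day)."""
    wet, dry, run, state = [], [], 0, None
    for p in precip:
        s = p >= wet_threshold
        if s == state:
            run += 1
        else:
            if state is True:
                wet.append(run)
            elif state is False:
                dry.append(run)
            state, run = s, 1
    (wet if state else dry).append(run)      # close the final run
    return wet, dry

series = [0.0, 5.2, 3.1, 0.2, 0.0, 0.0, 12.4, 0.9, 2.0]
wet, dry = spell_lengths(series)             # wet: [2, 1, 1], dry: [1, 3, 1]
```

A downscaling method can match daily means while badly distorting such run-length distributions, which is exactly why the framework validates temporal aspects separately from marginal ones.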
Weeks, William B; Kotzbauer, Gregory R; Weinstein, James N
2016-06-01
Using publicly available Hospital Compare and Medicare data, we found a substantial range of hospital-level performance on quality, expenditure, and value measures for 4 common reasons for admission. Hospitals' ability to consistently deliver high-quality, low-cost care varied across the different reasons for admission. With the exception of coronary artery bypass grafting, hospitals that provided the highest-value care had more beds and a larger average daily census than those providing the lowest-value care. Transparent data like those we present can empower patients to compare hospital performance, make better-informed treatment decisions, and decide where to obtain care for particular health care problems. In the United States, the transition from volume to value dominates discussions of health care reform. While shared decision making might help patients determine whether to get care, transparency in procedure- and hospital-specific value measures would help them determine where to get care. Using Hospital Compare and Medicare expenditure data, we constructed a hospital-level measure of value from a numerator composed of quality-of-care measures (satisfaction, use of timely and effective care, and avoidance of harms) and a denominator composed of risk-adjusted 30-day episode-of-care expenditures for acute myocardial infarction (1,900 hospitals), coronary artery bypass grafting (884 hospitals), colectomy (1,252 hospitals), and hip replacement surgery (1,243 hospitals). We found substantial variation in aggregate measures of quality, cost, and value at the hospital level. Value calculation provided additional richness when compared to assessment based on quality or cost alone: about 50% of hospitals in an extreme quality quintile (and about 65% more in an extreme cost quintile) were in the same extreme value quintile.
With the exception of coronary artery bypass grafting, higher-value hospitals were larger and had a higher average daily census than lower-value hospitals, but were no more likely to be accredited by the Joint Commission or to have a residency program accredited by the American Council of Graduate Medical Education. While future efforts to compose value measures will certainly be modified and expanded to examine other reasons for admission, the construct that we present could allow patients to transparently compare procedure- and hospital-specific quality, spending, and value and empower them to decide where to obtain care. © 2016 Milbank Memorial Fund.
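The quintile construction described above can be sketched with hypothetical hospitals (all numbers invented; value is taken, as in the study, to be a quality composite divided by risk-adjusted episode expenditure):

```python
def quintile(values):
    """Assign each hospital to a quintile (1 = lowest fifth) by rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    q = [0] * len(values)
    for rank, i in enumerate(order):
        q[i] = rank * 5 // len(values) + 1
    return q

# Hypothetical hospitals: quality composite in [0, 1] and risk-adjusted
# 30-day episode expenditure in dollars; value = quality per dollar.
quality = [0.82, 0.91, 0.75, 0.88, 0.70, 0.95, 0.79, 0.85, 0.73, 0.90]
cost = [21e3, 18e3, 25e3, 19e3, 27e3, 17e3, 23e3, 20e3, 26e3, 18.5e3]
value = [q / c for q, c in zip(quality, cost)]
vq = quintile(value)
```

Cross-tabulating the extreme value quintiles against the extreme quality and cost quintiles is then a one-line set operation, which is how overlap figures like those reported above can be derived.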
Uncertainties in obtaining high reliability from stress-strength models
NASA Technical Reports Server (NTRS)
Neal, Donald M.; Matthews, William T.; Vangel, Mark G.
1992-01-01
There has been a recent interest in determining high statistical reliability in risk assessment of aircraft components. The potential consequences are identified of incorrectly assuming a particular statistical distribution for stress or strength data used in obtaining the high reliability values. The computation of the reliability is defined as the probability of the strength being greater than the stress over the range of stress values. This method is often referred to as the stress-strength model. A sensitivity analysis was performed involving a comparison of reliability results in order to evaluate the effects of assuming specific statistical distributions. Both known population distributions, and those that differed slightly from the known, were considered. Results showed substantial differences in reliability estimates even for almost nondetectable differences in the assumed distributions. These differences represent a potential problem in using the stress-strength model for high reliability computations, since in practice it is impossible to ever know the exact (population) distribution. An alternative reliability computation procedure is examined involving determination of a lower bound on the reliability values using extreme value distributions. This procedure reduces the possibility of obtaining nonconservative reliability estimates. Results indicated the method can provide conservative bounds when computing high reliability.
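The stress-strength computation itself is compact when both quantities are assumed normal, and a Monte Carlo check makes the definition concrete (illustrative parameters; the report's point is that the result is highly sensitive to this distributional assumption, which the closed form silently encodes):

```python
import random
from statistics import NormalDist

def reliability_normal(mu_s, sd_s, mu_l, sd_l):
    """Closed-form stress-strength reliability P(strength > stress) when
    both are normal: Phi((mu_s - mu_l) / sqrt(sd_s**2 + sd_l**2))."""
    return NormalDist().cdf((mu_s - mu_l) / (sd_s ** 2 + sd_l ** 2) ** 0.5)

def reliability_mc(mu_s, sd_s, mu_l, sd_l, trials=200000, seed=1):
    """Monte Carlo estimate of the same probability, by direct sampling."""
    rng = random.Random(seed)
    wins = sum(rng.gauss(mu_s, sd_s) > rng.gauss(mu_l, sd_l)
               for _ in range(trials))
    return wins / trials

exact = reliability_normal(5.0, 1.0, 3.0, 1.0)   # Phi(sqrt(2)) ≈ 0.921
approx = reliability_mc(5.0, 1.0, 3.0, 1.0)
```

Swapping either marginal for a nearly indistinguishable heavier-tailed distribution changes the extreme upper tail of this probability substantially, which is why the report turns to extreme value distributions for conservative lower bounds.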
Si, Yifan; Guo, Zhiguang; Liu, Weimin
2016-06-29
Superhydrophobic coating has extremely high application value and practicability. However, some difficult problems such as weak mechanical strength, the need for expensive toxic reagents, and a complex preparation process are all hard to avoid, and these problems have impeded the superhydrophobic coating's real-life application for a long time. Here, we demonstrate one kind of omnipotent epoxy resins @ stearic acid-Mg(OH)2 superhydrophobic coating via a simple antideposition route and one-step superhydrophobization process. The whole preparation process is facile, and no expensive toxic reagents are needed. This omnipotent coating can be applied on any solid substrate with great waterproof ability, excellent mechanical stability, and chemical durability, and it can be stored in a realistic environment for more than 1 month. More significantly, this superhydrophobic coating also has four protective abilities, antifouling, anticorrosion, anti-icing, and flame-retardancy, to cope with a variety of possible extreme natural environments. Therefore, this omnipotent epoxy resins @ stearic acid-Mg(OH)2 superhydrophobic coating not only satisfies real-life needs but also has great application potential in many respects.
Modified Inverse First Order Reliability Method (I-FORM) for Predicting Extreme Sea States.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eckert-Gallup, Aubrey Celia; Sallaberry, Cedric Jean-Marie; Dallman, Ann Renee
Environmental contours describing extreme sea states are generated as the input for numerical or physical model simulations as a part of the standard current practice for designing marine structures to survive extreme sea states. Such environmental contours are characterized by combinations of significant wave height and energy period values calculated for a given recurrence interval using a set of data based on hindcast simulations or buoy observations over a sufficient period of record. The use of the inverse first-order reliability method (IFORM) is standard design practice for generating environmental contours. In this paper, the traditional application of the IFORM to generating environmental contours representing extreme sea states is described in detail and its merits and drawbacks are assessed. The application of additional methods for analyzing sea state data, including the use of principal component analysis (PCA) to create an uncorrelated representation of the data under consideration, is proposed. A reexamination of the components of the IFORM application to the problem at hand, including the use of new distribution fitting techniques, is shown to contribute to the development of more accurate and reasonable representations of extreme sea states for use in survivability analysis for marine structures. Keywords: Inverse FORM, Principal Component Analysis, Environmental Contours, Extreme Sea State Characterization, Wave Energy Converters
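As a rough sketch of the IFORM step described above: points on a circle of radius β in standard-normal space are mapped to physical space through the marginal CDFs. The marginal distributions, their parameters, and the sea-state count per year below are illustrative assumptions, and the two components are treated as independent, as they would be after a PCA-style decorrelation:

```python
import numpy as np
from scipy.stats import lognorm, norm, weibull_min

T_return = 50.0            # target return period in years (assumed)
events_per_year = 2920.0   # 3-hour sea states per year
p_exceed = 1.0 / (T_return * events_per_year)
beta = norm.ppf(1.0 - p_exceed)   # reliability index: contour radius in u-space

# Circle of radius beta in standard-normal coordinates
theta = np.linspace(0.0, 2.0 * np.pi, 100)
u1, u2 = beta * np.cos(theta), beta * np.sin(theta)

# Map to physical space through assumed marginal fits (parameters are made up)
hs = weibull_min.ppf(norm.cdf(u1), 2.0, scale=3.0)   # significant wave height
te = lognorm.ppf(norm.cdf(u2), 0.3, scale=8.0)       # energy period
print(hs.max(), te.max())
```

The (hs, te) pairs trace the environmental contour; a real application would replace the assumed marginals with distributions fitted to hindcast or buoy data.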
Extreme-value dependence: An application to exchange rate markets
NASA Astrophysics Data System (ADS)
Fernandez, Viviana
2007-04-01
Extreme value theory (EVT) focuses on modeling the tail behavior of a loss distribution using only extreme values rather than the whole data set. For a sample of 10 countries with dirty/free float regimes, we investigate whether paired currencies exhibit a pattern of asymptotic dependence. That is, whether an extremely large appreciation or depreciation in the nominal exchange rate of one country might transmit to another. In general, after controlling for volatility clustering and inertia in returns, we do not find evidence of extreme-value dependence between paired exchange rates. However, for asymptotically independent paired returns, we find that tail dependency of exchange rates is stronger under large appreciations than under large depreciations.
Exchangeability, extreme returns and Value-at-Risk forecasts
NASA Astrophysics Data System (ADS)
Huang, Chun-Kai; North, Delia; Zewotir, Temesgen
2017-07-01
In this paper, we propose a new approach to extreme value modelling for the forecasting of Value-at-Risk (VaR). In particular, the block maxima and the peaks-over-threshold methods are generalised to exchangeable random sequences. This caters for the dependencies, such as serial autocorrelation, of financial returns observed empirically. In addition, this approach allows for parameter variations within each VaR estimation window. Empirical prior distributions of the extreme value parameters are attained by using resampling procedures. We compare the results of our VaR forecasts to those of the unconditional extreme value theory (EVT) approach and the conditional GARCH-EVT model for robust conclusions.
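The peaks-over-threshold building block of this approach can be sketched with a generalized Pareto fit to threshold exceedances; the heavy-tailed synthetic "returns", the 95% threshold, and the confidence level below are illustrative assumptions, not the paper's data or method:

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(42)

# Synthetic heavy-tailed daily losses standing in for financial returns
losses = rng.standard_t(4, 5000)

u = np.quantile(losses, 0.95)              # threshold at the 95th percentile
exceedances = losses[losses > u] - u
c, _, scale = genpareto.fit(exceedances, floc=0.0)  # GPD fit, location fixed at 0

# One-day VaR at level p via the standard POT quantile formula
p = 0.99
n, n_u = len(losses), len(exceedances)
var_p = u + (scale / c) * ((n / n_u * (1.0 - p)) ** (-c) - 1.0)
print(var_p)
```

The paper's contribution is to generalise this machinery to exchangeable sequences with resampled empirical priors; the sketch above is only the standard unconditional POT step it builds on.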
Predicting the cosmological constant with the scale-factor cutoff measure
DOE Office of Scientific and Technical Information (OSTI.GOV)
De Simone, Andrea; Guth, Alan H.; Salem, Michael P.
2008-09-15
It is well known that anthropic selection from a landscape with a flat prior distribution of cosmological constant Λ gives a reasonable fit to observation. However, a realistic model of the multiverse has a physical volume that diverges with time, and the predicted distribution of Λ depends on how the spacetime volume is regulated. A very promising method of regulation uses a scale-factor cutoff, which avoids a number of serious problems that arise in other approaches. In particular, the scale-factor cutoff avoids the 'youngness problem' (high probability of living in a much younger universe) and the 'Q and G catastrophes' (high probability for the primordial density contrast Q and gravitational constant G to have extremely large or small values). We apply the scale-factor cutoff measure to the probability distribution of Λ, considering both positive and negative values. The results are in good agreement with observation. In particular, the scale-factor cutoff strongly suppresses the probability for values of Λ that are more than about 10 times the observed value. We also discuss qualitatively the prediction for the density parameter Ω, indicating that with this measure there is a possibility of detectable negative curvature.
OPTIMIZING THROUGH CO-EVOLUTIONARY AVALANCHES
DOE Office of Scientific and Technical Information (OSTI.GOV)
S. BOETTCHER; A. PERCUS
2000-08-01
We explore a new general-purpose heuristic for finding high-quality solutions to hard optimization problems. The method, called extremal optimization, is inspired by "self-organized criticality," a concept introduced to describe emergent complexity in many physical systems. In contrast to genetic algorithms, which operate on an entire "gene pool" of possible solutions, extremal optimization successively replaces extremely undesirable elements of a sub-optimal solution with new, random ones. Large fluctuations, called "avalanches," ensue that efficiently explore many local optima. Drawing upon models used to simulate far-from-equilibrium dynamics, extremal optimization complements approximation methods inspired by equilibrium statistical physics, such as simulated annealing. With only one adjustable parameter, its performance has proved competitive with more elaborate methods, especially near phase transitions. Those phase transitions are found in the parameter space of most optimization problems, and have recently been conjectured to be the origin of some of the hardest instances in computational complexity. We will demonstrate how extremal optimization can be implemented for a variety of combinatorial optimization problems. We believe that extremal optimization will be a useful tool in the investigation of phase transitions in combinatorial optimization problems, hence valuable in elucidating the origin of computational complexity.
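A minimal τ-EO sketch on a toy spin-glass energy illustrates the replace-the-worst dynamics; the problem instance, τ = 1.4, and the iteration budget are illustrative assumptions (the paper applies the heuristic to combinatorial problems such as graph partitioning):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy problem: minimize the spin-glass energy E(s) = -1/2 s^T J s, s_i = +-1
n = 60
J = rng.normal(size=(n, n))
J = (J + J.T) / 2.0
np.fill_diagonal(J, 0.0)

def energy(s):
    return -0.5 * s @ J @ s

s = rng.choice([-1, 1], n)
start_e = energy(s)
best_e, best_s = start_e, s.copy()

# tau-EO: rank spins by local fitness lambda_i = s_i * sum_j J_ij s_j and
# flip a spin drawn from a power law over ranks (tau is the single parameter)
tau = 1.4
ranks = np.arange(1, n + 1, dtype=float)
probs = ranks ** -tau
probs /= probs.sum()

for _ in range(5000):
    lam = s * (J @ s)                        # low fitness = "extremely undesirable"
    order = np.argsort(lam)                  # worst spin first
    s[order[rng.choice(n, p=probs)]] *= -1   # unconditional replacement
    e = energy(s)
    if e < best_e:
        best_e, best_s = e, s.copy()

print(start_e, best_e)
```

Because even the worst spin is only flipped probabilistically, and flips are accepted unconditionally, the dynamics produces the large "avalanche" fluctuations that let the search escape local optima without a temperature schedule.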
Statistic analysis of annual total ozone extremes for the period 1964-1988
NASA Technical Reports Server (NTRS)
Krzyscin, Janusz W.
1994-01-01
Annual extremes of total column amount of ozone (in the period 1964-1988) from a network of 29 Dobson stations have been examined using extreme value analysis. The extremes have been calculated as the highest deviation of daily mean total ozone from its long-term monthly mean, normalized by the monthly standard deviations. The extremes have been selected from the direct-Sun total ozone observations only. Extremes resulting from abrupt changes in ozone (day-to-day changes greater than 20 percent) have not been considered. The ordered extremes (maxima in ascending order, minima in descending order) have been fitted to one of three forms of the Fisher-Tippett extreme value distribution by the nonlinear least squares method (Levenberg-Marquardt method). We have found that the ordered extremes from a majority of Dobson stations lie close to Fisher-Tippett type III. The extreme value analysis of the composite annual extremes (combined from averages of the annual extremes selected at individual stations) has shown that the composite maxima are fitted by the Fisher-Tippett type III and the composite minima by the Fisher-Tippett type I. The difference between the Fisher-Tippett types of the composite extremes seems to be related to the ozone downward trend. Extreme value prognoses for the period 1964-2014 (derived from the data taken at all analyzed stations, the North American stations, and the European stations) have revealed that the prognostic extremes are close to the largest annual extremes in the period 1964-1988 and that there are only small regional differences in the prognoses.
Stewart, Jessica N; McGillivray, David; Sussman, John; Foster, Bethany
2008-10-01
Blood pressure (BP) is measured at triage in most emergency departments (EDs). We aimed to determine the value of triage BP in diagnosing hypotension and true hypertension in children aged ≥3 years presenting with nonurgent problems. In this prospective study, eligible children underwent automated BP measurement at triage. If BP was elevated, then the measurement was repeated manually. Children with a high manual BP were followed. True hypertension was defined as a manual BP >95th percentile for sex, age, and height measured on 3 occasions. Automated triage BP was measured in 549 children (53.4% male; mean age, 9.4 ± 4.3 years) and was found to be elevated in 144 of them (26%). No child was hypotensive. Among the 495 patients with complete follow-up, the specificity and positive predictive value (PPV) of elevated triage BP in diagnosing true hypertension were 81.8% and 0%, respectively. A sensitivity analysis including those with incomplete follow-up, in which the population prevalence of true hypertension was assumed to be 1% to 2%, resulted in a specificity of 74.5% to 75.3% and a PPV of 3.8% to 7.5%. The yield of measuring BP at triage in children with nonurgent problems appears to be extremely low.
NASA Astrophysics Data System (ADS)
Guimarães Nobre, Gabriela; Arnbjerg-Nielsen, Karsten; Rosbjerg, Dan; Madsen, Henrik
2016-04-01
Traditionally, flood risk assessment studies have been carried out from a univariate frequency analysis perspective. However, statistical dependence between hydrological variables, such as extreme rainfall and extreme sea surge, plausibly exists, since both variables are to some extent driven by common meteorological conditions. Aiming to overcome this limitation, multivariate statistical techniques have the potential to combine different sources of flooding in the investigation. The aim of this study was to apply a range of statistical methodologies for analyzing combined extreme hydrological variables that can lead to coastal and urban flooding. The study area is the Elwood Catchment, a highly urbanized catchment located in the city of Port Phillip, Melbourne, Australia. The first part of the investigation dealt with the marginal extreme value distributions. Two approaches to extracting extreme value series were applied (Annual Maximum and Partial Duration Series), and different probability distribution functions were fit to the observed samples. Results obtained by using the Generalized Pareto distribution demonstrate the ability of the Pareto family to model the extreme events. Advancing into multivariate extreme value analysis, an investigation regarding the asymptotic properties of extremal dependence was first carried out. As a weak positive asymptotic dependence between the bivariate extreme pairs was found, the Conditional method proposed by Heffernan and Tawn (2004) was chosen. This approach is suitable for modelling bivariate extreme values that are relatively unlikely to occur together. The results show that the probability of an extreme sea surge occurring during a one-hour intensity extreme precipitation event (or vice versa) can be twice as great as that estimated under the assumption of independence. Presuming independence between these two variables would therefore result in severe underestimation of the flooding risk in the study area.
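The asymptotic-dependence check described above can be sketched with the empirical χ(u) statistic, the conditional probability that one variable is extreme given the other is; the bivariate Gaussian sample below is an assumption, chosen as the textbook case that is correlated yet asymptotically independent:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 50_000

# Correlated Gaussian pairs: positively associated, asymptotically independent
z = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.6], [0.6, 1.0]], n)

u = 0.99
q0 = np.quantile(z[:, 0], u)
q1 = np.quantile(z[:, 1], u)

# Empirical chi(u) ~ P(Y extreme | X extreme); for asymptotically independent
# pairs this tends to 0 as u -> 1, but remains well above the independent value
chi_u = np.mean((z[:, 0] > q0) & (z[:, 1] > q1)) / (1.0 - u)
p_indep = 1.0 - u   # what independence would predict for the conditional prob
print(chi_u, p_indep)
```

The gap between chi_u and p_indep is exactly why assuming independence between surge and rainfall extremes underestimates joint flooding risk, even when the pair is only weakly asymptotically dependent.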
NASA Astrophysics Data System (ADS)
Rieder, H. E.; Staehelin, J.; Maeder, J. A.; Peter, T.; Ribatet, M.; Davison, A. C.; Stübi, R.; Weihs, P.; Holawe, F.
2010-10-01
In this study the frequency of days with extreme low (termed ELOs) and extreme high (termed EHOs) total ozone values and their influence on mean values and trends are analyzed for the world's longest total ozone record (Arosa, Switzerland). The results show (i) an increase in ELOs and (ii) a decrease in EHOs during the last decades and (iii) that the overall trend during the 1970s and 1980s in total ozone is strongly dominated by changes in these extreme events. After removing the extremes, the time series shows a strongly reduced trend (reduction by a factor of 2.5 for trend in annual mean). Excursions in the frequency of extreme events reveal "fingerprints" of dynamical factors such as ENSO or NAO, and chemical factors, such as cold Arctic vortex ozone losses, as well as major volcanic eruptions of the 20th century (Gunung Agung, El Chichón, Mt. Pinatubo). Furthermore, atmospheric loading of ozone depleting substances leads to a continuous modification of column ozone in the Northern Hemisphere also with respect to extreme values (partly again in connection with polar vortex contributions). Application of extreme value theory allows the identification of many more such "fingerprints" than conventional time series analysis of annual and seasonal mean values. The analysis shows in particular the strong influence of dynamics, revealing that even moderate ENSO and NAO events have a discernible effect on total ozone. Overall the approach to extremal modelling provides new information on time series properties, variability, trends and the influence of dynamics and chemistry, complementing earlier analyses focusing only on monthly (or annual) mean values.
Rainfall extremes from TRMM data and the Metastatistical Extreme Value Distribution
NASA Astrophysics Data System (ADS)
Zorzetto, Enrico; Marani, Marco
2017-04-01
A reliable quantification of the probability of weather extremes occurrence is essential for designing resilient water infrastructures and hazard mitigation measures. However, it is increasingly clear that the presence of inter-annual climatic fluctuations determines a substantial long-term variability in the frequency of occurrence of extreme events. This circumstance questions the foundation of the traditional extreme value theory, hinged on stationary Poisson processes or on asymptotic assumptions to derive the Generalized Extreme Value (GEV) distribution. We illustrate here, with application to daily rainfall, a new approach to extreme value analysis, the Metastatistical Extreme Value Distribution (MEVD). The MEVD relaxes the above assumptions and is based on the whole distribution of daily rainfall events, thus allowing optimal use of all available observations. Using a global dataset of rain gauge observations, we show that the MEVD significantly outperforms the Generalized Extreme Value distribution, particularly for long average recurrence intervals and when small samples are available. The latter property suggests MEVD to be particularly suited for applications to satellite rainfall estimates, which only cover two decades, thus making extreme value estimation extremely challenging. Here we apply MEVD to the TRMM TMPA 3B42 product, an 18-year dataset of remotely-sensed daily rainfall providing a quasi-global coverage. Our analyses yield a global scale mapping of daily rainfall extremes and of their distributional tail properties, bridging the existing large gaps in ground-based networks. Finally, we illustrate how our global-scale analysis can provide insight into how properties of local rainfall regimes affect tail estimation uncertainty when using the GEV or MEVD approach. We find a dependence of the estimation uncertainty, for both the GEV- and MEV-based approaches, on the average annual number and on the inter-annual variability of rainy days. 
In particular, estimation uncertainty decreases 1) as the mean annual number of wet days increases, and 2) as the variability in the number of rainy days, expressed by its coefficient of variation, decreases. We tentatively explain this behavior in terms of the assumptions underlying the two approaches.
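A compact sketch of the MEVD idea follows, assuming synthetic Weibull wet-day rainfall and per-year Weibull fits (all parameters are illustrative, not TRMM values): the annual-maximum CDF is the average over years of F(x)^n_j, and return values follow by root finding.

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import weibull_min

rng = np.random.default_rng(3)

# Per-year Weibull fits to synthetic wet-day rainfall (parameters are made up)
yearly = []
for _ in range(20):
    n_wet = rng.poisson(90)                         # wet days in the year
    daily = weibull_min.rvs(0.8, scale=10.0, size=n_wet, random_state=rng)
    c, _, scale = weibull_min.fit(daily, floc=0.0)
    yearly.append((c, scale, n_wet))

def mev_cdf(x):
    # MEVD: annual-maximum CDF as the average over years of F(x)^n_j
    return np.mean([weibull_min.cdf(x, c, scale=s) ** n for c, s, n in yearly])

# 50-year return value: solve mev_cdf(x) = 1 - 1/50
rv50 = brentq(lambda x: mev_cdf(x) - (1.0 - 1.0 / 50.0), 1.0, 1000.0)
print(rv50)
```

Because every wet-day observation informs the per-year fits, the estimator uses far more data than a block-maxima GEV fit, which is the source of the small-sample advantage the abstract reports.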
Nonstationary Extreme Value Analysis in a Changing Climate: A Software Package
NASA Astrophysics Data System (ADS)
Cheng, L.; AghaKouchak, A.; Gilleland, E.
2013-12-01
Numerous studies show that climatic extremes have increased substantially in the second half of the 20th century. For this reason, analysis of extremes under a nonstationary assumption has received a great deal of attention. This paper presents a software package developed for estimation of return levels, return periods, and risks of climatic extremes in a changing climate. This MATLAB software package offers tools for analysis of climate extremes under both stationary and non-stationary assumptions. The Nonstationary Extreme Value Analysis (hereafter, NEVA) provides an efficient and generalized framework for analyzing extremes using Bayesian inference. NEVA estimates the extreme value parameters using a Differential Evolution Markov Chain (DE-MC), which utilizes the genetic algorithm Differential Evolution (DE) for global optimization over the real parameter space with the Markov Chain Monte Carlo (MCMC) approach and has the advantage of simplicity, speed of calculation, and convergence over conventional MCMC. NEVA also offers confidence intervals and uncertainty bounds of estimated return levels based on the sampled parameters. NEVA integrates extreme value design concepts, data analysis tools, optimization, and visualization, explicitly designed to facilitate analysis of extremes in geosciences. The generalized input and output files of this software package make it attractive for users from across different fields. Both stationary and nonstationary components of the package are validated for a number of case studies using empirical return levels. The results show that NEVA reliably describes extremes and their return levels.
Robust, nonlinear, high angle-of-attack control design for a supermaneuverable vehicle
NASA Technical Reports Server (NTRS)
Adams, Richard J.
1993-01-01
High angle-of-attack flight control laws are developed for a supermaneuverable fighter aircraft. The methods of dynamic inversion and structured singular value synthesis are combined into an approach which addresses both the nonlinearity and robustness problems of flight at extreme operating conditions. The primary purpose of the dynamic inversion control elements is to linearize the vehicle response across the flight envelope. Structured singular value synthesis is used to design a dynamic controller which provides robust tracking to pilot commands. The resulting control system achieves desired flying qualities and guarantees a large margin of robustness to uncertainties for high angle-of-attack flight conditions. The results of linear simulation and structured singular value stability analysis are presented to demonstrate satisfaction of the design criteria. High fidelity nonlinear simulation results show that the combined dynamics inversion/structured singular value synthesis control law achieves a high level of performance in a realistic environment.
Ethical aspects of personality disorders.
Bendelow, Gillian
2010-11-01
To review recent literature around the controversial diagnosis of personality disorder, and to assess the ethical aspects of its status as a medical disorder. The diagnostic currency of personality disorder as a psychiatric/medical disorder has a longstanding history of ethical and social challenges through critiques of the medicalization of deviance. More recently controversies by reflexive physicians around the inclusion of the category in the forthcoming revisions of International Classification of Diseases and Diagnostic and Statistical Manual of Mental Disorders classifications reflect the problems of value-laden criteria, with the diagnostic category being severely challenged from within psychiatry as well as from without. The clinical diagnostic criteria for extremely value-laden psychiatric conditions such as personality disorder need to be analyzed through the lens of values-based medicine, as well as through clinical evidence, as the propensity for political and sociolegal appropriation of the categories can render their clinical and diagnostic value meaningless.
Bivariate extreme value distributions
NASA Technical Reports Server (NTRS)
Elshamy, M.
1992-01-01
In certain engineering applications, such as those occurring in the analyses of ascent structural loads for the Space Transportation System (STS), some of the load variables have a lower bound of zero. Thus, the need for practical models of bivariate extreme value probability distribution functions with lower limits was identified. We discuss the Gumbel models and present practical forms of bivariate extreme probability distributions of Weibull and Fréchet types with two parameters. Bivariate extreme value probability distribution functions can be expressed in terms of the marginal extremal distributions and a 'dependence' function subject to certain analytical conditions. Properties of such bivariate extreme distributions, sums and differences of paired extremals, as well as the corresponding forms of conditional distributions, are discussed. Practical estimation techniques are also given.
Escarela, Gabriel
2012-06-01
The occurrence of high concentrations of tropospheric ozone is considered one of the most important issues in air management programs. The prediction of ozone levels dangerous to public health and the environment, along with the assessment of air quality control programs aimed at reducing their severity, is of considerable interest to the scientific community and to policy makers. The chemical mechanisms of tropospheric ozone formation are complex, and highly variable meteorological conditions contribute additionally to difficulties in accurate study and prediction of high levels of ozone. Statistical methods offer an effective approach to understand the problem and eventually improve the ability to predict maximum levels of ozone. In this paper an extreme value model is developed to study data sets that consist of periodically collected maxima of tropospheric ozone concentrations and meteorological variables. The methods are applied to daily tropospheric ozone maxima in Guadalajara City, Mexico, for the period January 1997 to December 2006. The model adjusts the daily rate of change in ozone for concurrent impacts of seasonality and present and past meteorological conditions, which include surface temperature, wind speed, wind direction, relative humidity, and ozone. The results indicate that trend, annual effects, and key meteorological variables along with some interactions explain the variation in daily ozone maxima. Prediction performance assessments yield reasonably good results.
Optimal analytic method for the nonlinear Hasegawa-Mima equation
NASA Astrophysics Data System (ADS)
Baxter, Mathew; Van Gorder, Robert A.; Vajravelu, Kuppalapalle
2014-05-01
The Hasegawa-Mima equation is a nonlinear partial differential equation that describes the electric potential due to a drift wave in a plasma. In the present paper, we apply the method of homotopy analysis to a slightly more general Hasegawa-Mima equation, which accounts for hyper-viscous damping or viscous dissipation. First, we outline the method for the general initial/boundary value problem over a compact rectangular spatial domain. We use a two-stage method, where both the convergence control parameter and the auxiliary linear operator are optimally selected to minimize the residual error due to the approximation. To do the latter, we consider a family of operators parameterized by a constant which gives the decay rate of the solutions. After outlining the general method, we consider a number of concrete examples in order to demonstrate the utility of this approach. The results enable us to study properties of the initial/boundary value problem for the generalized Hasegawa-Mima equation. In several cases considered, we are able to obtain solutions with extremely small residual errors after relatively few iterations are computed (residual errors on the order of 10^-15 are found in multiple cases after only three iterations). The results demonstrate that selecting a parameterized auxiliary linear operator can be extremely useful for minimizing residual errors when used concurrently with the optimal homotopy analysis method, suggesting that this approach can prove useful for a number of nonlinear partial differential equations arising in physics and nonlinear mechanics.
Future Projection of Summer Extreme Precipitation from High Resolution Multi-RCMs over East Asia
NASA Astrophysics Data System (ADS)
Kim, Gayoung; Park, Changyong; Cha, Dong-Hyun; Lee, Dong-Kyou; Suh, Myoung-Seok; Ahn, Joong-Bae; Min, Seung-Ki; Hong, Song-You; Kang, Hyun-Suk
2017-04-01
Recently, the frequency and intensity of natural hazards have been increasing due to human-induced climate change. Because most damage from natural hazards over East Asia has been related to extreme precipitation events, it is important to estimate future changes in extreme precipitation characteristics caused by climate change. We investigate future changes in extremal values of summer precipitation simulated by five regional climate models participating in the CORDEX-East Asia project (i.e., HadGEM3-RA, RegCM4, MM5, WRF, and GRIMs) over East Asia. The 100-year return value calculated from the generalized extreme value (GEV) parameters is analysed as an indicator of extreme intensity. In the future climate, the mean values as well as the extreme values of daily precipitation tend to increase over land regions. The increase of the 100-year return value can be significantly associated with changes in the location (intensity) and scale (variability) parameters of the GEV distribution for extreme precipitation. It is expected that the results of this study can serve as useful references when formulating disaster-management policy. Acknowledgements: The research was supported by the Ministry of Public Safety and Security of the Korean government under grant MPSS-NH-2013-63 and by the National Research Foundation of Korea grant funded by the Ministry of Science, ICT and Future Planning of Korea (NRF-2016M3C4A7952637).
Extremely cold events and sudden air temperature drops during winter season in the Czech Republic
NASA Astrophysics Data System (ADS)
Crhová, Lenka; Valeriánová, Anna; Holtanová, Eva; Müller, Miloslav; Kašpar, Marek; Stříž, Martin
2014-05-01
Today, great attention is devoted to the analysis of extreme weather events and the frequency of their occurrence under a changing climate. In most cases, these studies focus on extremely warm events in the summer season. However, extremely low air temperatures during winter can have serious impacts on many sectors as well (e.g. power engineering, transportation, industry, agriculture, human health). Therefore, in the present contribution we focus on extremely and abnormally cold air temperature events in the winter season in the Czech Republic. Besides the seasonal extremes of minimum air temperature determined from station data, standardized data with the annual cycle removed are used as well. The distribution of extremely cold events over the season and the temporal evolution of their frequency of occurrence during the period 1961-2010 are analyzed. Furthermore, the connection of cold events with extreme sudden temperature drops is studied. The extreme air temperature events and events of extreme sudden temperature drop are assessed using the Weather Extremity Index, which evaluates the extremity (based on return periods) and spatial extent of the meteorological extreme event of interest. The generalized extreme value distribution parameters are used to estimate return periods of daily temperature values. The work has been supported by the grant P209/11/1990 funded by the Czech Science Foundation.
Spatiotemporal variability of extreme temperature frequency and amplitude in China
NASA Astrophysics Data System (ADS)
Zhang, Yuanjie; Gao, Zhiqiu; Pan, Zaitao; Li, Dan; Huang, Xinhui
2017-03-01
Temperature extremes in China are examined based on daily maximum and minimum temperatures from station observations and multiple global climate models. The magnitude and frequency of extremes are expressed in terms of return values and periods, respectively, estimated by the fitted Generalized Extreme Value (GEV) distribution of annual extreme temperatures. The observations suggest that changes in temperature extremes considerably exceed changes in the respective climatological means during the past five decades, with greater amplitude of increases in cold extremes than in warm extremes. The frequency of warm (cold) extremes increases (decreases) over most areas, with an increasingly faster rate as the extremity level rises. Changes in warm extremes are more dependent on the varying shape of GEV distribution than the location shift, whereas changes in cold extremes are more closely associated with the location shift. The models simulate the overall pattern of temperature extremes during 1961-1981 reasonably well in China, but they show a smaller asymmetry between changes in warm and cold extremes primarily due to their underestimation of increases in cold extremes especially over southern China. Projections from a high emission scenario show the multi-model median change in warm and cold extremes by 2040 relative to 1971 will be 2.6 °C and 2.8 °C, respectively, with the strongest changes in cold extremes shifting southward. By 2040, warm extremes at the 1971 20-year return values would occur about every three years, while the 1971 cold extremes would occur once in > 500 years.
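The return-value/return-period bookkeeping used in this and several of the preceding abstracts can be sketched with scipy's GEV (note that scipy's shape parameter c is the negative of the ξ convention common in climatology); the synthetic annual maxima below are illustrative assumptions, not station data:

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(7)

# Synthetic annual-maximum temperatures in deg C (illustrative, not observations)
ann_max = genextreme.rvs(0.1, loc=35.0, scale=1.5, size=50, random_state=rng)

c, loc, scale = genextreme.fit(ann_max)

# T-year return value: the quantile with annual exceedance probability 1/T
T = 20
rv20 = genextreme.ppf(1.0 - 1.0 / T, c, loc=loc, scale=scale)

# Return period of a fixed level x: 1 / P(annual max > x)
x = 38.0
period = 1.0 / (1.0 - genextreme.cdf(x, c, loc=loc, scale=scale))
print(rv20, period)
```

Evaluating the fitted model both ways, level for a given period and period for a given level, is exactly the calculation behind statements like "the 1971 20-year return values would occur about every three years" in the abstract above.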
Identifying and Clarifying Organizational Values.
ERIC Educational Resources Information Center
Seevers, Brenda S.
2000-01-01
Of the 14 organizational values ranked by a majority of 146 New Mexico Cooperative Extension educators as extremely valued, 9 were extremely evident in organizational policies and procedures. A values audit such as this forms an important initial step in strategic planning. (SK)
Applications of Extreme Value Theory in Public Health.
Thomas, Maud; Lemaitre, Magali; Wilson, Mark L; Viboud, Cécile; Yordanov, Youri; Wackernagel, Hans; Carrat, Fabrice
2016-01-01
We present how Extreme Value Theory (EVT) can be used in public health to predict future extreme events. We applied EVT to weekly rates of Pneumonia and Influenza (P&I) deaths over 1979-2011. We further explored the daily number of emergency department visits in a network of 37 hospitals over 2004-2014. Maxima of grouped consecutive observations were fitted to a generalized extreme value distribution. The distribution was used to estimate the probability of extreme values in specified time periods. An annual P&I death rate of 12 per 100,000 (the highest maximum observed) should be exceeded about once over the next 30 years; in any given year, there is a 3% risk that the P&I death rate will exceed this value. Over the past 10 years, the observed maximum increase in the daily number of visits from the same weekday between two consecutive weeks was 1133. We estimated the probability of exceeding a daily increase of 1000 in a given month at 0.37%. The EVT method can be applied to various topics in epidemiology, thus contributing to public health planning for extreme events.
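The two headline risk figures above are mutually consistent, as a quick arithmetic check shows (numbers taken from the abstract; treating years as independent is our simplifying assumption):

```python
p_annual = 0.03   # annual risk that the P&I death rate exceeds 12 per 100,000
years = 30

# Expected number of exceedances over 30 years: about one
expected_exceedances = years * p_annual

# Chance of at least one exceedance, assuming independent years
p_at_least_one = 1 - (1 - p_annual) ** years
```

A 3% annual risk thus gives an expected count of 0.9 exceedances in 30 years, i.e. roughly one, matching the abstract's statement.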
NASA Technical Reports Server (NTRS)
Landis, J.; Leid, Terry; Garber, A.; Lee, M.
1994-01-01
This paper characterizes and analyzes the spectral response of Ball Aerospace fixed-head star trackers (FHSTs) currently in use on some three-axis stabilized spacecraft. The FHST output is a function of the frequency and intensity of the incident light and the position of the star image in the field of view. The FHSTs on board the Extreme Ultraviolet Explorer (EUVE) have had occasional problems identifying stars with a high B-V value. These problems are characterized by inaccurate intensity counts observed by the tracker. The inaccuracies are due to errors in the observed star magnitude values. These errors are unique to each individual FHST. For this reason, data were also collected and analyzed from the Upper Atmosphere Research Satellite (UARS). As a consequence of this work, the Goddard Space Flight Center (GSFC) Flight Dynamics Division (FDD) hopes to improve the attitude accuracy on these missions and to adopt better star selection procedures for catalogs.
Pulling adsorbed polymers at an angle: A low temperature theory
NASA Astrophysics Data System (ADS)
Iliev, Gerasim; Whittington, Stuart
2012-02-01
We consider several partially directed walk models in two and three dimensions to study the problem of a homopolymer interacting with a surface while subject to a force at the terminal monomer. The force is applied with a component parallel to the surface as well as a component perpendicular to the surface. Depending on the relative values of the force in each direction, the force can either enhance the adsorption transition or lead to desorption of an adsorbed polymer. For each model, we determine the associated generating function and extract the phase diagram, identifying states where the polymer is thermally desorbed, adsorbed, and under the influence of the force. We note the different regimes that appear in the problem and provide a low temperature approximation to describe them. The approximation is exact at T=0 and models the exact results extremely well for small values of T. This work is an extension of a model considered by S. Whittington and E. Orlandini.
Captive breeding of pangolins: current status, problems and future prospects
Hua, Liushuai; Gong, Shiping; Wang, Fumin; Li, Weiye; Ge, Yan; Li, Xiaonan; Hou, Fanghui
2015-01-01
Abstract Pangolins are unique placental mammals, with eight species existing in the world, which have adapted to a highly specialized diet of ants and termites and are significant in the control of forest termite infestations. Besides their ecological value, pangolins are extremely important economic animals, valued as medicine and food. At present, illegal hunting and habitat destruction have drastically decreased the wild population of pangolins, pushing them to the edge of extinction. Captive breeding is an important way to protect these species, but because of pangolins’ specialized behaviors and high dependence on natural ecosystems, many technical barriers to successful captive breeding programs remain. In this paper, based on the literature and our practical experience, we review the status of and existing problems in captive breeding of pangolins in four aspects: naturalistic habitat, dietary husbandry, reproduction, and disease control. Some recommendations are presented for effective captive breeding and protection of pangolins. PMID:26155072
Complex extreme learning machine applications in terahertz pulsed signals feature sets.
Yin, X-X; Hadjiloucas, S; Zhang, Y
2014-11-01
This paper presents a novel approach to the automatic classification of very large data sets composed of terahertz pulse transient signals, highlighting their potential use in biochemical, biomedical, pharmaceutical and security applications. Two different types of THz spectra are considered in the classification process. First, a binary classification study of poly-A and poly-C ribonucleic acid samples is performed. This is then contrasted with a difficult multi-class classification problem of spectra from six different powder samples that, although fairly indistinguishable in the optical spectrum, possess a few discernible spectral features in the terahertz part of the spectrum. Classification is performed using a complex-valued extreme learning machine algorithm that takes into account features in both the amplitude and the phase of the recorded spectra. Classification speed and accuracy are contrasted with those achieved using a support vector machine classifier. The study systematically compares the classifier performance achieved after adopting different Gaussian kernels when separating amplitude and phase signatures. The two signatures are presented as feature vectors for both training and testing purposes. The study confirms the utility of complex-valued extreme learning machine algorithms for classification of the very large data sets generated with current terahertz imaging spectrometers. The classifier can take into consideration heterogeneous layers within an object, as would be required within a tomographic setting, and is sufficiently robust to detect patterns hidden inside noisy terahertz data sets. The proposed study opens up the opportunity for the establishment of complex-valued extreme learning machine algorithms as new chemometric tools that will assist the wider proliferation of terahertz sensing technology for chemical sensing, quality control, security screening and clinical diagnosis.
Furthermore, the proposed algorithm should also be very useful in other applications requiring the classification of very large datasets. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
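A minimal real-valued ELM sketch on toy data illustrates the core idea, random fixed hidden weights with output weights solved by least squares; the paper's classifier is complex-valued and kernel-based, so this is a simplification, not its implementation:

```python
import numpy as np

rng = np.random.default_rng(1)

def elm_train(X, Y, n_hidden=50):
    """Basic (real-valued) extreme learning machine: hidden weights are
    random and fixed; only the output weights are solved, by least squares."""
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)                          # random hidden features
    beta, *_ = np.linalg.lstsq(H, Y, rcond=None)    # output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy binary problem: two well-separated Gaussian blobs
X = np.vstack([rng.normal(-1, 0.3, (100, 2)), rng.normal(1, 0.3, (100, 2))])
Y = np.array([-1.0] * 100 + [1.0] * 100)
W, b, beta = elm_train(X, Y)
accuracy = float(np.mean(np.sign(elm_predict(X, W, b, beta)) == Y))
```

Because no iterative training of the hidden layer is needed, training cost is dominated by one least-squares solve, which is what makes ELMs attractive for very large data sets.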
NASA Astrophysics Data System (ADS)
Smith, N.; Sandal, G. M.; Leon, G. R.; Kjærgaard, A.
2017-08-01
Land-based extreme environments (e.g. polar expeditions, Antarctic research stations, confinement chambers) have often been used as analog settings for spaceflight. These settings share similarities with the conditions experienced during space missions, including confinement, isolation and limited possibilities for evacuation. To determine the utility of analog settings for understanding human spaceflight, researchers have examined the extent to which the individual characteristics (e.g., personality) of people operating in extreme environments can be generalized across contexts (Sandal, 2000) [1]. Building on previous work, and utilising new and pre-existing data, the present study examined the extent to which personal value motives could be generalized across extreme environments. Four populations were assessed: mountaineers (N = 59), military personnel (N = 25), Antarctic over-winterers (N = 21) and Mars simulation participants (N = 12). All participants completed the Portrait Values Questionnaire (PVQ; Schwartz) [2], capturing information on 10 personal values. Rank scores suggest that all groups identified Self-direction, Stimulation, Universalism and Benevolence as important values and acknowledged Power and Tradition as being low priorities. Results from difference testing suggest the extreme environment groups were most comparable on Self-direction, Stimulation, Benevolence, Tradition and Security. There were significant between-group differences on five of the ten values. Overall, findings pinpointed specific values that may be important for functioning in challenging environments. However, the differences that emerged on certain values highlight the importance of considering the specific population when comparing results across extreme settings. We recommend that further research examine the impact of personal value motives on indicators of adjustment, group working, and performance. 
Information from such studies could then be used to aid selection and training processes for personnel operating in extreme settings, and in space.
Extreme event statistics in a drifting Markov chain
NASA Astrophysics Data System (ADS)
Kindermann, Farina; Hohmann, Michael; Lausch, Tobias; Mayer, Daniel; Schmidt, Felix; Widera, Artur
2017-07-01
We analyze extreme event statistics of experimentally realized Markov chains with various drifts. Our Markov chains are individual trajectories of a single atom diffusing in a one-dimensional periodic potential. Based on more than 500 individual atomic traces we verify the applicability of the Sparre Andersen theorem to our system despite the presence of a drift. We present detailed analysis of four different rare-event statistics for our system: the distributions of extreme values, of record values, of extreme value occurrence in the chain, and of the number of records in the chain. We observe that, for our data, the shape of the extreme event distributions is dominated by the underlying exponential distance distribution extracted from the atomic traces. Furthermore, we find that even small drifts influence the statistics of extreme events and record values, which is supported by numerical simulations, and we identify cases in which the drift can be determined without information about the underlying random variable distributions. Our results facilitate the use of extreme event statistics as a signal for small drifts in correlated trajectories.
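The Sparre Andersen theorem mentioned above states that, for a drift-free chain with continuous symmetric jumps, the probability of staying positive for n steps is distribution-free; a quick numerical check (the Cauchy jump distribution is an arbitrary choice, since any continuous symmetric law gives the same answer):

```python
import numpy as np
from math import comb

rng = np.random.default_rng(2)
n_steps, n_walks = 8, 200_000

# Symmetric, continuous, heavy-tailed jumps (any such distribution works)
jumps = rng.standard_cauchy((n_walks, n_steps))
walks = np.cumsum(jumps, axis=1)

# Fraction of walks whose partial sums all stay positive
p_sim = float(np.mean(np.all(walks > 0, axis=1)))

# Sparre Andersen: P(stay positive for n steps) = C(2n, n) / 4**n
p_theory = comb(2 * n_steps, n_steps) / 4 ** n_steps
```

With a drift added to the jumps, the simulated survival probability departs from this universal value, which is the kind of signature the experiment exploits.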
Risk assessment of precipitation extremes in northern Xinjiang, China
NASA Astrophysics Data System (ADS)
Yang, Jun; Pei, Ying; Zhang, Yanwei; Ge, Quansheng
2018-05-01
This study was conducted using daily precipitation records gathered at 37 meteorological stations in northern Xinjiang, China, from 1961 to 2010. We used extreme value theory models, the generalized extreme value (GEV) and generalized Pareto (GPD) distributions, to fit precipitation extremes for different return periods, estimate the risks of precipitation extremes, and diagnose aridity-humidity environmental variation and the corresponding spatial patterns in northern Xinjiang. Spatiotemporal patterns of daily maximum precipitation showed that the aridity-humidity conditions of northern Xinjiang are well represented by the return periods of the precipitation data. Indices of daily maximum precipitation were effective in the prediction of floods in the study area. By analyzing projections of daily maximum precipitation for different return periods (2, 5, 10, 30, 50, and 100 years), we conclude that the flood risk will gradually increase in northern Xinjiang. The GEV model yielded the best fit and proved the most valuable for characterizing extreme precipitation, while the GPD results better reflect annual precipitation. For the 2- and 5-year return periods at most sites, GPD estimates of precipitation levels were slightly greater than GEV estimates. The study found that extreme precipitation exceeding a certain limit value will cause a flood disaster; predicting future extreme precipitation may therefore aid flood disaster warnings. A suitable policy for effective water resource management is thus urgently required.
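A peaks-over-threshold GPD fit with return levels, in the spirit of this study, might look like the following sketch; the data, the 98% threshold, and the return periods are illustrative assumptions:

```python
import numpy as np
from scipy.stats import expon, genpareto

rng = np.random.default_rng(3)
years = 50
# Synthetic daily precipitation record (mm), exponential bulk as a stand-in
daily = expon.rvs(scale=8.0, size=365 * years, random_state=rng)

u = np.quantile(daily, 0.98)                     # POT threshold
excesses = daily[daily > u] - u
xi, _, sigma = genpareto.fit(excesses, floc=0)   # GPD fit to the excesses

lam = len(excesses) / years                      # mean exceedances per year

def return_level(T):
    """Level exceeded on average once every T years."""
    return u + genpareto.ppf(1 - 1 / (lam * T), xi, loc=0, scale=sigma)

rl2, rl100 = return_level(2), return_level(100)
```

The exceedance rate `lam` converts the per-event GPD quantile into a per-year return level, which is what a flood-risk map would report.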
The Erdős-Hajnal problem of hypergraph colouring, its generalizations, and related problems
NASA Astrophysics Data System (ADS)
Raigorodskii, Andrei M.; Shabanov, Dmitrii A.
2011-10-01
Extremal problems concerned with hypergraph colouring first arose in connection with classical investigations in the 1920-30s which gave rise to Ramsey theory. Since then, this area has assumed a central position in extremal combinatorics. This survey is devoted to one well-known problem of hypergraph colouring, the Erdős-Hajnal problem, initially posed in 1961. It opened a line of research in hypergraph theory whose methods and results are widely used in various domains of discrete mathematics. Bibliography: 109 titles.
NASA Astrophysics Data System (ADS)
Conrad, Jon M.
2000-01-01
Resource Economics is a text for students with a background in calculus, intermediate microeconomics, and a familiarity with the spreadsheet software Excel. The book covers basic concepts, shows how to set up spreadsheets to solve dynamic allocation problems, and presents economic models for fisheries, forestry, nonrenewable resources, stock pollutants, option value, and sustainable development. Within the text, numerical examples are posed and solved using Excel's Solver. These problems help make concepts operational, develop economic intuition, and serve as a bridge to the study of real-world problems of resource management. Through these examples and additional exercises at the end of Chapters 1 to 8, students can make dynamic models operational, develop their economic intuition, and learn how to set up spreadsheets for the simulation and optimization of resource and environmental systems. The book is unique in its use of spreadsheet software (Excel) to solve dynamic allocation problems. Conrad is co-author of a previous book for the Press on the subject for graduate students. The approach is extremely student-friendly, giving students the tools to apply research results to actual environmental issues.
NASA Astrophysics Data System (ADS)
Rieder, Harald E.; Staehelin, Johannes; Maeder, Jörg A.; Peter, Thomas; Ribatet, Mathieu; Davison, Anthony C.; Stübi, Rene; Weihs, Philipp; Holawe, Franz
2010-05-01
In this study tools from extreme value theory (e.g. Coles, 2001; Ribatet, 2007) are applied for the first time in the field of stratospheric ozone research, as statistical analysis showed that previously used concepts assuming a Gaussian distribution (e.g. fixed deviations from mean values) of total ozone data do not adequately address the internal data structure concerning extremes. The study illustrates that tools based on extreme value theory are appropriate to identify ozone extremes and to describe the tails of the world's longest total ozone record (Arosa, Switzerland - for details see Staehelin et al., 1998a,b) (Rieder et al., 2010a). A daily moving threshold was implemented for consideration of the seasonal cycle in total ozone. The frequency of days with extreme low (termed ELOs) and extreme high (termed EHOs) total ozone and their influence on mean values and trends are analyzed for the Arosa total ozone time series. The results show (a) an increase in ELOs and (b) a decrease in EHOs during the last decades and (c) that the overall trend during the 1970s and 1980s in total ozone is strongly dominated by changes in these extreme events. After removing the extremes, the time series shows a strongly reduced trend (reduction by a factor of 2.5 for the trend in the annual mean). Furthermore, it is shown that the fitted model represents the tails of the total ozone data set with very high accuracy over the entire range (including absolute monthly minima and maxima). Also the frequency distribution of ozone mini-holes (using constant thresholds) can be calculated with high accuracy. Analyzing the tails instead of a small fraction of days below constant thresholds provides deeper insight into time series properties. Excursions in the frequency of extreme events reveal "fingerprints" of dynamical factors such as ENSO or NAO, and chemical factors, such as cold Arctic vortex ozone losses, as well as major volcanic eruptions of the 20th century (e.g. 
Gunung Agung, El Chichón, Mt. Pinatubo). Furthermore, atmospheric loading of ozone-depleting substances leads to a continuous modification of column ozone in the northern hemisphere, also with respect to extreme values (partly again in connection with polar vortex contributions). It is shown that application of extreme value theory allows the identification of many more such fingerprints than conventional time series analysis of annual and seasonal mean values. Especially, the analysis shows the strong influence of dynamics, revealing that even moderate ENSO and NAO events have a discernible effect on total ozone (Rieder et al., 2010b). Overall the presented new extremes concept provides new information on time series properties, variability, trends and the influence of dynamics and chemistry, complementing earlier analyses focusing only on monthly (or annual) mean values. References: Coles, S.: An Introduction to Statistical Modeling of Extreme Values, Springer Series in Statistics, ISBN:1852334592, Springer, Berlin, 2001. Ribatet, M.: POT: Modelling peaks over a threshold, R News, 7, 34-36, 2007. Rieder, H.E., Staehelin, J., Maeder, J.A., Ribatet, M., Stübi, R., Weihs, P., Holawe, F., Peter, T., and A.D. Davison (2010): Extreme events in total ozone over Arosa - Part I: Application of extreme value theory, to be submitted to ACPD. Rieder, H.E., Staehelin, J., Maeder, J.A., Ribatet, M., Stübi, R., Weihs, P., Holawe, F., Peter, T., and A.D. Davison (2010): Extreme events in total ozone over Arosa - Part II: Fingerprints of atmospheric dynamics and chemistry and effects on mean values and long-term changes, to be submitted to ACPD. Staehelin, J., Renaud, A., Bader, J., McPeters, R., Viatte, P., Hoegger, B., Bugnion, V., Giroud, M., and Schill, H.: Total ozone series at Arosa (Switzerland): Homogenization and data comparison, J. Geophys. Res., 103(D5), 5827-5842, doi:10.1029/97JD02402, 1998a. Staehelin, J., Kegel, R., and Harris, N. R.: Trend analysis of the homogenized total ozone series of Arosa (Switzerland), 1929-1996, J. Geophys. Res., 103(D7), 8389-8400, doi:10.1029/97JD03650, 1998b.
NASA Technical Reports Server (NTRS)
Chao, Luen-Yuan; Shetty, Dinesh K.
1992-01-01
Statistical analysis and correlation between pore-size distribution and fracture strength distribution using the theory of extreme-value statistics is presented for a sintered silicon nitride. The pore-size distribution on a polished surface of this material was characterized using an automatic optical image analyzer. The distribution measured on the two-dimensional plane surface was transformed to a population (volume) distribution using the Schwartz-Saltykov diameter method. The population pore-size distribution and the distribution of the pore size at the fracture origin were correlated by extreme-value statistics. Fracture strength distribution was then predicted from the extreme-value pore-size distribution, using a linear elastic fracture mechanics model of an annular crack around a pore and the fracture toughness of the ceramic. The predicted strength distribution was in good agreement with strength measurements in bending. In particular, the extreme-value statistics analysis explained the nonlinear trend in the linearized Weibull plot of measured strengths without postulating a lower-bound strength.
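The strength-prediction step described above reduces to linear elastic fracture mechanics; a minimal sketch, with an assumed fracture toughness and geometry factor rather than the paper's measured values:

```python
import math

def fracture_strength(pore_diameter_m, K_Ic=5.0e6, Y=1.26):
    """LEFM estimate of strength controlled by an annular crack around a pore.
    K_Ic (Pa*sqrt(m)) and geometry factor Y are illustrative assumptions."""
    a = pore_diameter_m / 2.0       # crack size taken as the pore radius
    return K_Ic / (Y * math.sqrt(math.pi * a))

# The extreme (largest) pore controls strength: bigger pore, weaker specimen
s_small = fracture_strength(10e-6)   # 10 um pore
s_large = fracture_strength(50e-6)   # 50 um pore
```

Since strength scales as the inverse square root of the flaw size, it is the extreme-value tail of the pore-size population, not its mean, that sets the strength distribution.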
On alternative q-Weibull and q-extreme value distributions: Properties and applications
NASA Astrophysics Data System (ADS)
Zhang, Fode; Ng, Hon Keung Tony; Shi, Yimin
2018-01-01
Tsallis statistics and Tsallis distributions have been attracting a significant amount of research work in recent years. Importantly, Tsallis statistics and q-distributions have been applied in different disciplines. Yet, a relationship between some existing q-Weibull distributions and q-extreme value distributions that is parallel to the well-established relationship between the conventional Weibull and extreme value distributions through a logarithmic transformation has not been established. In this paper, we propose an alternative q-Weibull distribution that leads to a q-extreme value distribution via the q-logarithm transformation. Some important properties of the proposed q-Weibull and q-extreme value distributions are studied. Maximum likelihood and least squares estimation methods are used to estimate the parameters of the q-Weibull distribution, and their performances are investigated through a Monte Carlo simulation study. The methodologies and the usefulness of the proposed distributions are illustrated by fitting the 2014 traffic fatalities data from the National Highway Traffic Safety Administration.
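The q-logarithm transformation central to the paper, and its inverse, reduce to the ordinary logarithm and exponential as q → 1; minimal definitions of our own (not the paper's code), with the round-trip property checked on the valid support:

```python
import numpy as np

def q_log(x, q):
    """Tsallis q-logarithm; reduces to ln(x) as q -> 1."""
    if q == 1.0:
        return np.log(x)
    return (x ** (1.0 - q) - 1.0) / (1.0 - q)

def q_exp(x, q):
    """Tsallis q-exponential, inverse of q_log where 1 + (1-q)x > 0."""
    if q == 1.0:
        return np.exp(x)
    return (1.0 + (1.0 - q) * x) ** (1.0 / (1.0 - q))

# Round trip on the valid support for q = 1.5 (requires x < 2 here)
x = np.linspace(0.1, 1.9, 50)
roundtrip = q_log(q_exp(x, 1.5), 1.5)
```

This is the transformation that plays the role of the ordinary log in the classical Weibull-to-extreme-value correspondence.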
Minimizing metastatic risk in radiotherapy fractionation schedules
NASA Astrophysics Data System (ADS)
Badri, Hamidreza; Ramakrishnan, Jagdish; Leder, Kevin
2015-11-01
Metastasis is the process by which cells from a primary tumor disperse and form new tumors at distant anatomical locations. The treatment and prevention of metastatic cancer remains an extremely challenging problem. This work introduces a novel biologically motivated objective function to the radiation optimization community that takes into account metastatic risk instead of the status of the primary tumor. In this work, we consider the problem of developing fractionated irradiation schedules that minimize production of metastatic cancer cells while keeping normal tissue damage below an acceptable level. A dynamic programming framework is utilized to determine the optimal fractionation scheme. We evaluated our approach on a breast cancer case using the heart and the lung as organs-at-risk (OAR). For small tumor α/β values, hypo-fractionated schedules were optimal, which is consistent with standard models. However, for relatively larger α/β values, we found the type of schedule depended on various parameters such as the time when metastatic risk was evaluated, the α/β values of the OARs, and the normal tissue sparing factors. Interestingly, in contrast to standard models, hypo-fractionated and semi-hypo-fractionated schedules (large initial doses with doses tapering off with time) were suggested even with large tumor α/β values. Numerical results indicate the potential for significant reduction in metastatic risk.
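For context, the α/β discussion above rests on the linear-quadratic model; a minimal sketch of the standard biologically effective dose (BED) formula, with illustrative doses and α/β values of our own choosing (this is the textbook quantity, not the paper's metastasis-aware objective):

```python
def bed(n, d, alpha_beta):
    """Biologically effective dose (Gy) of n fractions of d Gy each,
    under the standard linear-quadratic model."""
    return n * d * (1 + d / alpha_beta)

# Same 60 Gy physical dose, conventional vs hypo-fractionated (alpha/beta = 10)
conventional = bed(n=30, d=2.0, alpha_beta=10.0)   # 30 fractions of 2 Gy
hypo = bed(n=15, d=4.0, alpha_beta=10.0)           # 15 fractions of 4 Gy
```

Repeating the comparison with a small α/β widens the gap between the two schedules, which is why the α/β values of the tumor and the organs-at-risk drive which fractionation is optimal.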
Extremal Optimization: Methods Derived from Co-Evolution
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boettcher, S.; Percus, A.G.
1999-07-13
We describe a general-purpose method for finding high-quality solutions to hard optimization problems, inspired by self-organized critical models of co-evolution such as the Bak-Sneppen model. The method, called Extremal Optimization, successively eliminates extremely undesirable components of sub-optimal solutions, rather than "breeding" better components. In contrast to Genetic Algorithms, which operate on an entire "gene-pool" of possible solutions, Extremal Optimization improves on a single candidate solution by treating each of its components as species co-evolving according to Darwinian principles. Unlike Simulated Annealing, its non-equilibrium approach effects an algorithm requiring few parameters to tune. With only one adjustable parameter, its performance proves competitive with, and often superior to, more elaborate stochastic optimization procedures. We demonstrate it here on two classic hard optimization problems: graph partitioning and the traveling salesman problem.
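A toy τ-EO loop makes the single-parameter, rank-based update concrete; MAX-CUT is our illustrative stand-in for the graph problems the paper demonstrates, and all parameter values are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(4)

# Random graph for a toy MAX-CUT instance
n = 40
A = np.triu(rng.random((n, n)) < 0.2, 1)
A = (A | A.T).astype(int)
deg = np.maximum(A.sum(axis=1), 1)

def cut_size(s):
    return int(np.sum(A[np.ix_(s == 1, s == 0)]))

# Rank-selection law: pick fitness rank k with P(k) ~ k**(-tau)
tau = 1.4
p_rank = np.arange(1, n + 1, dtype=float) ** -tau
p_rank /= p_rank.sum()

s = rng.integers(0, 2, n)
best_cut = cut_size(s)
for _ in range(2000):
    # fitness of node i = fraction of its edges already crossing the cut
    cross = np.array([A[i] @ (s != s[i]) for i in range(n)])
    worst_first = np.argsort(cross / deg)
    flip = worst_first[rng.choice(n, p=p_rank)]
    s[flip] ^= 1                  # unconditional move, no temperature
    best_cut = max(best_cut, cut_size(s))
```

Note the contrast with simulated annealing: there is no acceptance criterion and no cooling schedule, only the power-law rank selection governed by the single parameter τ.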
Learning Problems in Kindergarten Students with Extremely Preterm Birth
Taylor, H. Gerry; Klein, Nancy; Anselmo, Marcia G.; Minich, Nori; Espy, Kimberly A.; Hack, Maureen
2012-01-01
Objective To assess learning problems in extremely preterm children in kindergarten and identify risk factors. Design Cohort study. Setting Children’s hospital. Participants A cohort of extremely preterm children born January 2001 – December 2003 (n=148), defined as <28 weeks gestation and/or <1000 g birth weight, and term-born normal birth weight classmate controls (n=111). Main Interventions The children were enrolled during their first year in kindergarten and assessed on measures of learning progress. Main Outcome Measures Achievement testing, teacher ratings of learning progress, and individual educational assistance. Results The extremely preterm children had lower mean standard scores than controls on tests of spelling (8.52 points, 95% CI: 4.58, 12.46) and applied mathematics (11.02 points, 95% CI: 6.76, 15.28). They also had higher rates of substandard learning progress by teacher report in written language (OR = 4.23, 95% CI: 2.32, 7.73) and mathematics (OR = 7.08, 95% CI: 2.79, 17.95). Group differences on mathematics achievement and in teacher ratings of learning progress were significant even in children without neurosensory deficits or low global cognitive ability. Neonatal risk factors, early childhood neurodevelopmental impairment, and socioeconomic status predicted learning problems in extremely preterm children, yet many of the children with problems were not in a special education program. Conclusion Learning problems in extremely preterm children are evident in kindergarten and are associated with neonatal and early childhood risk factors. The findings support efforts to provide more extensive monitoring and interventions both prior to and during the first year in school. PMID:21893648
NASA Astrophysics Data System (ADS)
Lazoglou, Georgia; Anagnostopoulou, Christina; Tolika, Konstantia; Kolyva-Machera, Fotini
2018-04-01
The increasing trend in the intensity and frequency of temperature and precipitation extremes during the past decades has substantial environmental and socioeconomic impacts. Thus, the objective of the present study is the comparison of several statistical methods of extreme value theory (EVT) in order to identify which is the most appropriate to analyze the behavior of extreme precipitation and high and low temperature events in the Mediterranean region. Extremes were selected using both the block maxima and the peaks-over-threshold (POT) techniques, and consequently both the generalized extreme value (GEV) and generalized Pareto distributions (GPDs) were used to fit them. The results were compared in order to select the most appropriate distribution for characterizing the extremes. Moreover, this study evaluates the maximum likelihood estimation, L-moments, and Bayesian methods, based on both graphical and statistical goodness-of-fit tests. It was revealed that the GPD can accurately characterize both precipitation and temperature extreme events. Additionally, the GEV distribution with the Bayesian method proves appropriate, especially for the greatest values of the extremes. Another important objective of this investigation was the estimation of precipitation and temperature return levels for three return periods (50, 100, and 150 years), classifying the data into groups with similar characteristics. Finally, the return level values were estimated with both the GEV and GPD and with the three different estimation methods, revealing that the selected method can affect the return level values for both precipitation and temperature.
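The two extreme-selection techniques compared above, block maxima and peaks over threshold, can be sketched on a synthetic daily series (hypothetical data; the 99% threshold is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(5)
daily = rng.gamma(shape=0.8, scale=6.0, size=365 * 30)   # 30-year toy record

# Block maxima: one maximum per year-long block (input to a GEV fit)
block_maxima = daily.reshape(30, 365).max(axis=1)

# Peaks over threshold: all excesses above a high quantile (input to a GPD fit)
u = np.quantile(daily, 0.99)
excesses = daily[daily > u] - u
```

POT retains many more extreme observations per record than block maxima (here roughly a hundred excesses versus thirty annual maxima), which is one reason the two approaches can yield different return-level estimates.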
NASA Astrophysics Data System (ADS)
Wen, Xian-Huan; Gómez-Hernández, J. Jaime
1998-03-01
The macrodispersion of an inert solute in a 2-D heterogeneous porous medium is estimated numerically in a series of fields of varying heterogeneity. Four different random function (RF) models are used to model log-transmissivity (ln T) spatial variability, and for each of these models, ln T variance is varied from 0.1 to 2.0. The four RF models share the same univariate Gaussian histogram and the same isotropic covariance, but differ from one another in terms of the spatial connectivity patterns at extreme transmissivity values. More specifically, model A is a multivariate Gaussian model for which, by definition, extreme values (both high and low) are spatially uncorrelated. The other three models are non-multi-Gaussian: model B with high connectivity of high extreme values, model C with high connectivity of low extreme values, and model D with high connectivities of both high and low extreme values. Residence time distributions (RTDs) and macrodispersivities (longitudinal and transverse) are computed on ln T fields corresponding to the different RF models, for two different flow directions and at several scales. They are compared with each other, as well as with predicted values based on first-order analytical results. Numerically derived RTDs and macrodispersivities for the multi-Gaussian model are in good agreement with analytically derived values using first-order theories for log-transmissivity variance up to 2.0. The results from the non-multi-Gaussian models differ from each other and deviate largely from the multi-Gaussian results even when ln T variance is small. RTDs in non-multi-Gaussian realizations with high connectivity at high extreme values display earlier breakthrough than in multi-Gaussian realizations, whereas later breakthrough and longer tails are observed for RTDs from non-multi-Gaussian realizations with high connectivity at low extreme values. 
Longitudinal macrodispersivities in the non-multi-Gaussian realizations are, in general, larger than in the multi-Gaussian ones, while transverse macrodispersivities in the non-multi-Gaussian realizations can be larger or smaller than in the multi-Gaussian ones depending on the type of connectivity at extreme values. Comparing the numerical results for different flow directions, it is confirmed that macrodispersivities in multi-Gaussian realizations with isotropic spatial correlation are not flow direction-dependent. Macrodispersivities in the non-multi-Gaussian realizations, however, are flow direction-dependent although the covariance of ln T is isotropic (the same for all four models). It is important to account for high connectivities at extreme transmissivity values, a likely situation in some geological formations. Some of the discrepancies between first-order-based analytical results and field-scale tracer test data may be due to the existence of highly connected paths of extreme conductivity values.
Are historical values of ionospheric parameters from ionosondes overestimated?
NASA Astrophysics Data System (ADS)
Laštovička, J.; Koucká Knížová, P.; Kouba, D.
2012-04-01
Ionogram-scaled values from pre-digital ionosonde times were derived from ionograms under the assumption of vertical reflection of the ordinary mode of the sounding radio waves. Classical ionosondes were unable to distinguish between vertical and oblique reflections and, in the case of the Es layer, between ordinary- and extraordinary-mode reflections, owing to mirror-like reflections. Modern digisondes, however, clearly identify oblique or extraordinary-mode reflections. Evaluating the Pruhonice digisonde ionograms in both the "classical" and the "correct" way for seven summers (2004-2010), we found that among strong foEs (> 6 MHz) only 10% of foEs values were correct; 90% were artificially enhanced, on average by 1 MHz and in extreme cases (some oblique reflections) by more than 3 MHz. Oblique reflections accounted for 34% of all reflections. For other ionospheric parameters such as foF2 or foE the problem is less severe, because a non-mirror reflection delays the extraordinary mode with respect to the ordinary mode so that the two are separated on ionograms, and oblique reflections are less frequent than with the patchy Es layer. At high latitudes a further problem is caused by the z-mode, which is sometimes difficult to distinguish from the ordinary mode.
Quality-control of an hourly rainfall dataset and climatology of extremes for the UK.
Blenkinsop, Stephen; Lewis, Elizabeth; Chan, Steven C; Fowler, Hayley J
2017-02-01
Sub-daily rainfall extremes may be associated with flash flooding, particularly in urban areas but, compared with extremes on daily timescales, have been relatively little studied in many regions. This paper describes a new, hourly rainfall dataset for the UK based on ∼1600 rain gauges from three different data sources. This includes tipping bucket rain gauge data from the UK Environment Agency (EA), which has been collected for operational purposes, principally flood forecasting. Significant problems in the use of such data for the analysis of extreme events include the recording of accumulated totals, high frequency bucket tips, rain gauge recording errors and the non-operation of gauges. Given the prospect of an intensification of short-duration rainfall in a warming climate, the identification of such errors is essential if sub-daily datasets are to be used to better understand extreme events. We therefore first describe a series of procedures developed to quality control this new dataset. We then analyse ∼380 gauges with near-complete hourly records for 1992-2011 and map the seasonal climatology of intense rainfall based on UK hourly extremes using annual maxima, n-largest events and fixed threshold approaches. We find that the highest frequencies and intensities of hourly extreme rainfall occur during summer when the usual orographically defined pattern of extreme rainfall is replaced by a weaker, north-south pattern. A strong diurnal cycle in hourly extremes, peaking in late afternoon to early evening, is also identified in summer and, for some areas, in spring. This likely reflects the different mechanisms that generate sub-daily rainfall, with convection dominating during summer. 
The resulting quality-controlled hourly rainfall dataset will provide considerable value in several contexts, including the development of standard, globally applicable quality-control procedures for sub-daily data, the validation of the new generation of very high-resolution climate models and improved understanding of the drivers of extreme rainfall.
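The three ways of selecting extremes named above (annual maxima, n-largest events, fixed thresholds) can be sketched as follows. This is a minimal illustration on invented data, not the study's processing code; the simplified year length and the sample values are assumptions.

```python
# Sketch (not the paper's code): extracting the three kinds of extreme
# statistics named above from an hourly rainfall record, here a plain
# list of (hour_index, depth_mm) tuples spanning two synthetic "years".

HOURS_PER_YEAR = 24 * 360  # simplified year length for this sketch

def annual_maxima(series):
    """One maximum per year (block maxima)."""
    maxima = {}
    for hour, depth in series:
        year = hour // HOURS_PER_YEAR
        maxima[year] = max(maxima.get(year, 0.0), depth)
    return maxima

def n_largest(series, n):
    """The n largest hourly depths over the whole record."""
    return sorted((d for _, d in series), reverse=True)[:n]

def exceedances(series, threshold):
    """All hourly depths above a fixed threshold."""
    return [d for _, d in series if d > threshold]

series = [(0, 1.2), (5, 14.0), (HOURS_PER_YEAR + 3, 9.5), (HOURS_PER_YEAR + 10, 2.0)]
print(annual_maxima(series))     # one value per year
print(n_largest(series, 2))      # the two largest events
print(exceedances(series, 5.0))  # peaks-over-threshold sample
```

In practice each approach yields a different sample for the climatology: block maxima are simplest, while the n-largest and threshold approaches retain more of the extreme events in each season.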
Process-informed extreme value statistics- Why and how?
NASA Astrophysics Data System (ADS)
Schumann, Andreas; Fischer, Svenja
2017-04-01
In many parts of the world, annual maximum series (AMS) of runoff consist of flood peaks that differ in their genesis. There are several reasons why these differences should be considered. Often multivariate flood characteristics (volumes, shapes) are of interest, and these characteristics depend on the flood types. For regionalization, the main influences on the flood regime have to be specified; if this regime depends on different flood types, type-specific hydro-meteorological and/or watershed characteristics are relevant. The ratios between event types often change over the range of observations. If a majority of events belonging to a certain flood type dominates the extrapolation of a probability distribution function (pdf), it is a problem if this more frequent type is not typical for the extraordinarily large extremes that determine the right tail of the pdf. To consider differences in flood origin, several problems have to be solved. The events have to be separated into groups according to their genesis, which can be difficult for long-past events for which, e.g., precipitation data are not available. Another problem concerns flood type-specific statistics: if block maxima are used, the sample of floods belonging to a certain type is often incomplete, as events of other types overlay the smaller events of that type. Some practically usable statistical tools to solve these problems are presented in a case study. Seasonal models were developed that distinguish not only winter and summer floods but also events with long and short timescales. The pdfs of the two groups of summer floods are combined via a new mixing model. The application to German watersheds demonstrates the advantages of the new model, which gives specific influence to flood types.
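The mixing idea can be illustrated with a small sketch. This is not the authors' model: it only shows the standard construction in which, if the annual maximum is the larger of two independent type-specific maxima, its CDF is the product of the type-specific CDFs. The Gumbel margins and all parameter values are invented for illustration.

```python
import math

# Sketch of a type-mixing model for annual maxima: with two independent
# flood types, P(max <= x) = F1(x) * F2(x).  Gumbel margins and the
# parameter values are illustrative assumptions, not fitted values.

def gumbel_cdf(x, mu, beta):
    return math.exp(-math.exp(-(x - mu) / beta))

def mixed_cdf(x):
    # type 1: frequent, moderate floods; type 2: rarer but heavier-tailed
    return gumbel_cdf(x, mu=100.0, beta=20.0) * gumbel_cdf(x, mu=60.0, beta=60.0)

# Non-exceedance probability of a 300 m^3/s peak under the mixed model:
p = mixed_cdf(300.0)
print(round(p, 4))
```

Because the mixed CDF is the product of two factors, the type with the heavier tail dominates the extrapolated right tail even if the other type dominates the bulk of the observations, which is exactly the concern raised in the abstract.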
Binary optimization for source localization in the inverse problem of ECG.
Potyagaylo, Danila; Cortés, Elisenda Gil; Schulze, Walther H W; Dössel, Olaf
2014-09-01
The goal of ECG imaging (ECGI) is to reconstruct heart electrical activity from body surface potential maps. The problem is ill-posed, which means that it is extremely sensitive to measurement and modeling errors. The most commonly used method to tackle this obstacle is Tikhonov regularization, which consists in converting the original problem into a well-posed one by adding a penalty term. The method, despite all its practical advantages, has a serious drawback: the obtained solution is often over-smoothed, which can hinder precise clinical diagnosis and treatment planning. In this paper, we apply a binary optimization approach to the transmembrane voltage (TMV)-based problem, assuming the TMV to take two possible values according to the heart abnormality under consideration. We investigate the localization of simulated ischemic areas and ectopic foci as well as one clinical infarction case. The abnormality affects only the choice of the binary values, while the core of the algorithms remains the same, making the approach easily adjustable to the application's needs. Two methods were tested: a hybrid metaheuristic approach and the difference-of-convex-functions (DC) algorithm. For this purpose, we performed realistic heart simulations for a complex thorax model and applied the proposed techniques to the obtained ECG signals. Both methods enabled localization of the areas of interest, showing their potential for application in ECGI. For the metaheuristic algorithm, it was necessary to subdivide the heart into regions in order to obtain a stable solution insensitive to the errors, while the analytical DC scheme can be efficiently applied to higher-dimensional problems. With the DC method, we also successfully reconstructed the activation pattern and origin of a simulated extrasystole. In addition, the DC algorithm enables iterative adjustment of the binary values, ensuring robust performance.
Problems of allometric scaling analysis: examples from mammalian reproductive biology.
Martin, Robert D; Genoud, Michel; Hemelrijk, Charlotte K
2005-05-01
Biological scaling analyses employing the widely used bivariate allometric model are beset by at least four interacting problems: (1) choice of an appropriate best-fit line with due attention to the influence of outliers; (2) objective recognition of divergent subsets in the data (allometric grades); (3) potential restrictions on statistical independence resulting from phylogenetic inertia; and (4) the need for extreme caution in inferring causation from correlation. A new non-parametric line-fitting technique has been developed that eliminates requirements for normality of distribution, greatly reduces the influence of outliers and permits objective recognition of grade shifts in substantial datasets. This technique is applied in scaling analyses of mammalian gestation periods and of neonatal body mass in primates. These analyses feed into a re-examination, conducted with partial correlation analysis, of the maternal energy hypothesis relating to mammalian brain evolution, which suggests links between body size and brain size in neonates and adults, gestation period and basal metabolic rate. Much has been made of the potential problem of phylogenetic inertia as a confounding factor in scaling analyses. However, this problem may be less severe than suspected earlier because nested analyses of variance conducted on residual variation (rather than on raw values) reveal that there is considerable variance at low taxonomic levels. In fact, limited divergence in body size between closely related species is one of the prime examples of phylogenetic inertia. One common approach to eliminating perceived problems of phylogenetic inertia in allometric analyses has been calculation of 'independent contrast values'. It is demonstrated that the reasoning behind this approach is flawed in several ways. 
Calculation of contrast values for closely related species of similar body size is, in fact, highly questionable, particularly when there are major deviations from the best-fit line for the scaling relationship under scrutiny.
Farooqi, Aijaz; Hägglöf, Bruno; Sedin, Gunnar; Gothefors, Leif; Serenius, Fredrik
2007-07-01
We investigated a national cohort of extremely immature children with respect to behavioral and emotional problems and social competencies, from the perspectives of parents, teachers, and children themselves. We examined 11-year-old children who were born before 26 completed weeks of gestation in Sweden between 1990 and 1992. All had been evaluated at a corrected age of 36 months. At 11 years of age, 86 of 89 survivors were studied and compared with an equal number of control subjects, matched with respect to age and gender. Behavioral and emotional problems, social competencies, and adaptive functioning at school were evaluated with standardized, well-validated instruments, including parent and teacher report questionnaires and a child self-report, administered by mail. Compared with control subjects, parents of extremely immature children reported significantly more problems with internalizing behaviors (anxiety/depression, withdrawn, and somatic problems) and attention, thought, and social problems. Teachers reported a similar pattern. Reports from children showed a trend toward increased depression symptoms compared with control subjects. Multivariate analysis of covariance of parent-reported behavioral problems revealed no interactions, but significant main effects emerged for group status (extremely immature versus control), family function, social risk, and presence of a chronic medical condition, with all effect sizes being medium and accounting for 8% to 12% of the variance. Multivariate analysis of covariance of teacher-reported behavioral problems showed significant effects for group status and gender but not for the covariates mentioned above. According to the teachers' ratings, extremely immature children were less well adjusted to the school environment than were control subjects. However, a majority of extremely immature children (85%) were functioning in mainstream schools without major adjustment problems. 
Despite favorable outcomes for many children born at the limit of viability, these children are at risk for mental health problems, with poorer school results.
2018-01-01
Natural hazards (events that may cause actual disasters) are established in the literature as major causes of massive and destructive problems worldwide. Occurrences of earthquakes, floods and heat waves affect millions of people through impacts that include hospitalisation, loss of life and economic hardship. The focus of this study was on reducing the risk of the disasters that occur because of extremely high temperatures and heat waves. Modelling average maximum daily temperature (AMDT) supports disaster risk reduction and may also help countries prepare for extreme heat. This study discusses the use of the r largest order statistics approach of extreme value theory to model AMDT over a period of 11 years, 2000–2010. A generalised extreme value distribution for the r largest order statistics is fitted to the annual maxima in order to study the behaviour of the r largest order statistics. The method of maximum likelihood is used to estimate the target parameters, and the frequency of occurrence of the hottest days is assessed. The study presents a case study of South Africa in which data for the non-winter season (September–April of each year) are used. The meteorological data used are the AMDT collected by the South African Weather Service and provided by Eskom. The estimate of the shape parameter reveals evidence of a Weibull class as an appropriate distribution for modelling AMDT in South Africa. Extreme quantiles for specified return periods are estimated using the quantile function, and the best model is chosen through the deviance statistic with the support of graphical diagnostic tools. The Entropy Difference Test (EDT) is used as a specification test for diagnosing the fit of the models to the data.
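The annual-maxima fitting step can be sketched with standard tools. This is not the study's code or data: the sample below is synthetic, the parameter values are invented, and note that scipy's shape convention is c = -ξ, so a Weibull-class (upper-bounded) distribution corresponds to a *positive* scipy shape.

```python
import numpy as np
from scipy.stats import genextreme

# Illustrative sketch: fit a GEV to synthetic annual-maximum temperatures
# and read off a 50-year return level via the quantile (ppf) function.
# scipy shape c = -xi, so Weibull class (xi < 0) means c > 0 here.

sample = genextreme.rvs(0.2, loc=32.0, scale=1.5, size=300, random_state=12345)

c_hat, loc_hat, scale_hat = genextreme.fit(sample)
rl_50 = genextreme.ppf(1.0 - 1.0 / 50.0, c_hat, loc=loc_hat, scale=scale_hat)

print(f"shape (scipy c): {c_hat:.3f}")       # positive -> Weibull class
print(f"50-year return level: {rl_50:.1f} degC")
```

A positive fitted scipy shape, as with the synthetic data here, corresponds to the upper-bounded Weibull class that the study reports for South African AMDT.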
A compliant mechanism for inspecting extremely confined spaces
NASA Astrophysics Data System (ADS)
Mascareñas, David; Moreu, Fernando; Cantu, Precious; Shields, Daniel; Wadden, Jack; El Hadedy, Mohamed; Farrar, Charles
2017-11-01
We present a novel, compliant mechanism that provides the capability to navigate extremely confined spaces for the purpose of infrastructure inspection. Extremely confined spaces are commonly encountered during infrastructure inspection. Examples of such spaces include pipes, conduits, and ventilation ducts. Often these infrastructure features go uninspected simply because there is no viable way to access their interior. In addition, it is not uncommon for extremely confined spaces to possess a maze-like architecture that must be selectively navigated in order to properly perform an inspection. Efforts by the imaging sensor community have resulted in the development of imaging sensors on the millimeter length scale. Due to their compact size, they are able to inspect many extremely confined spaces of interest; however, the means to deliver these sensors to the proper location to obtain the desired images are lacking. To address this problem, we draw inspiration from the field of endoscopic surgery. Specifically, we consider the work that has already been done to create long flexible needles that are capable of being steered through the human body. These devices are typically referred to as 'steerable needles.' Steerable needle technology is not directly applicable to the problem of navigating maze-like arrangements of extremely confined spaces, but it does provide guidance on how this problem should be approached. Specifically, the super-elastic nitinol tubing material that allows steerable needles to operate is also appropriate for the problem of navigating maze-like arrangements of extremely confined spaces. Furthermore, the portion of the mechanism that enters the extremely confined space is completely mechanical in nature. The mechanical nature of the device is an advantage when the extremely confined space features environmental hazards such as radiation that could degrade an electromechanically operated mechanism. 
Here, we present a compliant mechanism developed to navigate maze-like arrangements of extremely confined spaces. The mechanism is shown to be able to selectively navigate past three 90° bends. The ability to selectively navigate extremely confined spaces opens up new possibilities to use emerging miniature imaging technology for infrastructure inspection.
NASA Astrophysics Data System (ADS)
Nursamsiah; Nugroho Sugianto, Denny; Suprijanto, Jusup; Munasik; Yulianto, Bambang
2018-02-01
Information on extreme wave height return levels is required for maritime planning and management. A recommended method for analyzing extreme waves is the Generalized Pareto Distribution (GPD). Seasonal variation is often considered in extreme wave models. This research aims to identify the best GPD model by considering the seasonal variation of extreme waves. Using the 95th percentile as the threshold for extreme significant wave height, seasonal and non-seasonal GPD models were fitted. The Kolmogorov-Smirnov test was applied to assess the goodness of fit of the GPD models. The return values from the seasonal and non-seasonal GPD models were compared, with the definition of the return value as the criterion. The Kolmogorov-Smirnov test shows that the GPD fits the data very well for both the seasonal and the non-seasonal model. The seasonal return values give better information about wave height characteristics.
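The peaks-over-threshold step described above can be sketched as follows. The synthetic significant-wave-height series (an exponential "body" plus an offset) is an assumption for illustration, not the study's data.

```python
import numpy as np
from scipy.stats import genpareto, kstest

# Sketch of the method above: take the 95th percentile as the threshold,
# fit a GPD to the excesses, and check the fit with a Kolmogorov-Smirnov
# test.  The hourly Hs values below are synthetic.

rng = np.random.default_rng(7)
hs = rng.exponential(scale=0.8, size=5000) + 0.5   # hourly Hs in metres

u = np.quantile(hs, 0.95)      # 95th-percentile threshold
excess = hs[hs > u] - u        # excesses over the threshold

c_hat, _, scale_hat = genpareto.fit(excess, floc=0.0)
stat, p_value = kstest(excess, genpareto(c_hat, loc=0.0, scale=scale_hat).cdf)
print(f"threshold {u:.2f} m, GPD shape {c_hat:.2f}, KS p-value {p_value:.3f}")
```

A seasonal variant would simply repeat the threshold selection and fit within each season's subset of the series, as the abstract describes.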
Investigating NARCCAP Precipitation Extremes via Bivariate Extreme Value Theory (Invited)
NASA Astrophysics Data System (ADS)
Weller, G. B.; Cooley, D. S.; Sain, S. R.; Bukovsky, M. S.; Mearns, L. O.
2013-12-01
We introduce methodology from statistical extreme value theory to examine the ability of reanalysis-driven regional climate models to simulate past daily precipitation extremes. Going beyond a comparison of summary statistics such as 20-year return values, we study whether the most extreme precipitation events produced by climate model simulations exhibit correspondence to the most extreme events seen in observational records. The extent of this correspondence is formulated via the statistical concept of tail dependence. We examine several case studies of extreme precipitation events simulated by the six models of the North American Regional Climate Change Assessment Program (NARCCAP) driven by NCEP reanalysis. It is found that the NARCCAP models generally reproduce daily winter precipitation extremes along the Pacific coast quite well; in contrast, simulation of past daily summer precipitation extremes in a central US region is poor. Some differences in the strength of extremal correspondence are seen in the central region between models which employ spectral nudging and those which do not. We demonstrate how these techniques may be used to draw a link between extreme precipitation events and large-scale atmospheric drivers, as well as to downscale extreme precipitation simulated by a future run of a regional climate model. Specifically, we examine potential future changes in the nature of extreme precipitation along the Pacific coast produced by the pineapple express (PE) phenomenon. A link between extreme precipitation events and a "PE Index" derived from North Pacific sea-surface pressure fields is found. This link is used to study PE-influenced extreme precipitation produced by a future-scenario climate model run.
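The tail-dependence comparison can be illustrated with a simple empirical estimator. This is a generic sketch on synthetic data, not the NARCCAP analysis: it estimates chi(u) = P(model exceeds its u-quantile | observations exceed theirs), where chi near 1 means the model's extreme days line up with the observed ones and chi near 0 means they do not.

```python
import numpy as np

# Empirical tail-dependence sketch: fraction of days on which the model
# series exceeds its own upper quantile, among days on which the
# observed series exceeds its upper quantile.  Synthetic data only.

def chi_hat(x, y, u=0.95):
    qx, qy = np.quantile(x, u), np.quantile(y, u)
    above_x = x > qx
    return float(np.mean(y[above_x] > qy))

rng = np.random.default_rng(0)
obs = rng.gamma(2.0, 5.0, size=4000)                 # "observed" precipitation
model_good = obs + rng.normal(0.0, 1.0, size=4000)   # tracks the observations
model_bad = rng.gamma(2.0, 5.0, size=4000)           # independent of them

print(round(chi_hat(obs, model_good), 2))  # close to 1: strong tail dependence
print(round(chi_hat(obs, model_bad), 2))   # close to 0: extremes do not align
```

In the study's terms, a model with high chi against observations reproduces the same extreme events, not merely the same return-level summary statistics.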
Countering the Pedagogy of Extremism: Reflective Narratives and Critiques of Problem-Based Learning
ERIC Educational Resources Information Center
Woo, Chris W. H.; Laxman, Kumar
2013-01-01
This paper is a critique against "purist" pedagogies found in the literature of student-centred learning. The article reproves extremism in education and questions the absolutism and teleological truths expounded in exclusive problem-based learning. The paper articulates the framework of a unifying pedagogical practice through Eve…
Climate and its change over the Tibetan Plateau and its Surroundings in 1963-2015
NASA Astrophysics Data System (ADS)
Ding, J.; Cuo, L.
2017-12-01
The Tibetan Plateau and its surroundings (TPS, 23°-43°N, 73°-106°E) lie in the southwest of China and include the Tibet Autonomous Region, Qinghai Province, southern Xinjiang Uygur Autonomous Region, part of Gansu Province, western Sichuan Province, and northern Yunnan Province. The region is of strategic importance for water resources because it is the headwater of ten large rivers that support a population of more than 1.6 billion. In this study, we use daily maximum and minimum temperature, precipitation and wind speed for 1963-2015, obtained from the Climate Data Center of the China Meteorological Administration and the Qinghai Meteorological Bureau, to investigate extreme climate conditions and their changes over the TPS. The extreme events are selected based on annual extreme values and percentiles. The annual extreme value approach produces one value each year for each variable, which enables us to examine the magnitude of extreme events, whereas the percentile approach selects extreme values by setting the 95th percentile as the threshold for maximum temperature, precipitation and wind speed, and the 5th percentile for minimum temperature. The percentile approach enables us to investigate not only the magnitude but also the frequency of extreme events. In addition, Mann-Kendall trend and mutation analyses were applied to analyze changes in mean and extreme conditions. The results will help us understand the extreme events of the past five decades on the TPS and will provide valuable information for the upcoming IPCC reports on climate change.
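The Mann-Kendall trend test mentioned above can be sketched in a few lines. This is the textbook statistic without the tie correction, applied to a made-up annual series, not the study's data.

```python
import math

# Minimal Mann-Kendall trend test (no tie correction): S counts
# concordant minus discordant pairs; z uses the normal approximation
# with a continuity correction.  |z| > 1.96 -> significant at 5%.

def mann_kendall(x):
    n = len(x)
    s = sum((x[j] > x[i]) - (x[j] < x[i])
            for i in range(n) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z

# invented annual-maximum temperature series with an upward trend
rising = [20.1, 20.4, 20.3, 20.9, 21.2, 21.1, 21.8, 22.0, 22.3, 22.9]
s, z = mann_kendall(rising)
print(s, round(z, 2))
```

For real series with tied values and serial correlation, the variance term needs the usual corrections; the sketch shows only the core statistic.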
NASA Astrophysics Data System (ADS)
Zhang, Yin; Xia, Jun; She, Dunxian
2018-01-01
In recent decades, extreme precipitation events have become a research hotspot worldwide. Based on 12 extreme precipitation indices, the spatiotemporal variation and statistical characteristics of precipitation extremes in the middle reaches of the Yellow River Basin (MRYRB) during 1960-2013 were investigated. The results showed that the values of most extreme precipitation indices (except consecutive dry days, CDD) increased from the northwest to the southeast of the MRYRB, reflecting that the southeast is the wettest region in the study area. Temporally, the precipitation extremes presented a drying trend with less frequent precipitation events. The generalized extreme value (GEV) distribution was selected to fit the time series of all indices, and the quantile values for the 50-year return period showed a spatial pattern similar to that of the corresponding extreme precipitation indices during 1960-2013, indicating a higher risk of extreme precipitation in the southeast of the MRYRB. Furthermore, the changes in the probability distribution functions of the indices between 1960-1986 and 1987-2013 revealed a drying tendency in the study area. Both the El Niño-Southern Oscillation (ENSO) and the Pacific Decadal Oscillation (PDO) were shown to have a strong influence on precipitation extremes in the MRYRB. The results of this study are useful for understanding how local precipitation extremes change, which will help in preventing the natural hazards they cause.
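The 50-year quantile referred to above comes from inverting the GEV CDF. The formula below is standard; the parameter values are invented for illustration and are not the fitted MRYRB values.

```python
import math

# GEV return level: z_T = mu + (sigma/xi) * ((-ln(1 - 1/T))**(-xi) - 1)
# for xi != 0, with the Gumbel limit z_T = mu - sigma * ln(-ln(1 - 1/T))
# as xi -> 0.  Illustrative parameters only.

def gev_return_level(mu, sigma, xi, T):
    y = -math.log(1.0 - 1.0 / T)
    if abs(xi) < 1e-9:                    # Gumbel limit
        return mu - sigma * math.log(y)
    return mu + (sigma / xi) * (y ** (-xi) - 1.0)

# e.g. annual-maximum daily precipitation with mu=60 mm, sigma=15 mm, xi=0.1:
print(round(gev_return_level(60.0, 15.0, 0.1, 50.0), 1))
```

A positive shape (xi > 0) thickens the upper tail, so the 50-year level sits well above the Gumbel value with the same location and scale.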
Preterm birth and developmental problems in the preschool age. Part I: minor motor problems.
Ferrari, Fabrizio; Gallo, Claudio; Pugliese, Marisa; Guidotti, Isotta; Gavioli, Sara; Coccolini, Elena; Zagni, Paola; Della Casa, Elisa; Rossi, Cecilia; Lugli, Licia; Todeschini, Alessandra; Ori, Luca; Bertoncelli, Natascia
2012-11-01
Nearly half of very preterm (VP) and extremely preterm (EP) infants suffer from minor disabilities. This paper reviews the literature dealing with motor problems other than cerebral palsy (CP) during infancy and preschool age. The term "minor motor problems" indicates a wide spectrum of motor disorders other than CP; "minor" does not mean "minimal", as a relevant proportion of these preterm infants will develop academic and behavioural problems at school age. Early-onset disorders consist of abnormal general movements (GMs), transient dystonia and postural instability; these conditions usually fade during the first months. They were underestimated in the past; recently, qualitative assessment of GMs using Prechtl's method has become a major item of the neurological examination. Late-onset disorders include developmental coordination disorder (DCD) and/or minor neurological dysfunction (MND); both terms cover partly overlapping problems. Simple MND (MND-1) and complex MND (MND-2) can be identified, and MND-2 carries a higher risk of learning and behavioural disorders. A relationship between the quality of GMs and MND in childhood has recently been described. The Touwen infant neurological examination (TINE) can reliably detect neurological signs of MND even in infancy. However, the prognostic value of these disorders requires further investigation.
Self-force as a cosmic censor in the Kerr overspinning problem
NASA Astrophysics Data System (ADS)
Colleoni, Marta; Barack, Leor; Shah, Abhay G.; van de Meent, Maarten
2015-10-01
It is known that a near-extremal Kerr black hole can be spun up beyond its extremal limit by capturing a test particle. Here we show that overspinning is always averted once backreaction from the particle's own gravity is properly taken into account. We focus on nonspinning, uncharged, massive particles thrown in along the equatorial plane and work in the first-order self-force approximation (i.e., we include all relevant corrections to the particle's acceleration through linear order in the ratio, assumed small, between the particle's energy and the black hole's mass). Our calculation is a numerical implementation of a recent analysis by two of us [Phys. Rev. D 91, 104024 (2015)], in which a necessary and sufficient "censorship" condition was formulated for the capture scenario, involving certain self-force quantities calculated on the one-parameter family of unstable circular geodesics in the extremal limit. The self-force information accounts both for radiative losses and for the finite-mass correction to the critical value of the impact parameter. Here we obtain the required self-force data and present strong evidence to suggest that captured particles never drive the black hole beyond its extremal limit. We show, however, that, within our first-order self-force approximation, it is possible to reach the extremal limit with a suitable choice of initial orbital parameters. To rule out such a possibility would require (currently unavailable) information about higher-order self-force corrections.
NASA Astrophysics Data System (ADS)
Avanzi, Francesco; De Michele, Carlo; Gabriele, Salvatore; Ghezzi, Antonio; Rosso, Renzo
2015-04-01
Here, we show how atmospheric circulation and topography govern the variability of the parameters of depth-duration-frequency (DDF) curves, and we discuss the physical implications of this variability for the formation of extreme precipitation at high elevations. A DDF curve gives the value of the maximum annual precipitation H as a function of duration D and probability level F. We consider around 1500 stations over the Italian territory, each with at least 20 years of data of maximum annual precipitation depth at different durations. We estimated the DDF parameters at each location by using the asymptotic distribution of extreme values, i.e. the Generalized Extreme Value (GEV) distribution, and by adopting a statistical simple scale invariance hypothesis. Consequently, a DDF curve depends on five parameters. A first set relates H to the duration (namely, the mean value of the annual maximum precipitation depth for unit duration and the scaling exponent), while a second set links H to F (namely, a scale, position and shape parameter). The value of the shape parameter determines the type of random variable (unbounded, upper or lower bounded). This extensive analysis shows that the variability of the mean value of the annual maximum precipitation depth for unit duration reflects the coupled effect of topography and the modal direction of moisture flux during extreme events. Median values of this parameter decrease with elevation. We call this phenomenon the "reverse orographic effect" on extreme precipitation of short durations, since it is in contrast with general knowledge about the orographic effect on mean precipitation. Moreover, the scaling exponent is mainly driven by topography alone, with increasing values at increasing elevations. Therefore, the quantiles of H(D,F) at durations greater than the unit duration turn out to be more variable at high elevations than at low elevations. 
Additionally, the analysis of the variability of the shape parameter with elevation shows that extreme events at high elevations appear to be distributed according to an upper-bounded probability distribution. This evidence could be a characteristic sign of the formation of extreme precipitation events at high elevations.
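The simple-scaling hypothesis used above can be sketched as follows: under simple scale invariance, the quantile of H at duration D is the unit-duration quantile rescaled by D**n. The parameter values below (unit-duration quantile and scaling exponent) are invented for illustration, not estimates from the Italian dataset.

```python
# Sketch of the statistical simple-scaling assumption: the quantile of
# the annual maximum depth at duration D hours equals the 1-hour
# quantile times D**n.  q1 and n are illustrative assumptions.

def ddf_quantile(q1, D, n):
    """Quantile of H at duration D hours from the 1-hour quantile q1."""
    return q1 * D ** n

q1 = 40.0   # e.g. a fixed-F quantile of the 1-hour annual maximum, in mm
n = 0.3     # scaling exponent (higher at high elevations, per the text)

for D in (1, 3, 6, 12, 24):
    print(D, round(ddf_quantile(q1, D, n), 1))
```

Because the quantile grows like D**n, a larger scaling exponent at high elevations makes the long-duration quantiles more variable there, which is the behaviour the abstract reports.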
Extreme events in total ozone over Arosa - Part 1: Application of extreme value theory
NASA Astrophysics Data System (ADS)
Rieder, H. E.; Staehelin, J.; Maeder, J. A.; Peter, T.; Ribatet, M.; Davison, A. C.; Stübi, R.; Weihs, P.; Holawe, F.
2010-10-01
In this study, ideas from extreme value theory are for the first time applied in the field of stratospheric ozone research, because statistical analysis showed that previously used concepts assuming a Gaussian distribution (e.g. fixed deviations from mean values) of total ozone data do not adequately address the structure of the extremes. We show that statistical extreme value methods are appropriate to identify ozone extremes and to describe the tails of the Arosa (Switzerland) total ozone time series. In order to accommodate the seasonal cycle in total ozone, a daily moving threshold was determined and used, with tools from extreme value theory, to analyse the frequency of days with extreme low (termed ELOs) and high (termed EHOs) total ozone at Arosa. The analysis shows that the Generalized Pareto Distribution (GPD) provides an appropriate model for the frequency distribution of total ozone above or below a mathematically well-defined threshold, thus providing a statistical description of ELOs and EHOs. The results show an increase in ELOs and a decrease in EHOs during the last decades. The fitted model represents the tails of the total ozone data set with high accuracy over the entire range (including absolute monthly minima and maxima), and enables a precise computation of the frequency distribution of ozone mini-holes (using constant thresholds). Analyzing the tails instead of a small fraction of days below constant thresholds provides deeper insight into the time series properties. Fingerprints of dynamical (e.g. ENSO, NAO) and chemical features (e.g. strong polar vortex ozone loss), and major volcanic eruptions, can be identified in the observed frequency of extreme events throughout the time series. Overall the new approach to analysis of extremes provides more information on time series properties and variability than previous approaches that use only monthly averages and/or mini-holes and mini-highs.
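The daily moving threshold idea can be sketched with a toy example: compute a day-of-year quantile from a multi-year series and flag days beyond it. This is not the Arosa analysis; the "ozone" values, the tiny sample, and the quantile level are placeholders chosen only so the sketch flags something.

```python
# Sketch of a day-of-year moving threshold: an empirical quantile is
# computed separately for each day of year, then observations beyond
# their day's threshold are flagged (here, as EHO candidates).

def moving_threshold(values_by_doy, q):
    """Per-day-of-year empirical quantile from a {doy: [values]} dict."""
    out = {}
    for doy, vals in values_by_doy.items():
        s = sorted(vals)
        k = min(len(s) - 1, int(q * len(s)))
        out[doy] = s[k]
    return out

# three "years" of observations for two days of year (Dobson units)
by_doy = {1: [300, 320, 310], 2: [280, 305, 290]}
upper = moving_threshold(by_doy, 0.6)  # low level only because n = 3 here
ehos = {doy: [v for v in vals if v > upper[doy]]
        for doy, vals in by_doy.items()}
print(upper, ehos)
```

Because the threshold follows the seasonal cycle, the flagged exceedances are extremes relative to the time of year, which is what makes the subsequent GPD fit to the tail meaningful.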
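The threshold-excess model described in the abstract above can be sketched with standard tools. The following is a minimal illustration on synthetic data (not the Arosa series), using a single constant threshold in place of the paper's daily moving threshold:

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)

# Synthetic "total ozone" values in Dobson units; a stand-in for a real series.
ozone = rng.normal(330.0, 30.0, size=20_000)

# Peaks-over-threshold: keep exceedances above a high quantile (the study uses
# a daily moving threshold; a constant 99th percentile is used here for brevity).
threshold = np.quantile(ozone, 0.99)
excesses = ozone[ozone > threshold] - threshold

# Fit the Generalized Pareto Distribution to the excesses (location fixed at 0).
shape, loc, scale = genpareto.fit(excesses, floc=0.0)

# Probability of exceeding the threshold by more than 20 DU under the fitted GPD.
p_20 = genpareto.sf(20.0, shape, loc=loc, scale=scale)
print(shape, scale, p_20)
```

The same fit applied to deficits below a low threshold (sign-flipped data) gives the corresponding model for extreme-low events.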
Self-Regulation in Children Born with Extremely Low Birth Weight at 2 Years Old: A Comparison Study
ERIC Educational Resources Information Center
Lynn, Lisa N.; Cuskelly, Monica; Gray, Peter H.; O'Callaghan, Michael J.
2012-01-01
Survival rates for children born with extremely low birth weight (ELBW) are increasing; however, many of these children experience later problems with learning. This study adopted an integrated approach to these problems, involving the self-regulatory tasks of inhibition and delay of gratification and relevant individual factors including…
NASA Technical Reports Server (NTRS)
Hopkins, Randall C.; Benzing, Daniel A.
1998-01-01
Improvements in uncertainties in the values of radiant intensity (I) can be accomplished mainly by improving the calibration process and by minimizing the difference between the background and engine plume radiance. For engine tests in which the plume is extremely bright, the difference in luminance between the calibration lamp and the engine plume radiance can be so large as to cause relatively large uncertainties in the values of R. This is due to the small aperture necessary on the receiving optics to avoid saturating the instrument. However, this is not a problem with the SSME engine, since liquid oxygen/hydrogen combustion is not as bright as that of some other fuels. Applying the instrumentation to other types of engine tests may require a much brighter calibration lamp.
Inflationary dynamics for matrix eigenvalue problems
Heller, Eric J.; Kaplan, Lev; Pollmann, Frank
2008-01-01
Many fields of science and engineering require finding eigenvalues and eigenvectors of large matrices. The solutions can represent oscillatory modes of a bridge, a violin, the disposition of electrons around an atom or molecule, the acoustic modes of a concert hall, or hundreds of other physical quantities. Often only the few eigenpairs with the lowest or highest frequency (extremal solutions) are needed. Methods that have been developed over the past 60 years to solve such problems include the Lanczos algorithm, Jacobi–Davidson techniques, and the conjugate gradient method. Here, we present a way to solve the extremal eigenvalue/eigenvector problem, turning it into a nonlinear classical mechanical system with a modified Lagrangian constraint. The constraint induces exponential inflationary growth of the desired extremal solutions. PMID:18511564
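The paper's inflationary dynamics are not reproduced here; as a baseline for the same task (finding an extremal eigenpair using only matrix-vector products, like the Lanczos-type methods mentioned), a minimal power-iteration sketch looks like:

```python
import numpy as np

def extremal_eigenpair(A, iters=500, seed=0):
    """Largest-magnitude eigenpair of a symmetric matrix by power iteration."""
    rng = np.random.default_rng(seed)
    v = rng.normal(size=A.shape[0])
    v /= np.linalg.norm(v)
    for _ in range(iters):
        w = A @ v                 # one matrix-vector product per step
        v = w / np.linalg.norm(w)
    lam = v @ A @ v               # Rayleigh quotient at convergence
    return lam, v

# Small symmetric test matrix with known spectrum {1, 2, 5}.
A = np.diag([1.0, 2.0, 5.0])
lam, v = extremal_eigenpair(A)
print(lam)
```

Convergence is geometric in the ratio of the two largest eigenvalue magnitudes, which is why more sophisticated schemes (Lanczos, Jacobi-Davidson, or the paper's constrained dynamics) are preferred for large, ill-separated spectra.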
Gerdes, N; Farin, E
2016-10-01
Objective: Taking fibromyalgia syndrome (FMS) as an example, the article illustrates a problem that, to our knowledge, has not been addressed in rehabilitation research so far: according to our large dataset, a sizeable proportion of patients had to be sent home with extremely severe burdens (< 2nd percentile of the normal population) at discharge, in spite of good improvements during their stay. Data and methods: Since 2009, patients in the RehaKlinikum Bad Säckingen, an in-patient rehab center for orthopedic-rheumatic diseases, have answered the questionnaire "Indicators of Rehabilitation Status" (IRES) at the beginning and the end of their stay. We analysed IRES data of 1,803 patients with FMS (94% women). In addition to analyses of change, we determined the degrees of severity at admission and discharge on the basis of a comparison with the normative sample of the IRES. In order to predict membership of the high-risk group of patients with still "extremely severe" values at discharge, we performed binary logistic regression analyses. Results: At admission, about 90% of the patients showed either "extreme" (65%, < 2nd percentile) or "severe" (27%, 2nd-10th percentile) values on the IRES summary score as well as on the scores for "psychic status", "pain", "symptoms of orthopedic and cardiovascular diseases", and "functioning in everyday life". In sum, then, FMS patients come to rehabilitation with multiple burdens of a severe to extreme degree. At discharge, the mean summary score had improved with a "strong" effect size of SRM = 1.07. In spite of these good overall improvements, however, 37.4% of the patients went home with "extreme" burdens remaining, even though almost 60% of them had experienced "strong" (28%) or "relevant" (31%) improvements. The most important predictor of affiliation to this high-risk group was, as expected, the IRES summary score at admission. 
Unexpectedly, some characteristics of social status, such as lower household income and lower levels of education, were also influential. Conclusion: In rehabilitation research, analyses of change between pre- and post-measurement values should be accompanied by assessments of the severity of rehabilitation status at discharge, because even good improvements do not necessarily mean that a patient has been rehabilitated successfully. © Georg Thieme Verlag KG Stuttgart · New York.
Estimation of muscle torque in various combat sports.
Pędzich, Wioletta; Mastalerz, Andrzej; Sadowski, Jerzy
2012-01-01
The purpose of the research was to compare the muscle torques of elite combat-sport athletes. Twelve taekwondo WTF athletes, twelve taekwondo ITF athletes and nine boxers participated in the study. Muscle torques were measured under static conditions on a dedicated stand belonging to the Department of Biomechanics. The sum of the relative muscle torques of the right and left lower extremities was significantly higher for taekwondo WTF athletes than for boxers (16%, p < 0.001 for the right and 10%, p < 0.05 for the left extremities) and taekwondo ITF athletes (10%, p < 0.05 for the right and 8% for the left extremities). Taekwondo ITF athletes attained significantly higher absolute muscle torque values than boxers for the elbow flexors (20%, p < 0.05 for the right and 11% for the left extremities) and extensors (14% for the right and 18%, p < 0.05 for the left extremities), and for the shoulder flexors (10% for the right and 12%, p < 0.05 for the left extremities) and extensors (11% for the right and 1% for the left extremities). Taekwondo WTF and taekwondo ITF athletes obtained significantly different relative muscle torque values for the hip flexors (16%, p < 0.05) and extensors (11%, p < 0.05) of the right extremities.
Nurse Bullying: A Review And A Proposed Solution.
Castronovo, Marie A; Pullizzi, Amy; Evans, ShaKhira
2016-01-01
Nurse bullying is an extremely common phenomenon that has detrimental consequences for nurses, patients, health care institutions, and the nursing profession itself. It has even been linked to increased patient mortality. This article demonstrates the critical need to resolve the issue of nurse bullying. It also shows that previous attempts at resolution have not been successful, partly because the problem is relatively unacknowledged outside the nursing profession. To resolve the problem of nurse bullying, we believe that the solution must include an incentive for institutions to implement the necessary interventions and to ensure that they are effective. We propose that a measurement of the level of nurse bullying be factored into the calculation of the value-based incentive payment in the Hospital Value-Based Purchasing program. To facilitate this, we propose the development and implementation of a survey similar to the Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) survey. However, whereas the HCAHPS survey measures patients' perspectives of hospital care, this survey would measure nurses' perspectives of workplace bullying. Copyright © 2015 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Darko, Deborah; Adjei, Kwaku A.; Appiah-Adjei, Emmanuel K.; Odai, Samuel N.; Obuobie, Emmanuel; Asmah, Ruby
2018-06-01
The extent to which statistical bias-adjusted outputs of two regional climate models alter the projected change signals for the mean (and extreme) rainfall and temperature over the Volta Basin is evaluated. The outputs from two regional climate models in the Coordinated Regional Climate Downscaling Experiment for Africa (CORDEX-Africa) are bias adjusted using the quantile mapping technique. Annual maxima rainfall and temperature with their 10- and 20-year return values for the present (1981-2010) and future (2051-2080) climates are estimated using extreme value analyses. Moderate extremes are evaluated using extreme indices (viz. percentile-based, duration-based, and intensity-based). Bias adjustment of the original (bias-unadjusted) models improves the reproduction of mean rainfall and temperature for the present climate. However, the bias-adjusted models poorly reproduce the 10- and 20-year return values for rainfall and maximum temperature whereas the extreme indices are reproduced satisfactorily for the present climate. Consequently, projected changes in rainfall and temperature extremes were weak. The bias adjustment results in the reduction of the change signals for the mean rainfall while the mean temperature signals are rather magnified. The projected changes for the original mean climate and extremes are not conserved after bias adjustment with the exception of duration-based extreme indices.
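The quantile mapping technique used above can be sketched empirically; the following is a generic illustration on synthetic data (not the CORDEX-Africa pipeline), correcting a model with a constant wet bias against "observations":

```python
import numpy as np

def quantile_map(model_hist, obs_hist, model_fut, n_q=100):
    """Empirical quantile mapping: adjust model values using the
    observed-vs-modelled quantile correspondence of the historical period."""
    q = np.linspace(0.0, 1.0, n_q)
    mod_q = np.quantile(model_hist, q)   # model's historical quantiles
    obs_q = np.quantile(obs_hist, q)     # observed historical quantiles
    # Map each model value to its historical quantile, then to the
    # corresponding observed value (linear interpolation between quantiles).
    return np.interp(model_fut, mod_q, obs_q)

rng = np.random.default_rng(1)
obs = rng.gamma(2.0, 5.0, 5000)            # "observed" rainfall amounts
model = rng.gamma(2.0, 5.0, 5000) + 3.0    # model output with a +3 wet bias
adjusted = quantile_map(model, obs, model)
print(model.mean() - obs.mean(), adjusted.mean() - obs.mean())
```

Note the caveat the abstract itself raises: a mapping trained on the bulk of the distribution does not guarantee faithful tails, which is why the 10- and 20-year return values can remain poorly reproduced after adjustment.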
On the solution of integral equations with strongly singular kernels
NASA Technical Reports Server (NTRS)
Kaya, A. C.; Erdogan, F.
1986-01-01
Some useful formulas are developed to evaluate integrals having a singularity of the form (t-x)^(-m), m ≥ 1. Interpreting the integrals with strong singularities in the Hadamard sense, the results are used to obtain approximate solutions of singular integral equations. A mixed boundary value problem from the theory of elasticity is considered as an example. Particularly for integral equations where the kernel contains, in addition to the dominant term (t-x)^(-m), terms which become unbounded at the end points, the present technique appears to be extremely effective in obtaining rapidly converging numerical results.
On the solution of integral equations with strongly singular kernels
NASA Technical Reports Server (NTRS)
Kaya, A. C.; Erdogan, F.
1985-01-01
In this paper some useful formulas are developed to evaluate integrals having a singularity of the form (t-x)^(-m), m ≥ 1. Interpreting the integrals with strong singularities in the Hadamard sense, the results are used to obtain approximate solutions of singular integral equations. A mixed boundary value problem from the theory of elasticity is considered as an example. Particularly for integral equations where the kernel contains, in addition to the dominant term (t-x)^(-m), terms which become unbounded at the end points, the present technique appears to be extremely effective in obtaining rapidly converging numerical results.
On the solution of integral equations with strongly singular kernels
NASA Technical Reports Server (NTRS)
Kaya, A. C.; Erdogan, F.
1987-01-01
Some useful formulas are developed to evaluate integrals having a singularity of the form (t-x)^(-m), m ≥ 1. Interpreting the integrals with strong singularities in the Hadamard sense, the results are used to obtain approximate solutions of singular integral equations. A mixed boundary value problem from the theory of elasticity is considered as an example. Particularly for integral equations where the kernel contains, in addition to the dominant term (t-x)^(-m), terms which become unbounded at the end points, the present technique appears to be extremely effective in obtaining rapidly converging numerical results.
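The Hadamard (finite-part) interpretation mentioned in these abstracts can be illustrated numerically. The sketch below regularizes a (t-x)^(-2) kernel by subtracting two Taylor terms of the density and integrates the subtracted terms in closed form; it is an illustration of the finite-part idea, not the authors' quadrature formulas:

```python
import math
from scipy.integrate import quad

def hadamard_fp(f, df, x, a=-1.0, b=1.0):
    """Finite-part of the integral of f(t)/(t-x)^2 over (a, b), with a < x < b."""
    def regular(t):
        dt = t - x
        if abs(dt) < 1e-8:
            # Removable singularity: the limit is f''(x)/2 (second difference).
            h = 1e-4
            return (f(x + h) - 2.0 * f(x) + f(x - h)) / (2.0 * h * h)
        # Subtracting f(x) + f'(x)(t-x) leaves a bounded integrand.
        return (f(t) - f(x) - df(x) * dt) / (dt * dt)
    reg, _ = quad(regular, a, b, points=[x])
    fp_const = -(1.0 / (b - x) + 1.0 / (x - a))  # finite part of (t-x)^(-2)
    pv_lin = math.log((b - x) / (x - a))         # principal value of (t-x)^(-1)
    return reg + f(x) * fp_const + df(x) * pv_lin

# Check against the closed form for f(t) = t^2 on (-1, 1):
# t^2/(t-x)^2 = 1 + 2x/(t-x) + x^2/(t-x)^2, integrated term by term.
x = 0.3
val = hadamard_fp(lambda t: t * t, lambda t: 2.0 * t, x)
exact = 2.0 + x * x * (-(1.0 / (1.0 - x) + 1.0 / (1.0 + x))) \
        + 2.0 * x * math.log((1.0 - x) / (1.0 + x))
print(val, exact)
```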
Eating Problems at Age 6 Years in a Whole Population Sample of Extremely Preterm Children
ERIC Educational Resources Information Center
Samara, Muthanna; Johnson, Samantha; Lamberts, Koen; Marlow, Neil; Wolke, Dieter
2010-01-01
Aim: The aim of this study was to investigate the prevalence of eating problems and their association with neurological and behavioural disabilities and growth among children born extremely preterm (EPC) at age 6 years. Method: A standard questionnaire about eating was completed by parents of 223 children (125 males [56.1%], 98 females [43.9%])…
[Growth charts: Impact on the prevalence of nutritional disorders].
Polo Martín, P; Abellan, J J; Nájar Godoy, M I; Álvarez de Laviada Mulero, T
2015-05-01
The references used to assess child growth in Spain are the graphs of the Orbegozo Foundation and the charts of the World Health Organization (WHO). The objective of this study is to analyze the differences between the two charts for weight, height and body mass index, and to assess their relevance for identifying growth or nutritional problems. The values of the extreme percentiles of height, weight and body mass index for each sex from 0 to 10 years in both charts are compared, and absolute differences and Z scores are calculated for each value. To evaluate the impact on the prevalence of the various nutritional or growth disorders, the location of the respective percentile values in each of the charts was assessed. Significant differences were observed for the 3rd percentiles of height and weight, the 97th percentile of weight, and the 85th and 97th percentiles of body mass index. Marked differences were observed for the extreme values of body mass index. During the first years, the Orbegozo charts overestimate the prevalence of malnutrition (by between 2% and 19%, depending on age and sex) compared to the WHO charts; at later ages they underestimate it by between 0.7% and 2.89%. The Orbegozo charts underestimate the prevalence of overweight (by between 2.5% and 14.8%) compared to the WHO charts. The 97th percentile of body mass index in the Orbegozo charts corresponds in most cases to WHO percentiles above 99.99%. The two charts analyzed show differences that are significant from both a clinical and a public health point of view in the estimation of overweight/obesity and malnutrition. Copyright © 2014 Asociación Española de Pediatría. Published by Elsevier España, S.L.U. All rights reserved.
Causes of Glacier Melt Extremes in the Alps Since 1949
NASA Astrophysics Data System (ADS)
Thibert, E.; Dkengne Sielenou, P.; Vionnet, V.; Eckert, N.; Vincent, C.
2018-01-01
Recent record-breaking glacier melt values are attributable to peculiar extreme events and long-term warming trends that shift averages upward. Analyzing one of the world's longest mass balance series with extreme value statistics, we show that detrending melt anomalies makes it possible to disentangle these effects, leading to a fairer evaluation of the return period of melt extreme values such as 2003, and to characterize them by a more realistic bounded behavior. Using surface energy balance simulations, we show that three independent drivers control melt: global radiation, latent heat, and the amount of snow at the beginning of the melting season. Extremes are governed by large deviations in global radiation combined with sensible heat. Long-term trends are driven by the lengthening of melt duration due to earlier and longer-lasting melting of ice along with melt intensification caused by trends in long-wave irradiance and latent heat due to higher air moisture.
Correlation dimension and phase space contraction via extreme value theory
NASA Astrophysics Data System (ADS)
Faranda, Davide; Vaienti, Sandro
2018-04-01
We show how to obtain theoretical and numerical estimates of correlation dimension and phase space contraction by using the extreme value theory. The maxima of suitable observables sampled along the trajectory of a chaotic dynamical system converge asymptotically to classical extreme value laws where: (i) the inverse of the scale parameter gives the correlation dimension and (ii) the extremal index is associated with the rate of phase space contraction for backward iteration, which in dimension 1 and 2, is closely related to the positive Lyapunov exponent and in higher dimensions is related to the metric entropy. We call it the Dynamical Extremal Index. Numerical estimates are straightforward to obtain as they imply just a simple fit to a univariate distribution. Numerical tests range from low dimensional maps, to generalized Henon maps and climate data. The estimates of the indicators are particularly robust even with relatively short time series.
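A minimal numerical illustration of the scale-parameter estimate described above, run on the fully chaotic logistic map (whose invariant measure has dimension 1) rather than the paper's Hénon maps or climate data:

```python
import numpy as np
from scipy.stats import gumbel_r

# Trajectory of the fully chaotic logistic map x -> 4x(1-x).
n = 400_000
x = np.empty(n)
x[0] = 0.2
for i in range(1, n):
    x[i] = 4.0 * x[i - 1] * (1.0 - x[i - 1])

# Observable: minus the log distance to a reference point on the attractor.
zeta = 0.31
g = -np.log(np.abs(x - zeta) + 1e-300)

# Block maxima of g converge to a Gumbel law whose scale parameter is 1/d,
# where d is the local dimension of the invariant measure (here d = 1).
maxima = g.reshape(-1, 1000).max(axis=1)
loc, scale = gumbel_r.fit(maxima)
d_est = 1.0 / scale
print(d_est)
```

The fit is indeed "just a simple fit to a univariate distribution", as the abstract says; the estimate fluctuates with the number of blocks but should sit near the true dimension of 1.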
Extreme events in total ozone: Spatio-temporal analysis from local to global scale
NASA Astrophysics Data System (ADS)
Rieder, Harald E.; Staehelin, Johannes; Maeder, Jörg A.; Ribatet, Mathieu; di Rocco, Stefania; Jancso, Leonhardt M.; Peter, Thomas; Davison, Anthony C.
2010-05-01
Recently tools from extreme value theory (e.g. Coles, 2001; Ribatet, 2007) have been applied for the first time in the field of stratospheric ozone research, as statistical analysis showed that previously used concepts assuming a Gaussian distribution (e.g. fixed deviations from mean values) of total ozone data do not address the internal data structure concerning extremes adequately (Rieder et al., 2010a,b). A case study of the world's longest total ozone record (Arosa, Switzerland - for details see Staehelin et al., 1998a,b) illustrates that tools based on extreme value theory are appropriate to identify ozone extremes and to describe the tails of the total ozone record. Excursions in the frequency of extreme events reveal "fingerprints" of dynamical factors such as ENSO or NAO, and chemical factors, such as cold Arctic vortex ozone losses, as well as major volcanic eruptions of the 20th century (e.g. Gunung Agung, El Chichón, Mt. Pinatubo). Furthermore, atmospheric loading in ozone depleting substances led to a continuous modification of column ozone in the northern hemisphere also with respect to extreme values (partly again in connection with polar vortex contributions). It is shown that application of extreme value theory allows the identification of many more such fingerprints than conventional time series analysis of annual and seasonal mean values. Especially, the extremal analysis shows the strong influence of dynamics, revealing that even moderate ENSO and NAO events have a discernible effect on total ozone (Rieder et al., 2010b). Overall the extremes concept provides new information on time series properties, variability, trends and the influence of dynamics and chemistry, complementing earlier analyses focusing only on monthly (or annual) mean values. 
The findings described above could also be confirmed for the total ozone records of 5 other long-term series (Belsk, Hohenpeissenberg, Hradec Kralove, Potsdam, Uccle), showing that the strong influence of atmospheric dynamics (NAO, ENSO) on total ozone is a global feature in the northern mid-latitudes (Rieder et al., 2010c). In a next step, frequency distributions of extreme events are analyzed on the global scale (northern and southern mid-latitudes). A specific focus here is whether findings gained through analysis of long-term European ground-based stations can be clearly identified as a global phenomenon. By showing results from these three types of studies, an overview of extreme events in total ozone (and the dynamical and chemical features leading to them) will be presented from local to global scales. References: Coles, S.: An Introduction to Statistical Modeling of Extreme Values, Springer Series in Statistics, ISBN:1852334592, Springer, Berlin, 2001. Ribatet, M.: POT: Modelling peaks over a threshold, R News, 7, 34-36, 2007. Rieder, H.E., Staehelin, J., Maeder, J.A., Ribatet, M., Stübi, R., Weihs, P., Holawe, F., Peter, T., and Davison, A.C. (2010): Extreme events in total ozone over Arosa - Part I: Application of extreme value theory, to be submitted to ACPD. Rieder, H.E., Staehelin, J., Maeder, J.A., Ribatet, M., Stübi, R., Weihs, P., Holawe, F., Peter, T., and Davison, A.C. (2010): Extreme events in total ozone over Arosa - Part II: Fingerprints of atmospheric dynamics and chemistry and effects on mean values and long-term changes, to be submitted to ACPD. Rieder, H.E., Jancso, L., Staehelin, J., Maeder, J.A., Ribatet, M., Peter, T., and Davison, A.C. (2010): Extreme events in total ozone over the northern mid-latitudes: A case study based on long-term data sets from 5 ground-based stations, in preparation. 
Staehelin, J., Renaud, A., Bader, J., McPeters, R., Viatte, P., Hoegger, B., Bugnion, V., Giroud, M., and Schill, H.: Total ozone series at Arosa (Switzerland): Homogenization and data comparison, J. Geophys. Res., 103(D5), 5827-5842, doi:10.1029/97JD02402, 1998a. Staehelin, J., Kegel, R., and Harris, N. R.: Trend analysis of the homogenized total ozone series of Arosa (Switzerland), 1929-1996, J. Geophys. Res., 103(D7), 8389-8400, doi:10.1029/97JD03650, 1998b.
Mathematical aspects of assessing extreme events for the safety of nuclear plants
NASA Astrophysics Data System (ADS)
Potempski, Slawomir; Borysiewicz, Mieczyslaw
2015-04-01
This paper reviews the mathematical methodologies applied for assessing the low frequencies of rare natural events - such as earthquakes, tsunamis, hurricanes or tornadoes, floods (in particular flash floods and storm surges), lightning, and solar flares - from the perspective of the safety assessment of nuclear plants. The statistical methods are usually based on extreme value theory, which deals with the analysis of extreme deviations from the median (or the mean). In this respect the application of various mathematical tools can be useful, such as: the extreme value theorem of Fisher-Tippett-Gnedenko, leading to possible choices of generalized extreme value distributions; the Pickands-Balkema-de Haan theorem for tail fitting; or methods related to large deviation theory. The paper presents the most important stochastic distributions relevant for the statistical analysis of rare events. This concerns, for example, the analysis of data on annual extreme values (maxima - the "Annual Maxima Series" - or minima), or on peak values exceeding given thresholds over some period of interest ("Peak Over Threshold"), or the estimation of the size of the exceedance. Although there is a lack of statistical data directly containing rare events, in some cases it is still possible to extract useful information from existing larger data sets. As an example one can consider the data sets available from web sites for floods, earthquakes or natural hazards in general. Some aspects of such data sets will also be presented, taking into account their usefulness for the practical assessment of the risk to nuclear power plants from extreme weather conditions.
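The "Annual Maxima Series" route mentioned above can be sketched in a few lines; this is a generic illustration on synthetic data (the distributions and record length are assumptions, not values from the paper):

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(2)

# Synthetic 60-year record of daily values; one maximum per "year".
daily = rng.gumbel(loc=10.0, scale=2.0, size=(60, 365))
annual_max = daily.max(axis=1)

# Fit the GEV family (the Fisher-Tippett-Gnedenko limit) to the annual maxima.
shape, loc, scale = genextreme.fit(annual_max)

# T-year return level: the value exceeded with probability 1/T in any year.
T = 100.0
return_level = genextreme.ppf(1.0 - 1.0 / T, shape, loc=loc, scale=scale)
print(return_level)
```

A Peak Over Threshold analysis of the same record would instead fit a GPD to all exceedances of a high threshold, trading the choice of block size for the choice of threshold.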
Combining local search with co-evolution in a remarkably simple way
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boettcher, S.; Percus, A.
2000-05-01
The authors explore a new general-purpose heuristic for finding high-quality solutions to hard optimization problems. The method, called extremal optimization, is inspired by self-organized criticality, a concept introduced to describe emergent complexity in physical systems. In contrast to genetic algorithms, which operate on an entire gene pool of possible solutions, extremal optimization successively replaces extremely undesirable elements of a single sub-optimal solution with new, random ones. Large fluctuations, or avalanches, ensue that efficiently explore many local optima. Drawing upon models used to simulate far-from-equilibrium dynamics, extremal optimization complements heuristics inspired by equilibrium statistical physics, such as simulated annealing. With only one adjustable parameter, its performance has proved competitive with more elaborate methods, especially near phase transitions. Phase transitions are found in many combinatorial optimization problems, and have been conjectured to occur in the region of parameter space containing the hardest instances. We demonstrate how extremal optimization can be implemented for a variety of hard optimization problems. We believe that this will be a useful tool in the investigation of phase transitions in combinatorial optimization, thereby helping to elucidate the origin of computational complexity.
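A minimal τ-EO sketch, here applied to MAX-CUT on the complete bipartite graph K_{3,3} (the instance and parameter values are illustrative assumptions, not the authors' benchmarks):

```python
import random

def tau_eo_maxcut(edges, n, steps=5000, tau=1.4, seed=7):
    """Tau-extremal optimization for MAX-CUT: repeatedly flip a vertex
    chosen by a power law over the fitness ranking (worst ranked first)."""
    rng = random.Random(seed)
    side = [rng.randint(0, 1) for _ in range(n)]
    adj = [[] for _ in range(n)]
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    best = -1
    weights = [(k + 1) ** -tau for k in range(n)]  # rank k with prob ~ k^-tau
    for _ in range(steps):
        cut = sum(side[u] != side[v] for u, v in edges)
        best = max(best, cut)
        # Fitness of a vertex: fraction of its edges that are currently cut.
        fit = [sum(side[u] != side[v] for v in adj[u]) / len(adj[u])
               for u in range(n)]
        order = sorted(range(n), key=lambda u: fit[u])  # worst first
        u = rng.choices(order, weights=weights)[0]
        side[u] ^= 1  # unconditionally flip the selected vertex
    return best

# K_{3,3}: the optimal cut separates the two vertex classes, cutting all 9 edges.
edges = [(u, v) for u in range(3) for v in range(3, 6)]
best = tau_eo_maxcut(edges, 6)
print(best)
```

Note the single adjustable parameter τ mentioned in the abstract: τ → ∞ gives greedy worst-element replacement, while small τ approaches a random walk.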
Weighted mining of massive collections of p-values by convex optimization.
Dobriban, Edgar
2018-06-01
Researchers in data-rich disciplines - think of computational genomics and observational cosmology - often wish to mine large bodies of p-values looking for significant effects, while controlling the false discovery rate or family-wise error rate. Increasingly, researchers also wish to prioritize certain hypotheses, for example, those thought to have larger effect sizes, by upweighting, and to impose constraints on the underlying mining, such as monotonicity along a certain sequence. We introduce Princessp, a principled method for performing weighted multiple testing by constrained convex optimization. Our method elegantly allows one to prioritize certain hypotheses through upweighting and to discount others through downweighting, while constraining the underlying weights involved in the mining process. When the p-values derive from monotone likelihood ratio families such as the Gaussian means model, the new method allows exact solution of an important optimal weighting problem previously thought to be non-convex and computationally infeasible. Our method scales to massive data set sizes. We illustrate the applications of Princessp on a series of standard genomics data sets and offer comparisons with several previous 'standard' methods. Princessp offers both ease of operation and the ability to scale to extremely large problem sizes. The method is available as open-source software from github.com/dobriban/pvalue_weighting_matlab (accessed 11 October 2017).
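The flavor of weighted multiple testing can be illustrated with the classical weighted Benjamini-Hochberg rule (apply BH to p_i/w_i with mean-one weights). This is a standard baseline in the spirit of the weighting that Princessp optimizes, not the Princessp code; the p-values and weights below are made up:

```python
import numpy as np

def weighted_bh(pvals, weights, alpha=0.05):
    """Weighted Benjamini-Hochberg: run BH on p_i / w_i, where the
    non-negative weights average to 1 (so the FDR budget is preserved)."""
    p = np.asarray(pvals, dtype=float)
    w = np.asarray(weights, dtype=float)
    assert abs(w.mean() - 1.0) < 1e-9
    q = p / w                                  # upweighted hypotheses shrink
    order = np.argsort(q)
    m = p.size
    thresh = alpha * np.arange(1, m + 1) / m   # BH step-up thresholds
    below = q[order] <= thresh
    k = np.max(np.nonzero(below)[0]) + 1 if below.any() else 0
    rejected = np.zeros(m, dtype=bool)
    rejected[order[:k]] = True
    return rejected

pvals = [0.001, 0.009, 0.04, 0.20, 0.50]
flat = weighted_bh(pvals, np.ones(5))          # unweighted BH
w = np.array([0.5, 0.5, 3.0, 0.5, 0.5])        # prior belief in hypothesis 3
prio = weighted_bh(pvals, w)
print(flat, prio)
```

Here the upweighted third hypothesis (p = 0.04) is rejected only under the weighted rule, which is exactly the prioritization effect the abstract describes.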
Sexuality in persons with lower extremity amputations.
Bodenheimer, C; Kerrigan, A J; Garber, S L; Monga, T N
2000-06-15
There is a paucity of information regarding sexual functioning in persons with lower extremity amputations. The purpose of this study was to describe sexual and psychological functioning and health status in persons with lower extremity amputation. Self-report surveys assessed sexual functioning (Derogatis Inventory), depression (Beck Depression Inventory), anxiety (State-Trait Anxiety Inventory), and health status (Health Status Questionnaire) in a convenience sample of 30 men with lower extremity amputations. Mean age of the participants was 57 years (range 32-79). Mean duration since amputation was 23 months (range 3-634 months). Twenty-one subjects (70%) had trans-tibial and seven subjects (23%) had trans-femoral amputations. A majority of subjects were experiencing problems in several domains of sexual functioning. Fifty-three percent (n = 16) of the subjects engaged in sexual intercourse or oral sex at least once a month, and twenty-seven percent (n = 8) masturbated at least once a month. Nineteen subjects (63%) reported orgasmic problems and 67% were experiencing erectile difficulties. Despite these problems, interest in sex was high in over 90% of the subjects. There was no evidence of increased prevalence of depression or anxiety in these subjects when compared to other outpatient adult populations. Sexual problems were common in the subjects studied; despite these problems, interest in sex remained high. Few investigations have been directed toward identifying the psychological and social factors that may contribute to these problems, and more research with a larger population is needed in this area.
UCODE, a computer code for universal inverse modeling
Poeter, E.P.; Hill, M.C.
1999-01-01
This article presents the US Geological Survey computer program UCODE, which was developed in collaboration with the US Army Corps of Engineers Waterways Experiment Station and the International Ground Water Modeling Center of the Colorado School of Mines. UCODE performs inverse modeling, posed as a parameter-estimation problem, using nonlinear regression. Any application model or set of models can be used; the only requirement is that they have numerical (ASCII or text only) input and output files and that the numbers in these files have sufficient significant digits. Application models can include preprocessors and postprocessors as well as models related to the processes of interest (physical, chemical and so on), making UCODE extremely powerful for model calibration. Estimated parameters can be defined flexibly with user-specified functions. Observations to be matched in the regression can be any quantity for which a simulated equivalent value can be produced, thus simulated equivalent values are calculated using values that appear in the application model output files and can be manipulated with additive and multiplicative functions, if necessary. Prior, or direct, information on estimated parameters also can be included in the regression. The nonlinear regression problem is solved by minimizing a weighted least-squares objective function with respect to the parameter values using a modified Gauss-Newton method. Sensitivities needed for the method are calculated approximately by forward or central differences and problems and solutions related to this approximation are discussed. Statistics are calculated and printed for use in (1) diagnosing inadequate data or identifying parameters that probably cannot be estimated with the available data, (2) evaluating estimated parameter values, (3) evaluating the model representation of the actual processes and (4) quantifying the uncertainty of model simulated values. 
UCODE is intended for use on any computer operating system: it consists of algorithms programmed in Perl, a free language designed for text manipulation, and in Fortran 90, which efficiently performs numerical calculations.
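The regression machinery described above (a modified Gauss-Newton method with sensitivities approximated by finite differences) can be sketched generically; this toy fit is an illustration of the method, not UCODE code:

```python
import numpy as np

def gauss_newton(model, p0, x, y_obs, w=None, iters=25, h=1e-6):
    """Weighted least squares by a damped ("modified") Gauss-Newton method,
    with sensitivities approximated by forward differences."""
    p = np.array(p0, dtype=float)
    w = np.ones_like(y_obs) if w is None else np.asarray(w, dtype=float)
    sse = lambda q: float(np.sum(w * (y_obs - model(x, q)) ** 2))
    for _ in range(iters):
        r = y_obs - model(x, p)
        J = np.empty((x.size, p.size))
        for j in range(p.size):          # forward-difference sensitivities
            pj = p.copy()
            pj[j] += h
            J[:, j] = (model(x, pj) - model(x, p)) / h
        # Normal equations of the weighted linearized problem.
        JTW = J.T * w
        step = np.linalg.solve(JTW @ J, JTW @ r)
        lam = 1.0                        # damping: halve until SSE decreases
        while sse(p + lam * step) > sse(p) and lam > 1e-8:
            lam *= 0.5
        p = p + lam * step
        if np.linalg.norm(lam * step) < 1e-12:
            break
    return p

# Toy "application model" with known true parameters a = 3, b = 0.7.
model = lambda x, p: p[0] * np.exp(p[1] * x)
x = np.linspace(0.0, 2.0, 25)
y = model(x, np.array([3.0, 0.7]))
p_hat = gauss_newton(model, [1.0, 0.1], x, y)
print(p_hat)
```

As in UCODE, the model is treated as a black box: only forward evaluations are required, so any simulator with text input and output could stand in for `model`.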
A comparative assessment of statistical methods for extreme weather analysis
NASA Astrophysics Data System (ADS)
Schlögl, Matthias; Laaha, Gregor
2017-04-01
Extreme weather exposure assessment is of major importance for scientists and practitioners alike. We compare different extreme value approaches and fitting methods with respect to their value for assessing extreme precipitation and temperature impacts. Based on an Austrian data set from 25 meteorological stations representing diverse meteorological conditions, we assess the added value of partial duration series over the standardly used annual maxima series in order to give recommendations for performing extreme value statistics of meteorological hazards. Results show the merits of the robust L-moment estimation, which yielded better results than maximum likelihood estimation in 62 % of all cases. At the same time, results question the general assumption of the threshold excess approach (employing partial duration series, PDS) being superior to the block maxima approach (employing annual maxima series, AMS) due to information gain. For low return periods (non-extreme events) the PDS approach tends to overestimate return levels as compared to the AMS approach, whereas an opposite behavior was found for high return levels (extreme events). In extreme cases, an inappropriate threshold was shown to lead to considerable biases that may outperform the possible gain of information from including additional extreme events by far. This effect was neither visible from the square-root criterion, nor from standardly used graphical diagnosis (mean residual life plot), but from a direct comparison of AMS and PDS in synoptic quantile plots. We therefore recommend performing AMS and PDS approaches simultaneously in order to select the best suited approach. This will make the analyses more robust, in cases where threshold selection and dependency introduces biases to the PDS approach, but also in cases where the AMS contains non-extreme events that may introduce similar biases. 
For assessing the performance of extreme events we recommend conditional performance measures that focus on rare events only, in addition to the standardly used unconditional indicators. The findings of this study are of relevance for a broad range of environmental variables, including meteorological and hydrological quantities.
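The AMS/GEV versus PDS/GPD comparison discussed above can be sketched in a few lines. This is a hedged illustration, not the study's code: the daily series, the 99th-percentile threshold, and all parameter values are synthetic assumptions.

```python
# Sketch: return levels from an annual maxima series (AMS, GEV fit) versus a
# partial duration series (PDS, GPD fit over a threshold). Synthetic data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
daily = rng.gumbel(loc=20.0, scale=8.0, size=40 * 365)  # 40 "years" of daily precip

# --- AMS / block maxima approach: one maximum per year, GEV fit ---
ams = daily.reshape(40, 365).max(axis=1)
c, loc, scale = stats.genextreme.fit(ams)
def gev_return_level(T):
    return stats.genextreme.ppf(1.0 - 1.0 / T, c, loc=loc, scale=scale)

# --- PDS / peaks-over-threshold approach: exceedances over u, GPD fit ---
u = np.quantile(daily, 0.99)
exc = daily[daily > u] - u
lam = exc.size / 40.0                      # mean number of exceedances per year
cp, locp, scalep = stats.genpareto.fit(exc, floc=0.0)
def gpd_return_level(T):
    # level exceeded on average once every T years
    return u + stats.genpareto.ppf(1.0 - 1.0 / (lam * T), cp, loc=0.0, scale=scalep)

for T in (2, 10, 100):
    print(T, round(gev_return_level(T), 1), round(gpd_return_level(T), 1))
```

Comparing the two columns of return levels, as in the synoptic quantile plots the authors recommend, makes threshold-induced divergence between the approaches directly visible.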
Coastal-storm Inundation and Sea-level Rise in New Zealand
NASA Astrophysics Data System (ADS)
Stephens, S. A.; Bell, R.
2016-12-01
Coastal-storm inundation is a growing problem in New Zealand. It happens occasionally, when the combined forces of weather and sea align, inundating low-elevation land, eroding the coast, and causing rivers and stormwater systems to back up and flood inland areas. This becomes a risk where we have placed buildings and infrastructure too close to the coast. Coastal-storm inundation is not a new problem; it has happened historically, but it is becoming more frequent as the sea level continues to rise. From analyses of historic extreme sea-level events, we show how the different sea-level components, such as tide and storm surge, contribute to extreme sea levels and how these components vary around New Zealand. Recent sea-level analyses reveal some large storm surges, bigger than previously reported; we show the type of weather patterns that drive them and how this leads to differences in storm surge potential between the east and west coasts. Although large and damaging storm-tides have occurred historically, we show that there is potential for considerably larger elevations to be reached in the "perfect storm", and we estimate the likelihood of such extreme events occurring. Sea-level rise (SLR) will greatly increase the frequency, depth and consequences of coastal-storm inundation in the future. We show an application of a new method to determine the increasing frequency of extreme sea levels with SLR, one which integrates the extreme tail with regularly occurring high tides. We present spatial maps of several extreme sea-level threshold exceedance statistics for a case study at Mission Bay, Auckland, New Zealand. The maps show how the local community is likely to face decision points at various SLR thresholds, and we conclude that coastal hazard assessments should ideally use several SLR scenarios and time windows within the next 100 years or more to support decision-making for future coastal adaptation and for determining when response options will be needed.
In tandem, coastal hazard assessments should also provide information on SLR values linked to expected inundation frequency or depth. This can be linked to plausible timeframes for SLR thresholds to determine when critical decision points for adaptation might be reached, and we show how this might be achieved.
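The link between SLR and exceedance frequency described above can be illustrated with a toy calculation. This is not the authors' method: it assumes a Gumbel upper tail with invented parameters, under which raising mean sea level by `slr` multiplies the exceedance frequency of a fixed threshold by roughly exp(slr/scale).

```python
# Hedged illustration: how a fixed sea-level threshold is exceeded more often
# as mean sea level rises, assuming a Gumbel tail for annual storm-tide maxima.
import math

loc, scale = 2.0, 0.15          # assumed Gumbel parameters (m) for annual maxima

def annual_exceedance_prob(z, slr=0.0):
    """P(annual maximum storm-tide, shifted up by slr, exceeds level z)."""
    return 1.0 - math.exp(-math.exp(-(z - (loc + slr)) / scale))

# Present-day 100-year level (annual exceedance probability 1/100)
z100 = loc - scale * math.log(-math.log(1.0 - 1.0 / 100.0))
for slr in (0.0, 0.1, 0.2, 0.3):
    print(f"SLR {slr:.1f} m -> P(exceed z100) = {annual_exceedance_prob(z100, slr):.3f}")
```

Even a few decimetres of SLR raise the exceedance probability of today's 100-year level several-fold, which is the qualitative effect the mapped threshold-exceedance statistics quantify.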
ERIC Educational Resources Information Center
Freeze, Rick; Cook, Paula
2005-01-01
The purpose of this study was to assess the efficacy and practicality of precision reading, a constructive reading intervention, with students with cognitive impairments, extreme academic deficits in reading, and severe social, emotional, and psychiatric problems. As precision reading had shown promise with students with low achievement, learning…
ERIC Educational Resources Information Center
Schmidt, Louis A.; Miskovic, Vladimir; Boyle, Michael; Saigal, Saroj
2010-01-01
The authors examined internalizing behavior problems at middle childhood, adolescence, and young adulthood and brain-based measures of stress vulnerability in 154 right-handed, nonimpaired young adults (M age = 23 years): 71 (30 males, 41 females) born at extremely low birth weight (ELBW; less than 1,000 g) and 83 (35 males, 48 females) controls…
Generalized extreme gust wind speeds distributions
Cheng, E.; Yeung, C.
2002-01-01
Since the summer of 1996, US wind engineers have used the extreme gust (or 3-s gust) as the basic wind speed to quantify the destruction caused by extreme winds. In order to better understand these destructive wind forces, it is important to know the appropriate representations of these extreme gust wind speeds. The purpose of this study is therefore to determine the most suitable extreme value distributions for the annual extreme gust wind speeds recorded in large selected areas. To achieve this objective, we used the generalized Pareto distribution as the diagnostic tool for determining the types of extreme gust wind speed distributions. The three-parameter generalized extreme value distribution function is thus reduced to either the Type I Gumbel, Type II Frechet or Type III reverse Weibull distribution function for the annual extreme gust wind speeds recorded at a specific site. In consideration of the quality and homogeneity of gust wind data collected at more than 750 weather stations throughout the United States, annual extreme gust wind speeds at 143 selected stations in the contiguous United States were used in the study.
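The site classification described above reduces to the sign of the fitted GEV shape parameter. A minimal sketch, on synthetic gust data with assumed parameters (not the study's station records):

```python
# Sketch: fit a three-parameter GEV to annual extreme gusts and classify the
# site as Type I/II/III by the sign of the shape parameter. Synthetic data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
annual_gusts = rng.gumbel(loc=35.0, scale=5.0, size=60)   # m/s, 60 years (assumed)

c, loc, scale = stats.genextreme.fit(annual_gusts)
xi = -c  # scipy's shape `c` is the negative of the conventional GEV shape xi

if abs(xi) < 0.05:
    family = "Type I (Gumbel)"
elif xi > 0:
    family = "Type II (Frechet)"
else:
    family = "Type III (reverse Weibull)"
print(family, round(xi, 3))
```

Note the sign convention: scipy's `genextreme` parameter `c` equals minus the textbook shape xi, an easy source of misclassification.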
Extremal problems for topological indices in combinatorial chemistry.
Tichy, Robert F; Wagner, Stephan
2005-09-01
Topological indices of molecular graphs are related to several physicochemical characteristics; recently, the inverse problem for some of these indices has been studied, and it has applications in the design of combinatorial libraries for drug discovery. It is thus natural also to study extremal problems for these indices, i.e., finding graphs having minimal or maximal index. In this paper, these questions are discussed for three different indices, namely the sigma-index, the c-index and the Z-index, with emphasis on the sigma-index.
Shumway, Sterling T; Wampler, Richard S; Dersch, Charette; Arredondo, Rudy
2004-01-01
Marriage and family services have not been widely recognized as part of employee assistance programs (EAPs), although family and relational problems are widely cited as sources of problems on the job. EAP clients (N = 800, 97% self-referred) indicated how much family, psychological/emotional, drug, alcohol, employment-related, legal, and medical problems troubled them and the need for services in each area. The psychological/emotional (66%) and family (65%) problem areas were frequently rated "considerable" or "extreme." Both areas were rated "considerable" or "extreme" by 48.6% of participants. In view of the evidence that marriage and family services can be effective with both family and psychological/emotional problems, professionals who are competent to provide such services have much to offer EAPs.
Examining global extreme sea level variations on the coast from in-situ and remote observations
NASA Astrophysics Data System (ADS)
Menendez, Melisa; Benkler, Anna S.
2017-04-01
The estimation of extreme water level values on the coast is a requirement for a wide range of engineering and coastal management applications. In addition, climate variations of extreme sea levels in the coastal area result from a complex interaction of oceanic, atmospheric and terrestrial processes across a wide range of spatial and temporal scales. In this study, variations of extreme sea level return values are investigated from two available sources of information: in-situ tide-gauge records and satellite altimetry data. Long time series of sea level from tide-gauge records are the most valuable observations, since they directly measure water level at a specific coastal location. However, they suffer from a number of sources of inhomogeneity that may affect the climate description of extremes. Among others, the presence of gaps, historical inhomogeneities and jumps in the mean sea level signal are factors that can introduce uncertainty into the characterization of extreme sea level behaviour. Moreover, long records from tide-gauges are sparse, and there are many coastal areas worldwide without available in-situ information. On the other hand, with the accumulating altimeter records of several satellite missions from the 1990s, approaching 25 recorded years at the time of writing, the analysis of extreme sea level events from this data source is becoming possible. Aside from the well-known issue of altimeter measurements very close to the coast (mainly due to corruption by land, wet troposphere path delay errors and local tide effects in the coastal area), there are other aspects that have to be considered when sea surface height values estimated from satellite are to be used in a statistical extreme model, such as the use of a multi-mission product to obtain long observed periods and the selection of the maxima sample, since altimeter observations do not provide values that are uniform in time and space.
Here, we have compared the extreme values of 'still water level' and 'non-tidal-residual' of in-situ records from the GESLA2 dataset (Woodworth et al. 2016) against the novel coastal altimetry datasets (Cipollini et al. 2016). Seasonal patterns, inter-annual variability and long-term trends are analyzed. Then, a time-dependent extreme model (Menendez et al. 2009) is applied to characterize extreme sea level return values and their variability on the coastal area around the world.
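A time-dependent extreme model of the kind cited above (Menendez et al. 2009) can be sketched in its simplest form: a Gumbel distribution for annual maxima whose location drifts linearly in time, fitted by maximum likelihood. The data, the linear trend, and the Gumbel restriction (zero shape) are all assumptions of this sketch, not the paper's full model.

```python
# Hedged sketch of a non-stationary (time-dependent) extreme value model:
# Gumbel annual maxima with location mu(t) = mu0 + mu1 * t, fitted by MLE.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
t = np.arange(200.0)
x = rng.gumbel(loc=10.0 + 0.05 * t, scale=2.0)   # synthetic maxima with a trend

def neg_log_lik(params):
    mu0, mu1, log_sigma = params
    sigma = np.exp(log_sigma)
    z = (x - (mu0 + mu1 * t)) / sigma
    return np.sum(np.log(sigma) + z + np.exp(-z))  # Gumbel negative log-likelihood

slope, intercept = np.polyfit(t, x, 1)           # crude starting values
start = np.array([intercept, slope, np.log(x.std())])
res = minimize(neg_log_lik, start, method="Nelder-Mead",
               options={"maxiter": 4000, "xatol": 1e-6, "fatol": 1e-6})
mu0_hat, mu1_hat, sigma_hat = res.x[0], res.x[1], np.exp(res.x[2])
print(round(mu1_hat, 3))   # should be near the true trend of 0.05
```

The fitted `mu1_hat` is the long-term trend in the extremes themselves, which is exactly the quantity the seasonal/inter-annual analysis above targets.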
NASA Astrophysics Data System (ADS)
El-Shobokshy, Mohammad S.; Al-Saedi, Yaseen G.
This paper investigates some of the air pollution problems created as a result of the Gulf war in early 1991. Temporary periods of increased dust storm activity were observed in Saudi Arabia, presumably due to disturbance of the desert surface by the extremely large number of tanks and other war machines before and during the war. The concentrations of inhalable dust particles (<15 μm) increased during the months just after the war to about 1.5 times their values during the same months of the previous year, 1990. The total horizontal solar energy flux in Riyadh was significantly reduced during dry days with no clouds. This is attributed to the presence of soot particles, generated at an extremely high rate by the burning oil fields in Kuwait. The direct normal solar insolation was also measured at the photovoltaic solar power plant in Riyadh during these days, and significant reductions were observed due to the effective absorption of solar radiation by soot particles. The power generated by the plant during days with a polluted atmosphere was reduced by about 50-80% relative to the value expected for such days had the atmosphere been dry and clear.
This is a presentation titled "Estimating the Effect of Climate Change on Crop Yields and Farmland Values: The Importance of Extreme Temperatures" that was given for the National Center for Environmental Economics.
NASA Astrophysics Data System (ADS)
Qi, Wei
2017-11-01
Cost-benefit analysis is commonly used for engineering planning and design problems in practice. However, previous cost-benefit based design flood estimation has relied on a stationarity assumption. This study develops a non-stationary cost-benefit based design flood estimation approach. The approach integrates a non-stationary probability distribution function into cost-benefit analysis, so that the influence of non-stationarity on the expected total cost (including flood damage and construction costs) and on design flood estimation can be quantified. To facilitate design flood selection, a 'Risk-Cost' analysis approach is developed, which reveals the nexus of extreme flood risk, expected total cost and design life periods. Two basins, with 54-year and 104-year flood records respectively, are used to illustrate the application. It is found that the developed approach can effectively reveal changes in expected total cost and extreme floods over different design life periods. In addition, trade-offs are found between extreme flood risk and expected total cost, which reflect the increase in cost required to mitigate risk. Compared with stationary approaches, which generate only one expected total cost curve and therefore only one design flood estimate, the proposed approach generates design flood estimation intervals, and the 'Risk-Cost' approach selects a design flood value from these intervals based on the trade-offs between extreme flood risk and expected total cost. This study provides a new approach towards a better understanding of the influence of non-stationarity on expected total cost and design floods, and could be beneficial to cost-benefit based non-stationary design flood estimation across the world.
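The core trade-off above can be made concrete with a toy calculation. Everything here is hypothetical: the Gumbel flood distribution with a drifting location stands in for the paper's non-stationary model, and the damage and construction cost numbers are invented.

```python
# Minimal sketch: choose the design flood level q that minimizes expected
# total cost = construction cost + expected flood damage over the design life,
# under a non-stationary (drifting-location) Gumbel flood distribution.
import math

def exceed_prob(q, year, loc0=1000.0, trend=5.0, scale=200.0):
    """Annual probability that peak flow exceeds design level q in a given year."""
    loc = loc0 + trend * year                      # non-stationary location
    return 1.0 - math.exp(-math.exp(-(q - loc) / scale))

def expected_total_cost(q, life=50, damage=5e6, unit_cost=1e3):
    construction = unit_cost * q                   # cost grows with design level
    risk = sum(damage * exceed_prob(q, y) for y in range(life))
    return construction + risk

candidates = range(1500, 4001, 100)
best_q = min(candidates, key=expected_total_cost)
print(best_q, round(expected_total_cost(best_q)))
```

Re-running the search with `trend=0.0` reproduces the stationary baseline; the gap between the two optima is the non-stationarity effect the study quantifies.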
Analytical approximation for the Einstein-dilaton-Gauss-Bonnet black hole metric
NASA Astrophysics Data System (ADS)
Kokkotas, K. D.; Konoplya, R. A.; Zhidenko, A.
2017-09-01
We construct an analytical approximation for the numerical black hole metric of P. Kanti et al. [Phys. Rev. D 54, 5049 (1996), 10.1103/PhysRevD.54.5049] in the four-dimensional Einstein-dilaton-Gauss-Bonnet (EdGB) theory. The continued fraction expansion in terms of a compactified radial coordinate, used here, converges slowly when the dilaton coupling approaches its extremal values, but for a black hole far from the extremal state, the analytical formula has a maximal relative error of a fraction of one percent already within the third order of the continued fraction expansion. The suggested analytical representation of the numerical black hole metric is relatively compact and a good approximation in the whole space outside the black hole event horizon. Therefore, it can serve in the same way as an exact solution when analyzing particles' motion, perturbations, quasinormal modes, Hawking radiation, accreting disks, and many other problems in the vicinity of a black hole. In addition, we construct the approximate analytical expression for the dilaton field.
Hierarchy and extremes in selections from pools of randomized proteins
Boyer, Sébastien; Biswas, Dipanwita; Kumar Soshee, Ananda; Scaramozzino, Natale; Nizak, Clément; Rivoire, Olivier
2016-01-01
Variation and selection are the core principles of Darwinian evolution, but quantitatively relating the diversity of a population to its capacity to respond to selection is challenging. Here, we examine this problem at a molecular level in the context of populations of partially randomized proteins selected for binding to well-defined targets. We built several minimal protein libraries, screened them in vitro by phage display, and analyzed their response to selection by high-throughput sequencing. A statistical analysis of the results reveals two main findings. First, libraries with the same sequence diversity but built around different “frameworks” typically have vastly different responses; second, the distribution of responses of the best binders in a library follows a simple scaling law. We show how an elementary probabilistic model based on extreme value theory rationalizes the latter finding. Our results have implications for designing synthetic protein libraries, estimating the density of functional biomolecules in sequence space, characterizing diversity in natural populations, and experimentally investigating evolvability (i.e., the potential for future evolution). PMID:26969726
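The extreme value reasoning invoked above can be demonstrated on synthetic "libraries". This is an illustrative sketch, not the paper's analysis: library sizes and the exponential response model are arbitrary assumptions.

```python
# Sketch: the best binder in a library is the maximum of many i.i.d. draws;
# extreme value theory predicts its distribution. For light-tailed responses
# (exponential here) the limit is Gumbel, centred near log(library size).
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n_library, n_repeats = 10_000, 2_000
# "response" of each sequence, drawn i.i.d.; the best binder is the maximum
best = rng.exponential(scale=1.0, size=(n_repeats, n_library)).max(axis=1)

c, loc, scale = stats.genextreme.fit(best)
print(round(loc, 2), round(np.log(n_library), 2), round(-c, 2))
```

The fitted location sits near log(10000) ≈ 9.2 with a near-zero shape, the Gumbel signature; heavy-tailed response distributions would instead give a positive shape (Fréchet), which is the kind of scaling behaviour the paper's model rationalizes.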
Extreme Statistics of Storm Surges in the Baltic Sea
NASA Astrophysics Data System (ADS)
Kulikov, E. A.; Medvedev, I. P.
2017-11-01
A statistical analysis of the extreme values of the Baltic Sea level has been performed for series of observations spanning 15-125 years at 13 tide gauge stations. It is shown that the empirical relation between the value of extreme sea level rises or ebbs (caused by storm events) and its return period in the Baltic Sea can be well approximated by the Gumbel probability distribution. The maximum values of extreme floods/ebbs of 100-year recurrence were observed in the Gulf of Finland and the Gulf of Riga. The two longest data series, observed in Stockholm and Vyborg over 125 years, show a significant deviation from the Gumbel distribution for the rarest events. Statistical analysis of the hourly sea level data series reveals some asymmetry in the variability of the Baltic Sea level: the probability of rises proved higher than that of ebbs. As for the magnitude of the 100-year recurrence surge, it considerably exceeds the magnitude of ebbs almost everywhere. This asymmetry can be attributed to the influence of low atmospheric pressure during storms. A statistical study of extreme values has also been applied to sea level series for Narva over the period 1994-2000, simulated with the ROMS numerical model. Comparison of the "simulated" and "observed" extreme sea level distributions shows that the model reproduces extreme floods of "moderate" magnitude quite satisfactorily; however, it underestimates sea level changes for the most powerful storm surges.
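The Gumbel fit underlying the return-period estimates above is straightforward to sketch. The "annual maximum sea level" data below are synthetic with assumed parameters; real station records are obviously not reproduced here.

```python
# Minimal sketch of a Gumbel analysis of annual maximum sea levels:
# fit the two parameters, then read off the T-year return level.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
annual_max = rng.gumbel(loc=80.0, scale=25.0, size=100)   # cm above mean level

loc, scale = stats.gumbel_r.fit(annual_max)
def return_level(T):
    # level with an annual exceedance probability of 1/T
    return loc - scale * np.log(-np.log(1.0 - 1.0 / T))

print(round(return_level(100), 1))
```

The paper's observation that the longest records (Stockholm, Vyborg) deviate from Gumbel for the rarest events is precisely a warning about extrapolating this closed-form return level too far beyond the data.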
NASA Astrophysics Data System (ADS)
Rieder, Harald E.; Jancso, Leonhardt M.; Staehelin, Johannes; Maeder, Jörg A.; Ribatet, Mathieu; Peter, Thomas; Davison, Anthony C.
2010-05-01
In this study we analyze the frequency distribution of extreme events in low and high total ozone (termed ELOs and EHOs) for 5 long-term stations in the northern mid-latitudes in Europe (Belsk, Poland; Hradec Kralove, Czech Republic; Hohenpeissenberg and Potsdam, Germany; and Uccle, Belgium). Further, the influence of these extreme events on annual and seasonal mean values and trends is analyzed. The applied method follows the new "ozone extremes concept", based on tools from extreme value theory [Coles, 2001; Ribatet, 2007] and recently developed by Rieder et al. [2010a, b]. Mathematically, the decisive feature within this extremes concept is the Generalized Pareto Distribution (GPD). In this analysis the long-term trends needed to be removed first, unlike in the treatment of Rieder et al. [2010a, b], in which the time series of Arosa was analyzed, covering many decades of measurements in the anthropogenically undisturbed stratosphere. In contrast to previous studies focusing only on so-called ozone mini-holes and mini-highs, the "ozone extremes concept" provides a statistical description of the tails of the total ozone distribution (i.e. extreme low and high values). It is shown that this concept is not only an appropriate method to describe the frequency and distribution of extreme events; it also provides new information on time series properties and internal variability. Furthermore, it allows the detection of fingerprints of physical (e.g. El Niño, NAO) and chemical (e.g. polar vortex ozone loss) features in the Earth's atmosphere, as well as of major volcanic eruptions (e.g. El Chichón, Mt. Pinatubo). It is shown that mean values and trends in total ozone are strongly influenced by extreme events. Trend calculations (for the period 1970-1990) are performed for the entire as well as the extremes-removed time series.
The results after excluding extremes show that annual trends are reduced most at Hradec Kralove (by about a factor of 3), followed by Potsdam (a factor of 2.5), and Hohenpeissenberg and Belsk (both by about a factor of 2). In general, the reduction of the trend is strongest during winter and spring. At all stations the influence of ELOs on observed trends is larger than that of EHOs. Especially from the 1990s on, ELOs dominate the picture, as only a relatively small fraction of EHOs can be observed in the records (due to the strong influence of the Mt. Pinatubo eruption and polar vortex ozone loss contributions). Additionally, it is shown that the number of observed mini-holes can be estimated with high accuracy by the GPD model. Overall, the results of this work show that extreme events play a major role in total ozone, and the "ozone extremes concept" provides deeper insight into the influence of chemical and physical features on column ozone. References: Coles, S.: An Introduction to Statistical Modeling of Extreme Values, Springer Series in Statistics, ISBN 1852334592, Springer, Berlin, 2001. Ribatet, M.: POT: Modelling peaks over a threshold, R News, 7, 34-36, 2007. Rieder, H.E., Staehelin, J., Maeder, J.A., Ribatet, M., Stübi, R., Weihs, P., Holawe, F., Peter, T., and Davison, A.D. (2010): Extreme events in total ozone over Arosa - Part I: Application of extreme value theory, to be submitted to ACPD. Rieder, H.E., Staehelin, J., Maeder, J.A., Ribatet, M., Stübi, R., Weihs, P., Holawe, F., Peter, T., and Davison, A.D. (2010): Extreme events in total ozone over Arosa - Part II: Fingerprints of atmospheric dynamics and chemistry and effects on mean values and long-term changes, to be submitted to ACPD.
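The GPD machinery behind ELOs and EHOs can be sketched on a detrended series. This is a hedged illustration with synthetic Gaussian "ozone" data and assumed 5%/95% thresholds, not the stations' records or the study's exact thresholds.

```python
# Sketch of the "ozone extremes concept" machinery: fit generalized Pareto
# distributions to exceedances above a high threshold (EHOs) and to deficits
# below a low threshold (ELOs) of a detrended total ozone series.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
ozone = rng.normal(330.0, 25.0, size=5000)      # detrended daily column ozone (DU)

# Extreme highs (EHOs): peaks over the 95th percentile
u_hi = np.quantile(ozone, 0.95)
exc_hi = ozone[ozone > u_hi] - u_hi
c_hi, _, scale_hi = stats.genpareto.fit(exc_hi, floc=0.0)

# Extreme lows (ELOs): deficits below the 5th percentile, treated symmetrically
u_lo = np.quantile(ozone, 0.05)
exc_lo = u_lo - ozone[ozone < u_lo]
c_lo, _, scale_lo = stats.genpareto.fit(exc_lo, floc=0.0)

# For Gaussian-like data the fitted shapes are typically slightly negative
print(round(c_hi, 2), round(c_lo, 2))
```

Counting exceedances of the fitted GPD tail is how the number of mini-holes can be estimated, as the abstract notes.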
Doing Solar Science With Extreme-ultraviolet and X-ray High Resolution Imaging Spectroscopy
NASA Astrophysics Data System (ADS)
Doschek, G. A.
2005-12-01
In this talk I will demonstrate how high resolution extreme-ultraviolet (EUV) and/or X-ray imaging spectroscopy can be used to provide unique information for solving several current key problems of the solar atmosphere, e.g., the morphology and reconnection site of solar flares, the structure of the transition region, and coronal heating. I will describe the spectra that already exist relevant to these problems and what the shortcomings of the data are, and how an instrument such as the Extreme-ultraviolet Imaging Spectrometer (EIS) on Solar-B as well as other proposed spectroscopy missions such as NEXUS and RAM will improve on the existing observations. I will discuss a few particularly interesting properties of the spectra and atomic data for highly ionized atoms that are important for the science problems.
NASA Astrophysics Data System (ADS)
Zhao, Lili; Yin, Jianping; Yuan, Lihuan; Liu, Qiang; Li, Kuan; Qiu, Minghui
2017-07-01
Automatic detection of abnormal cells from cervical smear images is in great demand for the annual diagnosis of women's cervical cancer. For this medical cell recognition problem, there are three different feature sections, namely cytology morphology, nuclear chromatin pathology and region intensity. The challenges of this problem come from combining the features and performing classification both accurately and efficiently. We therefore propose an efficient abnormal cervical cell detection system based on a multi-instance extreme learning machine (MI-ELM) to address these two questions in one unified framework. MI-ELM is one of the most promising supervised learning classifiers, able to deal with several feature sections and realistic classification problems analytically. Experimental results on the Herlev dataset demonstrate that the proposed method outperforms three traditional methods for two-class classification in terms of higher accuracy and lower processing time.
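The classifier family the paper builds on can be sketched in its basic form. This is the plain single-instance ELM, not the multi-instance MI-ELM variant, and the toy two-blob data is an assumption.

```python
# Minimal extreme learning machine (ELM) sketch: a random, untrained hidden
# layer followed by output weights solved analytically by least squares.
import numpy as np

rng = np.random.default_rng(0)

# Toy two-class data: two well-separated Gaussian blobs in 5 dimensions
X = np.vstack([rng.normal(-2.0, 1.0, (200, 5)), rng.normal(2.0, 1.0, (200, 5))])
y = np.hstack([np.zeros(200), np.ones(200)])

# 1) Random hidden layer (weights are never trained)
n_hidden = 50
W = rng.normal(size=(5, n_hidden))
b = rng.normal(size=n_hidden)
H = np.tanh(X @ W + b)

# 2) Output weights from a single least-squares solve (the "analytic" step)
beta, *_ = np.linalg.lstsq(H, y, rcond=None)

pred = (H @ beta > 0.5).astype(float)
accuracy = (pred == y).mean()
print(accuracy)
```

The single linear solve in step 2 is what makes ELM training fast, which is the "less time" advantage the abstract reports.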
NASA Astrophysics Data System (ADS)
Wintoft, Peter; Viljanen, Ari; Wik, Magnus
2016-05-01
High-frequency ( ≈ minutes) variability of ground magnetic fields is caused by ionospheric and magnetospheric processes driven by the changing solar wind. The varying magnetic fields induce electric fields that cause currents to flow in man-made conductors such as power grids and pipelines. Under extreme conditions the geomagnetically induced currents (GIC) may be harmful to power grids. Increasing our understanding of extreme events is thus important for solar-terrestrial science and space weather. In this work, 1-min resolution time series of the time derivative of measured local magnetic fields (|dBh/dt|) and of computed electric fields (Eh), for locations in Europe, have been analysed with extreme value analysis (EVA). The EVA yields an estimate of the generalized extreme value probability distribution, which is described by three parameters: location, width, and shape. The shape parameter controls the extreme behaviour. The stations cover geomagnetic latitudes from 40 to 70° N, and all stations included in the study have contiguous coverage of 18 years or more with 1-min resolution data. As expected, the EVA shows that the higher-latitude stations have a higher probability of large |dBh/dt| and |Eh| than stations further south. However, the EVA also shows that the shape of the distribution changes with magnetic latitude: the high-latitude distributions fall off to zero faster than the low-latitude ones, and upward-bounded distributions cannot be ruled out. The transition occurs around 59-61° N magnetic latitude. Thus, the EVA shows that the observed series north of ≈ 60° N have already recorded values that are close to the expected maximum values, while stations south of ≈ 60° N will measure larger values in the future.
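The "upward-bounded distribution" finding above follows from the sign of the GEV shape parameter. A hedged sketch on simulated block maxima (not the 1-min magnetometer records used in the work):

```python
# Sketch: fit a GEV to block maxima of |dB/dt| and, when the fitted shape
# implies a bounded tail (xi < 0), compute the implied upper endpoint.
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
# Simulated block maxima (nT/min) from a bounded-tail GEV (scipy c=0.2, xi=-0.2)
block_max = stats.genextreme.rvs(0.2, loc=300.0, scale=80.0, size=200,
                                 random_state=rng)

c, loc, scale = stats.genextreme.fit(block_max)
xi = -c
if xi < 0:
    upper_bound = loc - scale / xi    # finite upper endpoint of the fitted GEV
    print(round(xi, 2), round(upper_bound, 1))
else:
    print(round(xi, 2), "unbounded tail")
```

When `xi < 0` the fitted law has a hard maximum `loc - scale/xi`; observed maxima approaching that endpoint is exactly the situation the abstract describes for stations north of ≈ 60° N.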
Rud, I M; Melnikova, E A; Rassulova, M A; Razumov, A N; Gorelikov, A E
2017-12-28
The present article is an analytical review of the literature pertaining to the problem of rehabilitation of patients following endoprosthetic replacement of the joints of the lower extremities. The relevance of this problem for medical rehabilitation is beyond doubt, as the traditional methods for the rehabilitation of these patients do not always lead to the desired results. The authors discuss in detail the need for, and the contemporary approaches to, the rehabilitation of patients who have undergone reconstructive surgery and arthroplasty of the joints of the lower extremities. A pathogenetically based three-stage algorithm for medical rehabilitation is proposed.
NASA Astrophysics Data System (ADS)
da Silva Oliveira, C. I.; Martinez-Martinez, D.; Al-Rjoub, A.; Rebouta, L.; Menezes, R.; Cunha, L.
2018-04-01
In this paper, we present a statistical method for evaluating the degree of transparency of a thin film. To do so, the color coordinates are measured on different substrates and their standard deviation is evaluated. In the case of low values, the color depends on the film and not on the substrate, and intrinsic colors are obtained. In contrast, transparent films lead to high values of the standard deviation, since the color coordinates depend on the substrate. Between these two extremes, colored films with a certain degree of transparency can be found. This method allows an objective and simple evaluation of the transparency of any film, improving on subjective visual inspection and avoiding the thickness-related problems of optical spectroscopy evaluation. Zirconium oxynitride films deposited on three different substrates (Si, steel and glass) are used to test the validity of the method; the results have been validated with optical spectroscopy and agree with the visual impression of the samples.
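The metric described above is just a spread statistic across substrates. A tiny sketch with made-up CIELAB readings (the real measurements and the exact aggregation the authors use are not reproduced here):

```python
# Sketch: the same coating measured on several substrates; the spread of the
# color coordinates across substrates serves as a transparency indicator.
import numpy as np

# L*, a*, b* of one film on Si, steel, and glass (hypothetical numbers)
opaque_film = np.array([[62.0, 4.1, 12.3], [61.5, 4.3, 12.0], [62.2, 3.9, 12.5]])
transparent_film = np.array([[75.0, 1.2, 3.5], [48.0, 0.8, 7.9], [88.0, 2.5, 1.1]])

def transparency_index(measurements):
    """Mean per-coordinate standard deviation across substrates."""
    return measurements.std(axis=0).mean()

print(round(transparency_index(opaque_film), 2),
      round(transparency_index(transparent_film), 2))
```

A low index means the film imposes its intrinsic color regardless of substrate; a high index means the substrate shows through.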
Dynamic remapping decisions in multi-phase parallel computations
NASA Technical Reports Server (NTRS)
Nicol, D. M.; Reynolds, P. F., Jr.
1986-01-01
The effectiveness of any given mapping of workload to processors in a parallel system depends on the stochastic behavior of the workload. Program behavior is often characterized by a sequence of phases, with phase changes occurring unpredictably. During a phase, the behavior is fairly stable, but it may become quite different during the next phase. Thus a workload assignment generated for one phase may hinder performance during the next phase. We consider the problem of deciding whether to remap a parallel computation in the face of uncertainty about remapping's utility. Fundamentally, it is necessary to balance the expected performance gain from remapping against the delay cost of remapping. This paper treats the problem formally by constructing a probabilistic model of a computation with at most two phases. We use stochastic dynamic programming to show that the remapping decision policy minimizing the expected running time of the computation has an extremely simple structure: the optimal decision at any step is made by comparing the probability of remapping gain against a threshold. This theoretical result stresses the importance of detecting a phase change and of assessing the possibility of gain from remapping. We also empirically study the sensitivity of optimal performance to an imprecise decision threshold. Under a wide range of model parameter values, we find nearly optimal performance if remapping is chosen simply whenever the gain probability is high. These results strongly suggest that, except in extreme cases, the remapping decision problem is essentially that of dynamically determining whether gain can be achieved by remapping after a phase change; precise quantification of the decision model parameters is not necessary.
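The threshold policy summarized above can be illustrated with a toy two-phase model. All costs and probabilities below are invented, not the paper's model parameters.

```python
# Toy sketch of the threshold decision rule: at a suspected phase change,
# remap only when the estimated probability that remapping helps exceeds a
# fixed threshold. Monte Carlo estimates of expected phase time for each choice.
import random

random.seed(4)

def decide(p_gain, threshold=0.5):
    """Remap iff the estimated probability of gain exceeds the threshold."""
    return p_gain > threshold

def expected_time(p_gain, remap, trials=20000,
                  balanced=100.0, imbalanced=140.0, delay=5.0):
    """With prob p_gain the old mapping is imbalanced; remapping always pays
    the delay but restores balance."""
    total = 0.0
    for _ in range(trials):
        bad = random.random() < p_gain
        total += (delay + balanced) if remap else (imbalanced if bad else balanced)
    return total / trials

p = 0.8
choice = decide(p)
print(choice, round(expected_time(p, True), 1), round(expected_time(p, False), 1))
```

With a high gain probability, remapping wins despite the delay; with a low one, staying put wins, which is the simple structure the dynamic programming result establishes.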
Detection and identification of concealed weapons using matrix pencil
NASA Astrophysics Data System (ADS)
Adve, Raviraj S.; Thayaparan, Thayananthan
2011-06-01
The detection and identification of concealed weapons is an extremely hard problem due to the weak signature of the target buried within the much stronger signal from the human body. This paper furthers the automatic detection and identification of concealed weapons by proposing an effective approach to obtaining the resonant frequencies in a measurement. The technique, based on Matrix Pencil, a scheme for model-based parameter estimation, also provides amplitude information, hence giving a level of confidence in the results. Of specific interest is the fact that Matrix Pencil is based on a singular value decomposition, making the scheme robust against noise.
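A textbook single-pole sketch of the Matrix Pencil step mentioned above: the complex poles (resonant frequency and damping) are the generalized eigenvalues of a shifted pair of Hankel matrices, with an SVD truncation supplying the noise robustness. The test signal and pencil parameter are assumptions; the paper's multi-pole, noisy setting is not reproduced.

```python
# Matrix Pencil sketch: recover the pole of a damped complex exponential from
# the eigenvalues of the rank-truncated pencil (Y0, Y1).
import numpy as np

dt, n = 0.01, 200
t = np.arange(n) * dt
f_true, damp_true = 7.0, -0.8
x = np.exp((damp_true + 2j * np.pi * f_true) * t)    # one damped resonance

L = n // 3                                           # pencil parameter
Y = np.array([x[i:i + L + 1] for i in range(n - L)]) # Hankel-structured matrix
Y0, Y1 = Y[:, :-1], Y[:, 1:]

# SVD truncation to the estimated model order M, then the reduced M x M pencil
U, s, Vh = np.linalg.svd(Y0, full_matrices=False)
M = int(np.sum(s > s[0] * 1e-8))
A = np.diag(1.0 / s[:M]) @ U[:, :M].conj().T @ Y1 @ Vh[:M].conj().T
z = np.linalg.eigvals(A)                             # z_k = exp(s_k * dt)
poles = np.log(z) / dt
f_est = float(np.imag(poles[0]) / (2 * np.pi))
damp_est = float(np.real(poles[0]))
print(round(f_est, 3), round(damp_est, 3))
```

In the weapons application, the eigenvalue magnitudes carry the dampings, the angles the resonant frequencies, and the residue amplitudes (not computed here) supply the confidence information the paper exploits.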
Quantum noise and squeezing in optical parametric oscillator with arbitrary output coupling
NASA Technical Reports Server (NTRS)
Prasad, Sudhakar
1993-01-01
The redistribution of intrinsic quantum noise in the quadratures of the field generated in a sub-threshold degenerate optical parametric oscillator exhibits interesting dependences on the individual output mirror transmittances, when they are included exactly. We present a physical picture of this problem, based on mirror boundary conditions, which is valid for arbitrary transmittances. Hence, our picture applies uniformly to all values of the cavity Q factor representing, in the opposite extremes, both perfect oscillator and amplifier configurations. Beginning with a classical second-harmonic pump, we shall generalize our analysis to the finite amplitude and phase fluctuations of the pump.
Lecture Notes on Criticality Safety Validation Using MCNP & Whisper
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Forrest B.; Rising, Michael Evan; Alwin, Jennifer Louise
Training classes for nuclear criticality safety, MCNP documentation. The need for, and problems surrounding, validation of computer codes and data are considered first. Then some background for MCNP & Whisper is given: best practices for Monte Carlo criticality calculations, neutron spectra, S(α,β) thermal neutron scattering data, nuclear data sensitivities, covariance data, and correlation coefficients. Whisper is computational software designed to assist the nuclear criticality safety analyst with validation studies using the Monte Carlo radiation transport package MCNP. Whisper's methodology (benchmark selection: ck's and weights; extreme value theory: bias and bias uncertainty; MOS for nuclear data uncertainty: GLLS) and usage are discussed.
Regional estimation of extreme suspended sediment concentrations using watershed characteristics
NASA Astrophysics Data System (ADS)
Tramblay, Yves; Ouarda, Taha B. M. J.; St-Hilaire, André; Poulin, Jimmy
2010-01-01
The number of stations monitoring daily suspended sediment concentration (SSC) has been decreasing since the 1980s in North America, while suspended sediment is considered a key variable for water quality. The objective of this study is to test the feasibility of regionalising extreme SSC, i.e. estimating extreme SSC values for ungauged basins. Annual maximum SSC for 72 rivers in Canada and the USA were modelled with probability distributions in order to estimate quantiles corresponding to different return periods. Regionalisation techniques, originally developed for flood prediction in ungauged basins, were tested using the climatic, topographic, land cover and soil attributes of the watersheds. Two approaches were compared, using either physiographic characteristics or the seasonality of extreme SSC to delineate the regions. Multiple regression models to estimate SSC quantiles as a function of watershed characteristics were built in each region and compared to a global model including all sites. Regional estimates of SSC quantiles were compared with the local values. Results show that regional estimation of extreme SSC is more efficient than a global regression model including all sites. Groups/regions of stations were identified, using either the watershed characteristics or the seasonality of occurrence of extreme SSC values, providing a method to better describe extreme SSC events. The most important variables for predicting extreme SSC are the percentage of clay in the soils, precipitation intensity and forest cover.
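The regional regression step described above can be sketched as an ordinary least-squares fit of log-quantiles on watershed characteristics. This is a minimal illustration on synthetic data; the predictor names follow the abstract, but every number, coefficient, and the log-linear form are invented for the example, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 40  # hypothetical number of gauged basins

# Hypothetical watershed attributes: % clay, precipitation intensity, % forest
X = np.column_stack([
    rng.uniform(5, 40, n),    # clay content (%)
    rng.uniform(10, 60, n),   # precipitation intensity (mm/day)
    rng.uniform(0, 90, n),    # forest cover (%)
])

# Synthetic log of the 10-year SSC quantile with a known linear dependence
beta_true = np.array([0.03, 0.02, -0.01])
y = 4.0 + X @ beta_true + rng.normal(0, 0.1, n)

# Ordinary least squares: prepend an intercept column and solve
A = np.column_stack([np.ones(n), X])
beta_hat, *_ = np.linalg.lstsq(A, y, rcond=None)

# Predict the quantile for an "ungauged" basin from its attributes alone
x_new = np.array([1.0, 25.0, 30.0, 50.0])  # intercept, clay, precip, forest
q10 = np.exp(x_new @ beta_hat)             # back-transform to SSC units
```

With enough gauged basins, the regression recovers the attribute effects and transfers them to ungauged sites, which is the essence of the regionalisation tested in the study.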
Optimization Research on Ampacity of Underground High Voltage Cable Based on Interior Point Method
NASA Astrophysics Data System (ADS)
Huang, Feng; Li, Jing
2017-12-01
The conservative operation method, which takes a unified current-carrying capacity as the maximum load current, cannot make full use of the overall power transmission capacity of the cable and is not the optimal operating state for a cable cluster. In order to improve the transmission capacity of underground cables in a cluster, this paper takes the maximum overall load current as the objective function and the requirement that the temperature of every cable stay below the maximum permissible temperature as the constraint condition. The interior point method, which is very effective for nonlinear problems, is used to solve this extremum problem and determine the optimal operating current of each loop. The results show that the optimal solutions obtained with the proposed method are able to increase the total load current by about 5%, which greatly improves the economic performance of the cable cluster.
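The constrained maximization described above can be sketched with scipy's "trust-constr" solver, which uses a trust-region interior-point scheme for inequality constraints, in the spirit of the paper's method. The thermal model and all coefficients below are a toy stand-in, not a real cable-cluster computation.

```python
import numpy as np
from scipy.optimize import NonlinearConstraint, minimize

# Toy thermal model for three cable loops (all coefficients hypothetical):
# conductor temperature grows with I^2, with mutual heating between cables.
T_ambient, T_max = 25.0, 90.0
M = np.array([[60.0, 10.0, 5.0],   # degC per kA^2
              [10.0, 60.0, 10.0],
              [5.0, 10.0, 60.0]])

def temperatures(I):
    """Steady-state conductor temperatures for loop currents I (kA)."""
    return T_ambient + M @ (I ** 2)

# Maximize total load current subject to every cable staying below T_max
res = minimize(lambda I: -np.sum(I), x0=np.full(3, 0.5),
               method="trust-constr",
               constraints=[NonlinearConstraint(temperatures, -np.inf, T_max)],
               bounds=[(0.0, 2.0)] * 3)
I_opt = res.x
total_current = I_opt.sum()
```

At the optimum all three temperature constraints are active, and the unequal mutual-heating coefficients give the middle cable a slightly lower current than its neighbours, illustrating why a single unified ampacity is suboptimal for a cluster.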
On different types of uncertainties in the context of the precautionary principle.
Aven, Terje
2011-10-01
Few policies for risk management have created more controversy than the precautionary principle. A main problem is the extreme number of different definitions and interpretations. Almost all definitions of the precautionary principle identify "scientific uncertainties" as the trigger or criterion for its invocation; however, the meaning of this concept is not clear. For applying the precautionary principle it is not sufficient that the threats or hazards are uncertain. A stronger requirement is needed. This article provides an in-depth analysis of this issue. We question how the scientific uncertainties are linked to the interpretation of the probability concept, expected values, the results from probabilistic risk assessments, the common distinction between aleatory uncertainties and epistemic uncertainties, and the problem of establishing an accurate prediction model (cause-effect relationship). A new classification structure is suggested to define what scientific uncertainties mean. © 2011 Society for Risk Analysis.
Newsvendor problem under complete uncertainty: a case of innovative products.
Gaspars-Wieloch, Helena
2017-01-01
The paper presents a new scenario-based decision rule for the classical version of the newsvendor problem (NP) under complete uncertainty (i.e. uncertainty with unknown probabilities). So far, NP has been analyzed under uncertainty with known probabilities or under uncertainty with partial information (probabilities known incompletely). The novel approach is designed for the sale of new, innovative products, where it is quite complicated to define probabilities or even probability-like quantities, because there are no data available for forecasting the upcoming demand via statistical analysis. The new procedure described in the contribution is based on a hybrid of the Hurwicz and Bayes decision rules. It takes into account the decision maker's attitude towards risk (measured by coefficients of optimism and pessimism) and the dispersion (asymmetry, range, frequency of extreme values) of payoffs connected with particular order quantities. It does not require any information about the probability distribution.
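A scenario-based rule in the spirit of a Hurwicz-Bayes hybrid can be sketched as follows. The payoff matrix, the coefficient of optimism, and the simple averaging of the two criteria are all illustrative choices for the example, not the paper's exact procedure.

```python
import numpy as np

# Payoffs for each order quantity (rows) under each demand scenario (columns).
# price = 10, cost = 6, unsold units have no salvage value; numbers illustrative.
price, cost = 10.0, 6.0
orders = np.array([20, 40, 60])
demands = np.array([10, 30, 50, 70])
sold = np.minimum(orders[:, None], demands[None, :])
payoff = price * sold - cost * orders[:, None]

alpha = 0.3  # coefficient of optimism (a fairly pessimistic decision maker)

# Hurwicz score: weighted best and worst payoff of each order quantity
hurwicz = alpha * payoff.max(axis=1) + (1 - alpha) * payoff.min(axis=1)
# Bayes (Laplace) score: treat all scenarios as equally likely
bayes = payoff.mean(axis=1)
# Simple hybrid: average the two criteria (a stand-in for the paper's rule)
hybrid = 0.5 * (hurwicz + bayes)
best_order = orders[np.argmax(hybrid)]
```

With this pessimistic alpha the rule selects the smallest order (20 units), whose worst-case loss is mildest; raising alpha shifts the choice toward larger, riskier orders.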
Stationary and non-stationary extreme value modeling of extreme temperature in Malaysia
NASA Astrophysics Data System (ADS)
Hasan, Husna; Salleh, Nur Hanim Mohd; Kassim, Suraiya
2014-09-01
Extreme annual temperature of eighteen stations in Malaysia is fitted to the Generalized Extreme Value distribution. Stationary and non-stationary models with trend are considered for each station and the Likelihood Ratio test is used to determine the best-fitting model. Results show that three out of eighteen stations i.e. Bayan Lepas, Labuan and Subang favor a model which is linear in the location parameter. A hierarchical cluster analysis is employed to investigate the existence of similar behavior among the stations. Three distinct clusters are found in which one of them consists of the stations that favor the non-stationary model. T-year estimated return levels of the extreme temperature are provided based on the chosen models.
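The model comparison described above can be sketched as a minimal illustration with synthetic data, using scipy rather than the authors' software: the non-stationary model makes the GEV location parameter linear in time, and a likelihood ratio test compares it against the stationary fit. All numbers below are invented.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import chi2, genextreme

rng = np.random.default_rng(42)
years = np.arange(50)
# Synthetic annual maximum temperatures with an imposed linear trend in location
data = genextreme.rvs(c=0.1, loc=35 + 0.06 * years, scale=1.0, random_state=rng)

def gev_nll(params, t, x, trend):
    """Negative log-likelihood of a GEV with optionally time-varying location."""
    if trend:
        mu0, mu1, sigma, xi = params
        mu = mu0 + mu1 * t
    else:
        mu0, sigma, xi = params
        mu = mu0
    if sigma <= 0:
        return np.inf
    # scipy's shape c equals minus the usual GEV shape parameter xi
    return -np.sum(genextreme.logpdf(x, c=-xi, loc=mu, scale=sigma))

fit0 = minimize(gev_nll, x0=[36.0, 1.0, -0.1], args=(years, data, False),
                method="Nelder-Mead")
fit1 = minimize(gev_nll, x0=[35.0, 0.0, 1.0, -0.1], args=(years, data, True),
                method="Nelder-Mead")

# Likelihood ratio test: the trend model adds one parameter (df = 1)
lr = 2.0 * (fit0.fun - fit1.fun)
p_value = chi2.sf(lr, df=1)
```

A small p-value favours the model that is linear in the location parameter, which is the outcome the abstract reports for the Bayan Lepas, Labuan and Subang stations.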
On the identification of Dragon Kings among extreme-valued outliers
NASA Astrophysics Data System (ADS)
Riva, M.; Neuman, S. P.; Guadagnini, A.
2013-07-01
Extreme values of earth, environmental, ecological, physical, biological, financial and other variables often form outliers to heavy tails of empirical frequency distributions. Quite commonly such tails are approximated by stretched exponential, log-normal or power functions. Recently there has been an interest in distinguishing between extreme-valued outliers that belong to the parent population of most data in a sample and those that do not. The first type, called Gray Swans by Nassim Nicholas Taleb (often confused in the literature with Taleb's totally unknowable Black Swans), is drawn from a known distribution of the tails which can thus be extrapolated beyond the range of sampled values. However, the magnitudes and/or space-time locations of unsampled Gray Swans cannot be foretold. The second type of extreme-valued outliers, termed Dragon Kings by Didier Sornette, may in his view be sometimes predicted based on how other data in the sample behave. This intriguing prospect has recently motivated some authors to propose statistical tests capable of identifying Dragon Kings in a given random sample. Here we apply three such tests to log air permeability data measured on the faces of a Berea sandstone block and to synthetic data generated in a manner statistically consistent with these measurements. We interpret the measurements to be, and generate synthetic data that are, samples from α-stable sub-Gaussian random fields subordinated to truncated fractional Gaussian noise (tfGn). All these data have frequency distributions characterized by power-law tails with extreme-valued outliers about the tail edges.
Modelling hydrological extremes under non-stationary conditions using climate covariates
NASA Astrophysics Data System (ADS)
Vasiliades, Lampros; Galiatsatou, Panagiota; Loukas, Athanasios
2013-04-01
Extreme value theory is a probabilistic theory that can interpret the future probabilities of occurrence of extreme events (e.g. extreme precipitation and streamflow) using past observed records. Traditionally, extreme value theory requires the assumption of temporal stationarity. This assumption implies that the historical patterns of recurrence of extreme events are static over time. However, the hydroclimatic system is nonstationary on time scales that are relevant to extreme value analysis, due to human-mediated and natural environmental change. In this study the generalized extreme value (GEV) distribution is used to assess nonstationarity in annual maximum daily rainfall and streamflow time series at selected meteorological and hydrometric stations in Greece and Cyprus. The GEV distribution parameters (location, scale, and shape) are specified as functions of time-varying covariates and estimated using the conditional density network (CDN) as proposed by Cannon (2010). The CDN is a probabilistic extension of the multilayer perceptron neural network. Model parameters are estimated via the generalized maximum likelihood (GML) approach using the quasi-Newton BFGS optimization algorithm, and the appropriate GEV-CDN model architecture for the selected meteorological and hydrometric stations is selected by fitting increasingly complicated models and choosing the one that minimizes the Akaike information criterion with small-sample-size correction. For all case studies in Greece and Cyprus, different formulations are tested, combining stationary and nonstationary parameters of the GEV distribution, linear and non-linear CDN architectures, and different sets of input climatic covariates.
Climatic indices such as the Southern Oscillation Index (SOI), which describes atmospheric circulation in the eastern tropical pacific related to El Niño Southern Oscillation (ENSO), the Pacific Decadal Oscillation (PDO) index that varies on an interdecadal rather than interannual time scale and the atmospheric circulation patterns as expressed by the North Atlantic Oscillation (NAO) index are used to express the GEV parameters as functions of the covariates. Results show that the nonstationary GEV model can be an efficient tool to take into account the dependencies between extreme value random variables and the temporal evolution of the climate.
Extreme between-study homogeneity in meta-analyses could offer useful insights.
Ioannidis, John P A; Trikalinos, Thomas A; Zintzaras, Elias
2006-10-01
Meta-analyses are routinely evaluated for the presence of large between-study heterogeneity. We examined whether it is also important to probe whether there is extreme between-study homogeneity. We used heterogeneity tests with left-sided statistical significance for inference and developed a Monte Carlo simulation test for testing extreme homogeneity in risk ratios across studies, using the empirical distribution of the summary risk ratio and heterogeneity statistic. A left-sided P=0.01 threshold was set for claiming extreme homogeneity to minimize type I error. Among 11,803 meta-analyses with binary contrasts from the Cochrane Library, 143 (1.21%) had a left-sided P-value <0.01 for the asymptotic Q statistic and 1,004 (8.50%) had a left-sided P-value <0.10. The frequency of extreme between-study homogeneity did not depend on the number of studies in the meta-analyses. We identified examples where extreme between-study homogeneity (left-sided P-value <0.01) could result from various possibilities beyond chance. These included inappropriate statistical inference (asymptotic vs. Monte Carlo), use of a specific effect metric, correlated data or stratification using strong predictors of outcome, and biases and potential fraud. Extreme between-study homogeneity may provide useful insights about a meta-analysis and its constituent studies.
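The left-sided use of a heterogeneity test can be illustrated with the asymptotic Cochran's Q statistic on hypothetical log risk ratios; the numbers are invented, and the paper's Monte Carlo simulation test is more involved than this sketch.

```python
import numpy as np
from scipy.stats import chi2

# Hypothetical per-study log risk ratios and their variances: the studies
# agree implausibly closely given their stated variances.
log_rr = np.array([0.10, 0.11, 0.09, 0.10, 0.105, 0.095])
var = np.array([0.04, 0.05, 0.03, 0.04, 0.05, 0.03])

# Inverse-variance weights and the fixed-effect pooled estimate
w = 1.0 / var
pooled = np.sum(w * log_rr) / np.sum(w)

# Cochran's Q heterogeneity statistic, chi-square with k - 1 df under H0
Q = np.sum(w * (log_rr - pooled) ** 2)
df = len(log_rr) - 1

# Conventional right-sided p-value flags excess heterogeneity ...
p_heterogeneity = chi2.sf(Q, df)
# ... while the left-sided p-value flags extreme homogeneity
p_homogeneity = chi2.cdf(Q, df)
```

Here the observed scatter is far smaller than the stated variances would predict, so the left-sided p-value falls below the paper's 0.01 threshold and the meta-analysis would be flagged for extreme between-study homogeneity.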
NASA Astrophysics Data System (ADS)
Rieder, Harald E.; Jancso, Leonhardt M.; Rocco, Stefania Di; Staehelin, Johannes; Maeder, Joerg A.; Peter, Thomas; Ribatet, Mathieu; Davison, Anthony C.; de Backer, Hugo; Koehler, Ulf; Krzyścin, Janusz; Vaníček, Karel
2011-11-01
We apply methods from extreme value theory to identify extreme events in high (termed EHOs) and low (termed ELOs) total ozone and to describe the distribution tails (i.e. very high and very low values) of five long-term European ground-based total ozone time series. The influence of these extreme events on observed mean values, long-term trends and changes is analysed. The results show a decrease in EHOs and an increase in ELOs during the last decades, and establish that the observed downward trend in column ozone during the 1970-1990s is strongly dominated by changes in the frequency of extreme events. Furthermore, it is shown that clear ‘fingerprints’ of atmospheric dynamics (NAO, ENSO) and chemistry [ozone depleting substances (ODSs), polar vortex ozone loss] can be found in the frequency distribution of ozone extremes, even if no attribution is possible from standard metrics (e.g. annual mean values). The analysis complements earlier analysis for the world's longest total ozone record at Arosa, Switzerland, confirming and revealing the strong influence of atmospheric dynamics on observed ozone changes. The results provide clear evidence that in addition to ODS, volcanic eruptions and strong/moderate ENSO and NAO events had significant influence on column ozone in the European sector.
NASA Astrophysics Data System (ADS)
Güleçyüz, M. Ç.; Şenyiğit, M.; Ersoy, A.
2018-01-01
The Milne problem is studied in one-speed neutron transport theory using the linearly anisotropic scattering kernel which combines forward and backward scatterings (extremely anisotropic scattering) for a non-absorbing medium with specular and diffuse reflection boundary conditions. In order to calculate the extrapolated endpoint for the Milne problem, the Legendre polynomial approximation (PN method) is applied and numerical results are tabulated for selected cases as a function of different degrees of anisotropic scattering. Finally, the results are discussed and compared with existing results in the literature.
NASA Astrophysics Data System (ADS)
Schlögl, Matthias; Laaha, Gregor
2017-04-01
The assessment of road infrastructure exposure to extreme weather events is of major importance for scientists and practitioners alike. In this study, we compare the different extreme value approaches and fitting methods with respect to their value for assessing the exposure of transport networks to extreme precipitation and temperature impacts. Based on an Austrian data set from 25 meteorological stations representing diverse meteorological conditions, we assess the added value of partial duration series (PDS) over the standardly used annual maxima series (AMS) in order to give recommendations for performing extreme value statistics of meteorological hazards. Results show the merits of the robust L-moment estimation, which yielded better results than maximum likelihood estimation in 62 % of all cases. At the same time, results question the general assumption of the threshold excess approach (employing PDS) being superior to the block maxima approach (employing AMS) due to information gain. For low return periods (non-extreme events) the PDS approach tends to overestimate return levels as compared to the AMS approach, whereas an opposite behavior was found for high return levels (extreme events). In extreme cases, an inappropriate threshold was shown to lead to considerable biases that may outperform the possible gain of information from including additional extreme events by far. This effect was visible from neither the square-root criterion nor standardly used graphical diagnosis (mean residual life plot) but rather from a direct comparison of AMS and PDS in combined quantile plots. We therefore recommend performing AMS and PDS approaches simultaneously in order to select the best-suited approach. This will make the analyses more robust, not only in cases where threshold selection and dependency introduces biases to the PDS approach but also in cases where the AMS contains non-extreme events that may introduce similar biases. 
For assessing the performance of extreme events we recommend the use of conditional performance measures that focus on rare events only in addition to standardly used unconditional indicators. The findings of the study directly address road and traffic management but can be transferred to a range of other environmental variables including meteorological and hydrological quantities.
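A side-by-side AMS/PDS comparison of the kind recommended above might look like the following minimal sketch on synthetic daily data. It uses scipy's maximum-likelihood fits; the L-moment estimation favoured by the study is not shown, and all data are invented.

```python
import numpy as np
from scipy.stats import genextreme, genpareto

rng = np.random.default_rng(1)
# 50 synthetic "years" of 365 daily precipitation amounts (mm), light-tailed
daily = rng.exponential(scale=10.0, size=(50, 365))

# Block maxima / AMS: one value per year, fitted with a GEV
ams = daily.max(axis=1)
c_gev, loc_gev, scale_gev = genextreme.fit(ams)
rl100_ams = genextreme.ppf(1 - 1 / 100, c_gev, loc_gev, scale_gev)

# Threshold excess / PDS: excesses over a high threshold, fitted with a GPD
threshold = np.quantile(daily, 0.99)
excess = daily[daily > threshold] - threshold
c_gpd, _, scale_gpd = genpareto.fit(excess, floc=0)
rate = excess.size / daily.size  # exceedance probability per observation

# 100-year return level: the value exceeded once per 100 * 365 observations
T = 100
q = 1 - 1 / (T * 365 * rate)
rl100_pds = threshold + genpareto.ppf(q, c_gpd, loc=0, scale=scale_gpd)
```

For this light-tailed parent the two estimates should roughly agree (the true 100-year daily value is near 105 mm); with real data, plotting both fits in a combined quantile plot, as the authors recommend, helps expose threshold-selection biases that summary diagnostics miss.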
NASA Astrophysics Data System (ADS)
Wang, Cailin; Ren, Xuehui; Li, Ying
2017-04-01
We defined the threshold of extreme precipitation using detrended fluctuation analysis based on daily precipitation during 1955-2013 in Kuandian County, Liaoning Province. Three-dimensional copulas were introduced to analyze the characteristics of four extreme precipitation factors: the annual extreme precipitation days, the extreme precipitation amount, the annual average extreme precipitation intensity, and the extreme precipitation contribution rate. The results show that (1) the threshold is 95.0 mm; extreme precipitation events generally occur 1-2 times a year, the average extreme precipitation intensity is 100-150 mm, and the extreme precipitation amount is 100-270 mm, accounting for 10 to 37% of annual precipitation. (2) The generalized extreme value distribution, extreme value distribution, and generalized Pareto distribution are suitable for fitting the distribution function of each extreme precipitation factor. The Ali-Mikhail-Haq (AMH) copula function reflects the joint characteristics of the extreme precipitation factors. (3) The return periods of the three types show significant synchronicity, and the joint return period and co-occurrence return period show a long delay when the return period of the single factor is long. This reflects the inseparability of the extreme precipitation factors. The co-occurrence return period is longer than both the single-factor and joint return periods. (4) Single-factor fitting reflects only the information of a single extreme precipitation factor and ignores the relationships between factors. Three-dimensional copulas represent the joint information of the extreme precipitation factors and are closer to actual conditions. The copula function is thus potentially widely applicable to the multiple factors of extreme precipitation.
Risk Factors for Lower-Extremity Injuries Among Contemporary Dance Students.
van Seters, Christine; van Rijn, Rogier M; van Middelkoop, Marienke; Stubbe, Janine H
2017-10-10
To determine whether student characteristics, lower-extremity kinematics, and strength are risk factors for sustaining lower-extremity injuries in preprofessional contemporary dancers. Prospective cohort study. Codarts University of the Arts. Forty-five first-year students of Bachelor Dance and Bachelor Dance Teacher. At the beginning of the academic year, injury history (lower-extremity only) and student characteristics (age, sex, educational program) were assessed using a questionnaire. In addition, lower-extremity kinematics [single-leg squat (SLS)], strength (countermovement jump) and height and weight (body mass index) were measured during a physical performance test. Substantial lower-extremity injuries during the academic year were defined as any problems leading to moderate or severe reductions in training volume or performance, or complete inability to participate in dance at least once during follow-up, as measured with the Oslo Sports Trauma Research Center (OSTRC) Questionnaire on Health Problems. Injuries were recorded on a monthly basis using a questionnaire. Analyses at the leg level were performed using generalized estimating equations to test the associations between substantial lower-extremity injuries and potential risk factors. The 1-year incidence of lower-extremity injuries was 82.2%. Of these, 51.4% were substantial lower-extremity injuries. Multivariate analyses identified ankle dorsiflexion during the SLS (OR 1.25; 95% confidence interval, 1.03-1.52) as a risk factor for a substantial lower-extremity injury. The findings indicate that contemporary dance students are at high risk of lower-extremity injuries. Therefore, the identified risk factor (ankle dorsiflexion) should be considered for prevention purposes.
NASA Technical Reports Server (NTRS)
Wobus, Cameron; Reynolds, Lara; Jones, Russell; Horton, Radley; Smith, Joel; Fries, J. Stephen; Tryby, Michael; Spero, Tanya; Nolte, Chris
2015-01-01
Many of the storms that generate damaging floods are caused by locally intense, sub-daily precipitation, yet the spatial and temporal resolution of the most widely available climate model outputs are both too coarse to simulate these events. Thus there is often a disconnect between the nature of the events that cause damaging floods and the models used to project how climate change might influence their magnitude. This could be a particular problem when developing scenarios to inform future storm water management options under future climate scenarios. In this study we sought to close this gap, using sub-daily outputs from the Weather Research and Forecasting model (WRF) from each of the nine climate regions in the United States. Specifically, we asked 1) whether WRF outputs projected consistent patterns of change for sub-daily and daily precipitation extremes; and 2) whether this dynamically downscaled model projected different magnitudes of change for 3-hourly vs 24-hourly extreme events. We extracted annual maximum values for 3-hour through 24-hour precipitation totals from an 11-year time series of hindcast (1995-2005) and mid-century (2045-2055) climate, and calculated the direction and magnitude of change for 3-hour and 24-hour extreme events over this timeframe. The model results project that the magnitude of both 3-hour and 24-hour events will increase over most regions of the United States, but there was no clear or consistent difference in the relative magnitudes of change for sub-daily vs daily events.
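Extracting annual maxima for different accumulation windows, as done above for 3-hour through 24-hour totals, can be sketched as follows on synthetic data; the WRF outputs themselves are not reproduced here, and the rainfall model is purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(11)
# Ten synthetic "years" of 3-hourly precipitation (mm per 3 h), mostly dry
steps_per_year = 365 * 8  # eight 3-hour steps per day
precip = rng.gamma(shape=0.1, scale=5.0, size=(10, steps_per_year))

def annual_max_accumulation(p, steps):
    """Annual maxima of running totals over `steps` consecutive 3-h intervals."""
    kernel = np.ones(steps)
    sums = np.apply_along_axis(lambda row: np.convolve(row, kernel, "valid"),
                               1, p)
    return sums.max(axis=1)

am_3h = annual_max_accumulation(precip, 1)    # 3-hour annual maxima
am_24h = annual_max_accumulation(precip, 8)   # 24-hour annual maxima
```

Each 24-hour maximum necessarily equals or exceeds the corresponding 3-hour maximum, and comparing the fitted changes of the two series over time is exactly the sub-daily versus daily contrast the study examines.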
Extreme Value Theory and the New Sunspot Number Series
NASA Astrophysics Data System (ADS)
Acero, F. J.; Carrasco, V. M. S.; Gallego, M. C.; García, J. A.; Vaquero, J. M.
2017-04-01
Extreme value theory was employed to study solar activity using the new sunspot number index. The block maxima approach was used at yearly (1700-2015), monthly (1749-2016), and daily (1818-2016) scales, selecting the maximum sunspot number value for each solar cycle, and the peaks-over-threshold (POT) technique was used after a declustering process only for the daily data. Both techniques led to negative values for the shape parameters. This implies that the extreme sunspot number value distribution has an upper bound. The return level (RL) values obtained from the POT approach were greater than when using the block maxima technique. Regarding the POT approach, the 110 year (550 and 1100 year) RLs were lower (higher) than the daily maximum observed sunspot number value of 528. Furthermore, according to the block maxima approach, the 10-cycle RL lay within the block maxima daily sunspot number range, as expected, but it was striking that the 50- and 100-cycle RLs were also within that range. Thus, it would seem that the RL is reaching a plateau, and, although one must be cautious, it would be difficult to attain sunspot number values greater than 550. The extreme value trends from the four series (yearly, monthly, and daily maxima per solar cycle, and POT after declustering the daily data) were analyzed with the Mann-Kendall test and Sen’s method. Only the negative trend of the daily data with the POT technique was statistically significant.
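The link between a negative shape parameter and a bounded sunspot number distribution can be made concrete with scipy's GEV. The parameter values below are illustrative only, not the fitted values from the paper.

```python
from scipy.stats import genextreme

# Illustrative GEV parameters for cycle-maximum sunspot number (NOT the
# paper's fitted values); a negative shape xi gives a bounded upper tail.
mu, sigma, xi = 190.0, 60.0, -0.25

# Finite upper endpoint of the GEV when xi < 0:  mu - sigma / xi
upper_bound = mu - sigma / xi  # = 430.0 for these parameters

# scipy parameterizes the shape as c = -xi, so the same bound appears
# as the upper limit of the distribution's support
dist = genextreme(c=-xi, loc=mu, scale=sigma)
support_upper = dist.support()[1]

# Return levels approach, but can never exceed, the upper bound
rl10 = dist.ppf(1 - 1 / 10)    # 10-cycle return level
rl100 = dist.ppf(1 - 1 / 100)  # 100-cycle return level
```

As the return period grows, the return levels flatten toward the finite endpoint, which is the plateau behaviour the abstract describes for the sunspot number series.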
GPS FOM Chimney Analysis using Generalized Extreme Value Distribution
NASA Technical Reports Server (NTRS)
Ott, Rick; Frisbee, Joe; Saha, Kanan
2004-01-01
An objective of a statistical analysis is often to estimate a limit value, such as a 3-sigma or 95%-confidence upper limit, from a data sample. The Generalized Extreme Value distribution can be profitably employed in many situations for such an estimate. It is well known that, according to the Central Limit Theorem, the mean value of a large data set is normally distributed irrespective of the distribution of the data from which the mean is derived. In a somewhat similar fashion, the extreme value of a data set often has a distribution that can be formulated with a Generalized Extreme Value distribution. In Space Shuttle entry with 3-string GPS navigation, the Figure Of Merit (FOM) value gives a measure of GPS navigated-state accuracy. A GPS navigated state with a FOM of 6 or higher is deemed unacceptable and is said to form a FOM 6 or higher chimney. A FOM chimney is a period of time during which the FOM value stays higher than 5. A longer period of FOM values of 6 or higher causes the navigated state to accumulate more error for lack of state updates. For an acceptable landing it is imperative that the state error remain low; hence, at low altitude during entry, GPS data of FOM greater than 5 must not last more than 138 seconds. To test GPS performance, many entry test cases were simulated at the Avionics Development Laboratory. Only high-value FOM chimneys are consequential. The extreme value statistical technique is applied to analyze high-value FOM chimneys. The maximum-likelihood method is used to determine the parameters that characterize the GEV distribution, and the limit value statistics are then estimated.
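A maximum-likelihood GEV fit of the kind described, applied to hypothetical chimney durations, can be sketched with scipy; all numbers below are invented and stand in for the simulated entry test cases.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(7)
# Hypothetical chimney durations (seconds), one maximum per simulated entry run
durations = genextreme.rvs(c=0.2, loc=60, scale=15, size=200, random_state=rng)

# Maximum-likelihood fit of the GEV to the extreme durations
c, loc, scale = genextreme.fit(durations)

# Estimated 95th-percentile limit value: a duration rarely exceeded
limit_95 = genextreme.ppf(0.95, c, loc, scale)
empirical_95 = np.quantile(durations, 0.95)
```

The fitted quantile should track the empirical one while also supporting extrapolation to more extreme limits (for example a 99.9% value), which is where the GEV model earns its keep over the raw sample.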
The Nature and Characteristics of Youthful Extremism
ERIC Educational Resources Information Center
Zubok, Iu. A.; Chuprov, V. I.
2010-01-01
Extremism is an acute problem of the present day. Moods of extremism are manifested in all spheres of the life and activities of young people--in education, work, business, political life, and leisure activity. They can be found in both individual and group social self-determination and are influenced by the immediate social environment as well as…
Casapia, Martin; Joseph, Serene A; Gyorkos, Theresa W
2007-01-01
Background Communities of extreme poverty suffer disproportionately from a wide range of adverse outcomes, but are often neglected or underserved by organized services and research attention. In order to target the first Millennium Development Goal of eradicating extreme poverty, thereby reducing health inequalities, participatory research in these communities is needed. Therefore, the purpose of this study was to determine the priority problems and respective potential cost-effective interventions in Belen, a community of extreme poverty in the Peruvian Amazon, using a multidisciplinary and participatory focus. Methods Two multidisciplinary and participatory workshops were conducted with important stakeholders from government, non-government and community organizations, national institutes and academic institutions. In Workshop 1, participants prioritized the main health and health-related problems in the community of Belen. Problem trees were developed to show perceived causes and effects for the top six problems. In Workshop 2, following presentations describing data from recently completed field research in school and household populations of Belen, participants listed potential interventions for the priority problems, including associated barriers, enabling factors, costs and benefits. Results The top ten priority problems in Belen were identified as: 1) infant malnutrition; 2) adolescent pregnancy; 3) diarrhoea; 4) anaemia; 5) parasites; 6) lack of basic sanitation; 7) low level of education; 8) sexually transmitted diseases; 9) domestic violence; and 10) delayed school entry. Causes and effects for the top six problems, proposed interventions, and factors relating to the implementation of interventions were multidisciplinary in nature and included health, nutrition, education, social and environmental issues. Conclusion The two workshops provided valuable insight into the main health and health-related problems facing the community of Belen. 
The participatory focus of the workshops ensured the active involvement of important stakeholders from Belen. Based on the results of the workshops, effective and essential interventions are now being planned which will contribute to reducing health inequalities in the community. PMID:17623093
Research in Stochastic Processes.
1982-12-01
constant high level boundary. References: 1. Jurg Husler, "Extreme values of non-stationary sequences and the extremal index," Center for Stochastic... A. Weron, Oct. 82. 20. "Extreme values of non-stationary sequences and the extremal index," Jurg Husler, Oct. 82. 21. "A finitely additive white noise..." ...string model, Y. Miyahara, Carleton University and Nagoya University. Sept. 22. "On extreme values of non-stationary sequences," J. Husler, University of
More tornadoes in the most extreme U.S. tornado outbreaks
NASA Astrophysics Data System (ADS)
Tippett, Michael K.; Lepore, Chiara; Cohen, Joel E.
2016-12-01
Tornadoes and severe thunderstorms kill people and damage property every year. Estimated U.S. insured losses due to severe thunderstorms in the first half of 2016 were $8.5 billion (US). The largest U.S. effects of tornadoes result from tornado outbreaks, which are sequences of tornadoes that occur in close succession. Here, using extreme value analysis, we find that the frequency of U.S. outbreaks with many tornadoes is increasing and that it is increasing faster for more extreme outbreaks. We model this behavior by extreme value distributions with parameters that are linear functions of time or of some indicators of multidecadal climatic variability. Extreme meteorological environments associated with severe thunderstorms show consistent upward trends, but the trends do not resemble those currently expected to result from global warming.
Extreme value modelling of Ghana stock exchange index.
Nortey, Ezekiel N N; Asare, Kwabena; Mettle, Felix Okoe
2015-01-01
Modelling of extreme events has always been of interest in fields such as hydrology and meteorology. However, after the recent global financial crises, appropriate models for modelling the rare events leading to such crises have become quite essential in the finance and risk management fields. This paper models the extreme values of the Ghana stock exchange all-shares index (2000-2010) by applying extreme value theory (EVT) to fit a model to the tails of the daily stock returns data. A conditional approach of the EVT was preferred, and hence an ARMA-GARCH model was fitted to the data to correct for the effects of autocorrelation and conditional heteroscedasticity present in the returns series before the EVT method was applied. The Peaks Over Threshold approach of the EVT, which fits a Generalized Pareto Distribution (GPD) model to excesses above a certain selected threshold, was employed. Maximum likelihood estimates of the model parameters were obtained, and the model's goodness of fit was assessed graphically using Q-Q, P-P and density plots. The findings indicate that the GPD provides an adequate fit to the data of excesses. The sizes of the extreme daily Ghanaian stock market movements were then computed using the value-at-risk and expected shortfall risk measures at some high quantiles, based on the fitted GPD model.
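The peaks-over-threshold step, with value at risk and expected shortfall from the fitted GPD, can be sketched as follows. Student-t noise stands in for the filtered ARMA-GARCH residuals, and the closing formulas are the standard GPD-based tail risk-measure expressions, not figures from the paper.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(3)
# Heavy-tailed stand-in for the filtered return series: Student-t "losses"
losses = rng.standard_t(df=4, size=5000)

# Peaks over threshold: fit a GPD to excesses above a high threshold
u = np.quantile(losses, 0.95)
excess = losses[losses > u] - u
xi, _, beta = genpareto.fit(excess, floc=0)

# Standard GPD-based tail risk measures at the 99% level
n, n_u = losses.size, excess.size
q = 0.99
var99 = u + (beta / xi) * (((n / n_u) * (1 - q)) ** (-xi) - 1.0)
es99 = var99 / (1.0 - xi) + (beta - xi * u) / (1.0 - xi)
```

Expected shortfall always sits above value at risk for a heavy-tailed (positive shape) fit, quantifying how bad losses are on average once the VaR level is breached.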
Brown, L; Burns, Y R; Watter, P; Gray, P H; Gibbons, K S
2018-03-01
Extreme prematurity or extremely low birth weight (ELBW) can adversely affect behaviour. Nondisabled ELBW children are at risk of behavioural problems, which may become a particular concern after commencement of formal education. This study explored the frequency of behavioural and emotional problems amongst nondisabled ELBW children at 4 to 5 years of age and whether intervention had a positive influence on behaviour. The relationship between behaviour, gender, and other areas of performance at 5 years was explored. Fifty 4-year-old children (born <28 weeks gestation or birth weight <1,000 g) with minimal/mild motor impairment were randomly allocated to intervention (n = 24) or standard care (n = 26). Intervention was 6 group-based physiotherapy weekly sessions and home programme. Standard care was best practice advice. The Child Behavior Checklist (CBCL) for preschool children was completed at baseline and at 1-year post-baseline. Other measures at follow-up included Movement Assessment Battery for Children Second Edition, Beery Visual-Motor Integration Test 5th Edition, and Peabody Picture Vocabulary Test 4th Edition. The whole cohort improved on CBCL total problems score between baseline (mean 50.0, SD 11.1) and 1-year follow-up (mean 45.2, SD 10.3), p = .004. There were no significant differences between groups over time on CBCL internalizing, externalizing, or total problems scores. The intervention group showed a mean difference in total problems score of -3.8 (CI [1.5, 9.1]) between times, with standard care group values being -4.4 (CI [1.6, 7.1]). Males had higher total problems scores than females (p = .026), although still performed within the "normal" range. CBCL scores did not correlate with other scores. The behaviour of nondisabled ELBW children was within the "normal" range at 4 to 5 years, and both intervention and standard care may have contributed to improved behavioural outcomes. 
Behaviour was not related to performance in other developmental domains. © 2017 John Wiley & Sons Ltd.
Persisting behavior problems in extremely low birth weight adolescents.
Taylor, H Gerry; Margevicius, Seunghee; Schluchter, Mark; Andreias, Laura; Hack, Maureen
2015-04-01
To describe behavior problems in extremely low birth weight (ELBW, <1000 g) adolescents born 1992 through 1995 based on parent ratings and adolescent self-ratings at age 14 years and to examine changes in parent ratings from ages 8-14. Parent ratings of behavior problems and adolescent self-ratings were obtained for 169 ELBW adolescents (mean birth weight 815 g, gestational age 26 wk) and 115 normal birth weight (NBW) controls at 14 years. Parent ratings of behavior at age 8 years were also available. Behavior outcomes were assessed using symptom severity scores and rates of scores above DSM-IV symptom cutoffs for clinical disorder. The ELBW group had higher symptom severity scores on parent ratings at age 14 years than NBW controls for inattentive attention-deficit hyperactivity disorder (ADHD), anxiety, and social problems (all p's < .01). Rates of parent ratings meeting DSM-IV symptom criteria for inattentive ADHD were also higher for the ELBW group (12% vs. 1%, p < .01). In contrast, the ELBW group had lower symptom severity scores on self-ratings than controls for several scales. Group differences in parent ratings decreased over time for ADHD, especially among females, but were stable for anxiety and social problems. Extremely low birth weight adolescents continue to have behavior problems similar to those evident at a younger age, but these problems are not evident in behavioral self-ratings. The findings suggest that parent ratings provide contrasting perspectives on behavior problems in ELBW youth and support the need to identify and treat these problems early in childhood.
NASA Astrophysics Data System (ADS)
Woo, Hye-Jin; Park, Kyung-Ae
2017-09-01
Significant wave height (SWH) data from nine satellite altimeters were validated with in-situ SWH measurements from buoy stations in the East/Japan Sea (EJS) and the Northwest Pacific Ocean. The spatial and temporal variability of extreme SWHs was investigated by defining the 90th, 95th, and 99th percentiles based on percentile analysis. The annual mean of extreme SWHs reached 3.45 m in the EJS, significantly higher than the normal mean of about 1.44 m. The spatial distributions of SWHs showed significantly higher values in the eastern region of the EJS than in the western part. Characteristic seasonality was found in the SWH time series, with high SWHs (>2.5 m) in winter but low values (<1 m) in summer. The trends of the normal and extreme (99th percentile) SWHs in the EJS had positive values of 0.0056 m year-1 and 0.0125 m year-1, respectively. The long-term trends indicate that extreme SWHs have intensified over the past decades. Predominant spatial distinctions between the coastal regions in the marginal seas of the Northwest Pacific Ocean and open-ocean regions are presented. Finally, we present, for the first time, the impact of the long-term trend of extreme SWHs on the marine ecosystem through vertical mixing enhancement in the upper ocean of the EJS.
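The percentile-based definition of extremes and the linear trends quoted above can be sketched in a few lines. The helper functions are generic; the yearly series below is invented for illustration, not the altimeter record:

```python
def percentile(values, p):
    """Linear-interpolation percentile, 0 <= p <= 100."""
    s = sorted(values)
    k = (len(s) - 1) * p / 100.0
    f = int(k)
    c = min(f + 1, len(s) - 1)
    return s[f] + (s[c] - s[f]) * (k - f)

def linear_trend(y):
    """Least-squares slope of y against its index (units per step)."""
    n = len(y)
    xm = (n - 1) / 2.0
    ym = sum(y) / n
    num = sum((i - xm) * (v - ym) for i, v in enumerate(y))
    den = sum((i - xm) ** 2 for i in range(n))
    return num / den

# Hypothetical yearly 99th-percentile SWH values (m)
extreme_swh = [3.10, 3.22, 3.15, 3.30, 3.41, 3.35, 3.52]
trend = linear_trend(extreme_swh)  # positive: extremes intensifying
```

With real altimeter data, each entry of `extreme_swh` would be obtained as `percentile(year_of_swh, 99)` over that year's observations.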
The end of trend-estimation for extreme floods under climate change?
NASA Astrophysics Data System (ADS)
Schulz, Karsten; Bernhardt, Matthias
2016-04-01
An increased risk of flood events is one of the major threats under future climate change conditions. Therefore, many recent studies have investigated trends in the occurrence of flood extremes using historic long-term river discharge data as well as simulations from combined global/regional climate and hydrological models. Severe floods are relatively rare events, and the robust estimation of their probability of occurrence requires long time series of data. Following a method outlined by the IPCC research community, trends in extreme floods are calculated based on the difference of discharge values exceeding e.g. a 100-year level (Q100) between two 30-year windows, representing prevailing conditions in a reference and a future time period, respectively. Following this approach, we analysed multiple synthetically derived, trend-free 2,000-year series of yearly maximum runoff, generated using three different extreme value distributions (EVDs). The parameters were estimated from long-term runoff data of four large European watersheds (Danube, Elbe, Rhine, Thames). Both the Q100 values estimated from 30-year moving windows and the subsequently derived trends showed enormous variation with time: for example, fitting the Gumbel extreme value distribution to the Danube data, trends of Q100 in the synthetic time series range from -4,480 to 4,028 m³/s per 100 years (Q100 = 10,071 m³/s, for reference). Similar results were found when applying other extreme value distributions (Weibull and log-normal) to all of the watersheds considered. This variability, or "background noise", in estimated flood-extreme trends makes it almost impossible to distinguish any real trend in observed as well as modelled data when such an approach is applied. These uncertainties, even though known in principle, are hardly addressed and discussed by the climate change impact community. Any decision making and flood risk management, including the dimensioning of flood protection measures, that is based on such studies might therefore be fundamentally flawed.
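The window-based Q100 trend estimate described above can be sketched with a method-of-moments Gumbel fit. The study estimated parameters from real European discharge records; the two 30-year windows of annual maxima below are synthetic placeholders:

```python
import math

EULER_GAMMA = 0.5772156649  # Euler-Mascheroni constant

def gumbel_fit(sample):
    """Method-of-moments Gumbel parameters (location mu, scale beta)."""
    n = len(sample)
    mean = sum(sample) / n
    var = sum((x - mean) ** 2 for x in sample) / (n - 1)
    beta = math.sqrt(6.0 * var) / math.pi
    return mean - EULER_GAMMA * beta, beta

def q_t(sample, T):
    """T-year return level from a Gumbel fit to annual maxima."""
    mu, beta = gumbel_fit(sample)
    return mu - beta * math.log(-math.log(1.0 - 1.0 / T))

# Two hypothetical 30-year windows of annual maximum discharge (m³/s)
reference = [5000.0 + 120.0 * ((i * 7) % 13) for i in range(30)]
future = [5100.0 + 110.0 * ((i * 5) % 11) for i in range(30)]
trend_q100 = q_t(future, 100) - q_t(reference, 100)
```

The paper's point is that `trend_q100` computed this way scatters enormously between windows even when the underlying series is trend-free.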
NASA Astrophysics Data System (ADS)
Marani, M.; Zorzetto, E.; Hosseini, S. R.; Miniussi, A.; Scaioni, M.
2017-12-01
The Generalized Extreme Value (GEV) distribution is widely adopted irrespective of the properties of the stochastic process generating the extreme events. However, GEV presents several limitations, both theoretical (asymptotic validity for a large number of events/year, or the hypothesis of Poisson occurrences of Generalized Pareto events) and practical (fitting uses just yearly maxima or a few values above a high threshold). Here we describe the Metastatistical Extreme Value Distribution (MEVD, Marani & Ignaccolo, 2015), which relaxes the asymptotic or Poisson/GPD assumptions and makes use of all available observations. We then illustrate the flexibility of the MEVD by applying it to daily precipitation, hurricane intensity, and storm surge magnitude. Application to daily rainfall from a global raingauge network shows that MEVD estimates are 50% more accurate than those from GEV when the recurrence interval of interest is much greater than the observational period. This makes MEVD suited for application to satellite rainfall observations (~20 yrs length). Use of MEVD on TRMM data yields extreme event patterns that are in better agreement with surface observations than corresponding GEV estimates. Applied to the HURDAT2 Atlantic hurricane intensity dataset, MEVD significantly outperforms GEV estimates of extreme hurricanes. Interestingly, the Generalized Pareto distribution used for "ordinary" hurricane intensity points to the existence of a maximum limit wind speed that is significantly smaller than corresponding physically-based estimates. Finally, we applied the MEVD approach to water levels generated by tidal fluctuations and storm surges at a set of coastal sites spanning different storm-surge regimes. MEVD yields accurate estimates of large quantiles and inferences on the tail thickness (fat vs. thin) of the underlying distribution of "ordinary" surges. In summary, the MEVD approach presents a number of theoretical and practical advantages, and outperforms traditional approaches in several applications. We conclude that the MEVD is a significant contribution to further generalizing extreme value theory, with implications for a broad range of Earth Sciences.
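The core of the MEVD can be written compactly: the yearly-maximum CDF is the average, over the observed years, of the ordinary-event CDF raised to that year's event count. The sketch below assumes Weibull ordinary events (a common choice for daily rainfall); the parameters and counts are invented, not fitted values from the study:

```python
import math

def weibull_cdf(x, scale, shape):
    """CDF of a Weibull 'ordinary event' distribution."""
    return 1.0 - math.exp(-((x / scale) ** shape))

def mevd_cdf(x, yearly_params):
    """MEVD yearly-maximum CDF.
    yearly_params: one (n_events, scale, shape) tuple per observed year."""
    terms = [weibull_cdf(x, c, w) ** n for n, c, w in yearly_params]
    return sum(terms) / len(terms)

# Three hypothetical years of daily-rainfall statistics
p = mevd_cdf(120.0, [(23, 9.0, 0.75), (31, 8.0, 0.80), (27, 10.0, 0.70)])
```

Inverting `mevd_cdf` numerically at 1 - 1/T then gives the T-year quantile, which is how the large-quantile estimates above would be produced.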
ERIC Educational Resources Information Center
Kinnier, Richard T.
1984-01-01
Examined the resolution of value conflicts in 60 adults who wrote a solution to their conflicts. Compared extreme resolutions with those representing compromise. Compromisers and extremists did not differ in how rationally resolved they were about their solutions but compromisers felt better about their solutions. (JAC)
Van Lieshout, Ryan J; Ferro, Mark A; Schmidt, Louis A; Boyle, Michael H; Saigal, Saroj; Morrison, Katherine M; Mathewson, Karen J
2018-04-18
Individuals born extremely preterm are exposed to significant perinatal stresses that are associated with an increased risk of psychopathology. However, a paucity of longitudinal studies has prevented the empirical examination of long-term, dynamic effects of perinatal adversity on mental health. Here, internalizing and externalizing problems from adolescence through adulthood were compared in individuals born at extremely low birth weight (ELBW; <1,000 g) and normal birth weight (NBW; >2,500 g). Internalizing and externalizing data were collected over 20 years in three waves, during adolescence, young adulthood, and adulthood. Growth models were used to compare longitudinal trajectories in a geographically based sample of 151 ELBW survivors and 137 NBW control participants born between 1977 and 1982 matched for age, sex, and socioeconomic status at age 8. After adjusting for sex, socioeconomic and immigrant status, and family functioning, ELBW survivors failed to show the normative, age-related decline in internalizing problems over time relative to their NBW peers (β = .21; p < .01). Both groups exhibited small declines in externalizing problems over the same period. Self-esteem (but not physical health, IQ, or maternal mood) partially mediated the association between ELBW status and internalizing problems. Extremely low birth weight survivors experienced a blunting of the expected improvement in depression and anxiety from adolescence to adulthood. These findings suggest that altered physiological regulatory systems supporting emotional and cognitive processing may contribute to the maintenance of internalizing problems in this population. © 2018 Association for Child and Adolescent Mental Health.
Modeling extreme PM10 concentration in Malaysia using generalized extreme value distribution
NASA Astrophysics Data System (ADS)
Hasan, Husna; Mansor, Nadiah; Salleh, Nur Hanim Mohd
2015-05-01
Extreme PM10 concentration from the Air Pollutant Index (API) at thirteen monitoring stations in Malaysia is modeled using the Generalized Extreme Value (GEV) distribution. The data are blocked into monthly periods. The Mann-Kendall (MK) test suggests a non-stationary model, so two models are considered for the stations with a trend. The likelihood ratio test is used to determine the best-fitting model, and the result shows that only two stations favor the non-stationary model (Model 2) while the other eleven stations favor the stationary model (Model 1). The return level of PM10 concentration expected to be exceeded once within a selected period is obtained.
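The likelihood-ratio model selection mentioned above compares the two nested fits via their deviance against a chi-square critical value. A sketch of the decision rule only; the log-likelihoods here are placeholders, not the study's fitted values:

```python
# 95% critical value of the chi-square distribution with 1 degree of freedom
CHI2_1_95 = 3.841

def prefer_nonstationary(loglik_stationary, loglik_nonstationary):
    """Likelihood-ratio test for one extra parameter (e.g. a trend in the
    GEV location): keep the non-stationary model only if the deviance
    exceeds the chi-square critical value."""
    deviance = 2.0 * (loglik_nonstationary - loglik_stationary)
    return deviance > CHI2_1_95

# Placeholder log-likelihoods for one station:
use_model2 = prefer_nonstationary(-412.0, -408.5)  # deviance 7.0: Model 2
```

In the study, this test favored Model 2 at only two of the thirteen stations.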
Anderson, Cynthia M.; Kincaid, Donald
2005-01-01
School discipline is a growing concern in the United States. Educators frequently are faced with discipline problems ranging from infrequent but extreme problems (e.g., shootings) to less severe problems that occur at high frequency (e.g., bullying, insubordination, tardiness, and fighting). Unfortunately, teachers report feeling ill prepared to deal effectively with discipline problems in schools. Further, research suggests that many commonly used strategies, such as suspension, expulsion, and other reactive strategies, are not effective for ameliorating discipline problems and may, in fact, make the situation worse. The principles and technology of behavior analysis have been demonstrated to be extremely effective for decreasing problem behavior and increasing social skills exhibited by school children. Recently, these principles and techniques have been applied at the level of the entire school, in a movement termed schoolwide positive behavior support. In this paper we review the tenets of schoolwide positive behavior support, demonstrating the relation between this technology and applied behavior analysis. PMID:22478439
Vacuum statistics and stability in axionic landscapes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Masoumi, Ali; Vilenkin, Alexander, E-mail: ali@cosmos.phy.tufts.edu, E-mail: vilenkin@cosmos.phy.tufts.edu
2016-03-01
We investigate vacuum statistics and stability in random axionic landscapes. For this purpose we developed an algorithm for a quick evaluation of the tunneling action, which in most cases is accurate within 10%. We find that stability of a vacuum is strongly correlated with its energy density, with lifetime rapidly growing as the energy density is decreased. On the other hand, the probability P(B) for a vacuum to have a tunneling action B greater than a given value declines as a slow power law in B. This is in sharp contrast with the studies of random quartic potentials, which found a fast exponential decline of P(B). Our results suggest that the total number of relatively stable vacua (say, with B>100) grows exponentially with the number of fields N and can get extremely large for N ≳ 100. The problem with this kind of model is that the stable vacua are concentrated near the absolute minimum of the potential, so the observed value of the cosmological constant cannot be explained without fine-tuning. To address this difficulty, we consider a modification of the model, where the axions acquire a quadratic mass term, due to their mixing with 4-form fields. This results in a larger landscape with a much broader distribution of vacuum energies. The number of relatively stable vacua in such models can still be extremely large.
Rate of digesta passage in the philippine flying lemur, Cynocephalus volans.
Wischusen, E W; Ingle, N; Richmond, M E
1994-01-01
The rate of digesta passage was measured in five captive Philippine flying lemurs (Cynocephalus volans). These animals were force-fed capsules containing known quantities of either particulate or soluble markers. The volumes of the gastrointestinal tracts of three flying lemurs were determined based on the wet weight of the contents of each section of the gut. The mean rate of digesta passage was 14.37 ± 3.31 h when determined using the particulate marker and 21.9 ± 0.03 h when determined using the soluble marker. The values based on the particulate marker are between 2% and 10% of similar values for other arboreal folivores. The morphology of the gastrointestinal system of the Philippine flying lemur is similar to that of other hindgut fermenters. Flying lemurs have a simple stomach and a large caecum. The total gut capacity of the Philippine flying lemur is similar to that of other herbivores, but is slightly smaller than that of either the koala (Phascolarctos cinereus), a hindgut fermenter, or the three-toed sloth (Bradypus variegatus), a foregut fermenter. These data suggest that flying lemurs deal with the problems of a folivorous diet very differently from some other arboreal mammals. Phascolarctos cinereus and Bradypus variegatus may represent one extreme, with Cynocephalus volans representing the other, along a continuum of foraging strategies that are compatible with the arboreal folivore lifestyle.
More tornadoes in the most extreme U.S. tornado outbreaks.
Tippett, Michael K; Lepore, Chiara; Cohen, Joel E
2016-12-16
Tornadoes and severe thunderstorms kill people and damage property every year. Estimated U.S. insured losses due to severe thunderstorms in the first half of 2016 were $8.5 billion (US). The largest U.S. effects of tornadoes result from tornado outbreaks, which are sequences of tornadoes that occur in close succession. Here, using extreme value analysis, we find that the frequency of U.S. outbreaks with many tornadoes is increasing and that it is increasing faster for more extreme outbreaks. We model this behavior by extreme value distributions with parameters that are linear functions of time or of some indicators of multidecadal climatic variability. Extreme meteorological environments associated with severe thunderstorms show consistent upward trends, but the trends do not resemble those currently expected to result from global warming. Copyright © 2016, American Association for the Advancement of Science.
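One way to read "parameters that are linear functions of time" above is a GEV whose location drifts linearly, so the exceedance probability of a fixed outbreak size grows year by year. A sketch with invented parameter values, not the paper's fitted model:

```python
import math

def gev_cdf(x, mu, sigma, xi):
    """GEV cumulative distribution (Gumbel limit when xi ~ 0)."""
    if abs(xi) < 1e-9:
        return math.exp(-math.exp(-(x - mu) / sigma))
    t = 1.0 + xi * (x - mu) / sigma
    return math.exp(-t ** (-1.0 / xi)) if t > 0 else 0.0

def outbreak_exceed_prob(x, year, mu0, mu1, sigma, xi):
    """P(yearly max outbreak size > x) with location mu0 + mu1 * year."""
    return 1.0 - gev_cdf(x, mu0 + mu1 * year, sigma, xi)

# With an upward location trend, the same outbreak size becomes more likely:
p_start = outbreak_exceed_prob(30.0, 0, 10.0, 0.5, 5.0, 0.1)
p_later = outbreak_exceed_prob(30.0, 10, 10.0, 0.5, 5.0, 0.1)
```

The paper also considers multidecadal climate indicators in place of the bare time covariate `year`.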
Regularization and computational methods for precise solution of perturbed orbit transfer problems
NASA Astrophysics Data System (ADS)
Woollands, Robyn Michele
The author has developed a suite of algorithms for solving the perturbed Lambert's problem in celestial mechanics. These algorithms have been implemented as a parallel computation tool that has broad applicability. This tool is composed of four component algorithms and each provides unique benefits for solving a particular type of orbit transfer problem. The first one utilizes a Keplerian solver (a-iteration) for solving the unperturbed Lambert's problem. This algorithm not only provides a "warm start" for solving the perturbed problem but is also used to identify which of several perturbed solvers is best suited for the job. The second algorithm solves the perturbed Lambert's problem using a variant of the modified Chebyshev-Picard iteration initial value solver that solves two-point boundary value problems. This method converges over about one third of an orbit and does not require a Newton-type shooting method and thus no state transition matrix needs to be computed. The third algorithm makes use of regularization of the differential equations through the Kustaanheimo-Stiefel transformation and extends the domain of convergence over which the modified Chebyshev-Picard iteration two-point boundary value solver will converge, from about one third of an orbit to almost a full orbit. This algorithm also does not require a Newton-type shooting method. The fourth algorithm uses the method of particular solutions and the modified Chebyshev-Picard iteration initial value solver to solve the perturbed two-impulse Lambert problem over multiple revolutions. The method of particular solutions is a shooting method but differs from the Newton-type shooting methods in that it does not require integration of the state transition matrix. The mathematical developments that underlie these four algorithms are derived in the chapters of this dissertation. 
For each of the algorithms, some orbit transfer test cases are included to provide insight on the accuracy and efficiency of these individual algorithms. Following this discussion, the combined parallel algorithm, known as the unified Lambert tool, is presented, and an explanation is given as to how it automatically selects which of the three perturbed solvers to use to compute the perturbed solution for a particular orbit transfer. The unified Lambert tool may be used to determine a single orbit transfer or for generating an extremal field map. A case study is presented for a mission that is required to rendezvous with two pieces of orbit debris (spent rocket boosters). The unified Lambert tool software developed in this dissertation is already being utilized by several industrial partners, and we are confident that it will play a significant role in practical applications, including the solution of Lambert problems that arise in current applications focused on enhanced space situational awareness.
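The Chebyshev-Picard machinery above builds on classical Picard iteration: the initial value problem is recast as an integral equation and iterated to convergence. A bare-bones sketch, using trapezoidal quadrature on a uniform grid rather than Chebyshev polynomials, and a toy ODE rather than perturbed orbital dynamics:

```python
def picard_iterate(f, t, y0, sweeps):
    """Iterate y <- y0 + integral of f(t, y) on the grid t (trapezoid rule)."""
    y = [y0] * len(t)
    for _ in range(sweeps):
        new = [y0]
        acc = 0.0
        for i in range(1, len(t)):
            h = t[i] - t[i - 1]
            acc += 0.5 * h * (f(t[i - 1], y[i - 1]) + f(t[i], y[i]))
            new.append(y0 + acc)
        y = new
    return y

n = 11
grid = [i / (n - 1) for i in range(n)]  # uniform grid on [0, 1]
y = picard_iterate(lambda t, y: y, grid, 1.0, sweeps=20)
# y[-1] approximates e = 2.71828... to quadrature accuracy
```

Each sweep updates the whole trajectory at once, which is why no Newton-type shooting or state transition matrix is needed; the actual method replaces the grid and quadrature with Chebyshev polynomial machinery.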
A Generalized Framework for Non-Stationary Extreme Value Analysis
NASA Astrophysics Data System (ADS)
Ragno, E.; Cheng, L.; Sadegh, M.; AghaKouchak, A.
2017-12-01
Empirical trends in climate variables including precipitation, temperature, and snow-water equivalent at regional to continental scales are evidence of changes in climate over time. The evolving climate conditions and human activity-related factors such as urbanization and population growth can exert further changes in weather and climate extremes. As a result, the scientific community faces an increasing demand for updated appraisal of time-varying climate extremes. The purpose of this study is to offer a robust and flexible statistical tool for non-stationary extreme value analysis which can better characterize the severity and likelihood of extreme climatic variables. This is critical to ensure a more resilient environment in a changing climate. Following the positive feedback on the first version of the Non-Stationary Extreme Value Analysis (NEVA) toolbox by Cheng et al. (2014), we present an improved version, NEVA2.0. The upgraded version builds upon a newly developed hybrid evolution Markov Chain Monte Carlo (MCMC) approach for numerical parameter estimation and uncertainty assessment. This addition leads to more robust uncertainty estimates of return levels, return periods, and risks of climatic extremes under both stationary and non-stationary assumptions. Moreover, NEVA2.0 is flexible in incorporating any user-specified covariate other than the default time covariate (e.g., CO2 emissions, large-scale climatic oscillation patterns). The new feature will allow users to examine non-stationarity of extremes induced by physical conditions that underlie the extreme events (e.g. antecedent soil moisture deficit, large-scale climatic teleconnections, urbanization). In addition, the new version offers an option to generate stationary and/or non-stationary rainfall Intensity-Duration-Frequency (IDF) curves that are widely used for risk assessment and infrastructure design.
Finally, a Graphical User Interface (GUI) of the package is provided, making NEVA accessible to a broader audience.
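The MCMC parameter estimation at the heart of NEVA can be illustrated with a minimal random-walk Metropolis sampler. The target below is a simple Gaussian stand-in, not NEVA's hybrid-evolution sampler or a real GEV likelihood:

```python
import math
import random

def metropolis(log_post, x0, steps, scale, rng):
    """Random-walk Metropolis: accept each proposal with probability
    min(1, posterior ratio), recorded in log space."""
    xs, x, lp = [], x0, log_post(x0)
    for _ in range(steps):
        prop = x + rng.gauss(0.0, scale)
        lp_prop = log_post(prop)
        if math.log(rng.random()) < lp_prop - lp:
            x, lp = prop, lp_prop
        xs.append(x)
    return xs

rng = random.Random(0)
# Standard-normal "posterior" as a stand-in target density
samples = metropolis(lambda x: -0.5 * x * x, 5.0, 2000, 1.0, rng)
post_mean = sum(samples[500:]) / len(samples[500:])  # near 0 after burn-in
```

In NEVA the sampled quantity would be the vector of (possibly covariate-dependent) GEV parameters, and the retained chain yields the uncertainty bands on return levels.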
Extreme Value Analysis of hydro meteorological extremes in the ClimEx Large-Ensemble
NASA Astrophysics Data System (ADS)
Wood, R. R.; Martel, J. L.; Willkofer, F.; von Trentini, F.; Schmid, F. J.; Leduc, M.; Frigon, A.; Ludwig, R.
2017-12-01
Many studies show an increase in the magnitude and frequency of hydrological extreme events in the course of climate change. However, the contribution of natural variability to the magnitude and frequency of hydrological extreme events is not yet settled. A reliable estimate of extreme events is of great interest for water management and public safety. In the course of the ClimEx Project (www.climex-project.org), a new single-model large ensemble was created by dynamically downscaling the CanESM2 large ensemble with the Canadian Regional Climate Model version 5 (CRCM5) for a European domain and a northeastern North American domain. The ClimEx 50-member large ensemble (CRCM5 driven by the CanESM2 large ensemble) makes a thorough analysis of natural variability in extreme events possible. Are current extreme value statistical methods able to account for natural variability? How large is the natural variability for, e.g., a 1/100-year return period derived from a 50-member large ensemble for Europe and northeastern North America? These questions are addressed by applying various generalized extreme value (GEV) distributions to the ClimEx large ensemble. Various return levels (5-, 10-, 20-, 30-, 60-, and 100-year), based on time series of various lengths (20, 30, 50, 100, and 1500 years), are analyzed for the maximum one-day precipitation (RX1d), the maximum three-hourly precipitation (RX3h), and the streamflow of selected catchments in Europe. The long time series of the ClimEx ensemble (7500 years) allows us to give a first reliable estimate of the magnitude and frequency of certain extreme events.
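The natural-variability question above can be made concrete: the same "1/100-year" level estimated from each ensemble member scatters widely even when every member comes from an identical climate. In this sketch each "member" is a synthetic sample from one fixed distribution, so all spread is pure sampling variability:

```python
import random

def empirical_return_level(sample, T):
    """Empirical T-year level: the (1 - 1/T) quantile of yearly maxima."""
    s = sorted(sample)
    k = min(len(s) - 1, round((1.0 - 1.0 / T) * (len(s) - 1)))
    return s[k]

random.seed(42)
# 50 synthetic "members", each 100 years of yearly maxima drawn from the
# same distribution (invented numbers, not CRCM5 output)
members = [[random.gauss(100.0, 25.0) for _ in range(100)] for _ in range(50)]
levels = [empirical_return_level(m, 100) for m in members]
spread = max(levels) - min(levels)  # member-to-member "natural variability"
```

Pooling all 50 members into one 5000-year sample before estimating the quantile shrinks this spread, which is the advantage of the 7500-year ensemble record.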
Uniformly high-order accurate non-oscillatory schemes, 1
NASA Technical Reports Server (NTRS)
Harten, A.; Osher, S.
1985-01-01
The construction and the analysis of nonoscillatory shock capturing methods for the approximation of hyperbolic conservation laws was begun. These schemes share many desirable properties with total variation diminishing (TVD) schemes, but TVD schemes have at most first order accuracy, in the sense of truncation error, at extrema of the solution. A uniformly second order approximation was constructed, which is nonoscillatory in the sense that the number of extrema of the discrete solution is not increasing in time. This is achieved via a nonoscillatory piecewise linear reconstruction of the solution from its cell averages, time evolution through an approximate solution of the resulting initial value problem, and averaging of this approximate solution over each cell.
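The piecewise-linear reconstruction step can be sketched with a slope limiter. The minmod limiter below is one classic choice used to keep reconstructions nonoscillatory, offered here only as an illustration of the idea, not as the paper's specific construction:

```python
def minmod(a, b):
    """Zero at a sign change, otherwise the smaller-magnitude slope."""
    if a * b <= 0.0:
        return 0.0
    return a if abs(a) < abs(b) else b

def limited_slopes(cell_avgs):
    """One limited slope per cell; boundary cells get zero slope."""
    s = [0.0]
    for i in range(1, len(cell_avgs) - 1):
        s.append(minmod(cell_avgs[i] - cell_avgs[i - 1],
                        cell_avgs[i + 1] - cell_avgs[i]))
    s.append(0.0)
    return s

smooth = limited_slopes([0.0, 1.0, 2.0, 3.0])  # slopes follow the data
jump = limited_slopes([1.0, 1.0, 0.0, 0.0])    # limiter flattens the jump
```

Flattening the slope at a discontinuity is exactly what prevents the reconstruction from creating new extrema, at the cost of reduced accuracy there, which is the limitation the uniformly second-order schemes above are designed to overcome.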
Visualizing Statistical Mix Effects and Simpson's Paradox.
Armstrong, Zan; Wattenberg, Martin
2014-12-01
We discuss how "mix effects" can surprise users of visualizations and potentially lead them to incorrect conclusions. This statistical issue (also known as "omitted variable bias" or, in extreme cases, as "Simpson's paradox") is widespread and can affect any visualization in which the quantity of interest is an aggregated value such as a weighted sum or average. Our first contribution is to document how mix effects can be a serious issue for visualizations, and we analyze how mix effects can cause problems in a variety of popular visualization techniques, from bar charts to treemaps. Our second contribution is a new technique, the "comet chart," that is meant to ameliorate some of these issues.
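The mix effect described above can be shown with a tiny weighted-average example (numbers invented for illustration): every group's mean rises, yet the aggregate falls because the weights shift toward the low-mean group.

```python
def overall_avg(groups):
    """Weighted average over (count, group_mean) pairs."""
    total = sum(n for n, _ in groups)
    return sum(n * m for n, m in groups) / total

# Every group's mean goes up (10 -> 12, 50 -> 52)...
before = [(10, 10.0), (90, 50.0)]
after = [(90, 12.0), (10, 52.0)]
# ...but the aggregate drops from 46.0 to 16.0 because the
# weights shifted toward the low-mean group.
```

Any chart that plots only `overall_avg` would show a decline, which is precisely the misleading reading the comet chart is designed to expose.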
NASA Astrophysics Data System (ADS)
Rieder, H. E.; Staehelin, J.; Maeder, J. A.; Ribatet, M.; Davison, A. C.
2009-04-01
Various generations of satellites (e.g. TOMS, GOME, OMI) made spatial datasets of column ozone available to the scientific community. This study has a special focus on column ozone over the northern mid-latitudes. Tools from geostatistics and extreme value theory are applied to analyze variability, long-term trends and frequency distributions of extreme events in total ozone. In a recent case study (Rieder et al., 2009) new tools from extreme value theory (Coles, 2001; Ribatet, 2007) have been applied to the world's longest total ozone record from Arosa, Switzerland (e.g. Staehelin 1998a,b), in order to describe extreme events in low and high total ozone. Within the current study this analysis is extended to satellite datasets for the northern mid-latitudes. Further, special emphasis is given to patterns and spatial correlations and the influence of changes in atmospheric dynamics (e.g. tropospheric and lower stratospheric pressure systems) on column ozone. References: Coles, S.: An Introduction to Statistical Modeling of Extreme Values, Springer Series in Statistics, ISBN:1852334592, Springer, Berlin, 2001. Ribatet, M.: POT: Modelling peaks over a threshold, R News, 7, 34-36, 2007. Rieder, H.E., Staehelin, J., Maeder, J.A., Ribatet, M., Stübi, R., Weihs, P., Holawe, F., Peter, T., and Davison, A.C.: From ozone mini holes and mini highs towards extreme value theory: New insights from extreme events and non stationarity, submitted to J. Geophys. Res., 2009. Staehelin, J., Kegel, R., and Harris, N. R.: Trend analysis of the homogenized total ozone series of Arosa (Switzerland), 1929-1996, J. Geophys. Res., 103(D7), 8389-8400, doi:10.1029/97JD03650, 1998a. Staehelin, J., Renaud, A., Bader, J., McPeters, R., Viatte, P., Hoegger, B., Bugnion, V., Giroud, M., and Schill, H.: Total ozone series at Arosa (Switzerland): Homogenization and data comparison, J. Geophys. Res., 103(D5), 5827-5842, doi:10.1029/97JD02402, 1998b.
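The peaks-over-threshold approach behind the cited POT package models exceedances over a high threshold with the generalized Pareto distribution. A minimal sketch; the parameters are illustrative, not fitted ozone values:

```python
import math

def gpd_survival(z, sigma, xi):
    """P(excess > z) for a generalized Pareto excess z >= 0."""
    if abs(xi) < 1e-9:                # exponential limit
        return math.exp(-z / sigma)
    arg = 1.0 + xi * z / sigma
    return arg ** (-1.0 / xi) if arg > 0 else 0.0

def prob_above(x, threshold, exceed_rate, sigma, xi):
    """Unconditional P(X > x) for x above the threshold, where
    exceed_rate is the fraction of observations over the threshold."""
    return exceed_rate * gpd_survival(x - threshold, sigma, xi)
```

Fitting `sigma` and `xi` to the ozone exceedances (low-ozone events are handled symmetrically, by thresholding from below) is what lets the study characterize the frequency of mini holes and mini highs.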
Web Based Information System for Job Training Activities Using Personal Extreme Programming (PXP)
NASA Astrophysics Data System (ADS)
Asri, S. A.; Sunaya, I. G. A. M.; Rudiastari, E.; Setiawan, W.
2018-01-01
Job training is a subject in universities and polytechnics that involves many users and reporting activities. Time and distance make it difficult for users to report and fulfil their obligations during job training, owing to the locations where the training takes place. This research developed a web-based information system for job training to overcome these problems. The system was developed using Personal Extreme Programming (PXP). PXP is an agile method that combines Extreme Programming (XP) and the Personal Software Process (PSP). The information system was developed and tested: 24% of users strongly agree, 74% agree, 1% disagree, and 0% strongly disagree regarding system functionality.
New Insights into the Estimation of Extreme Geomagnetic Storm Occurrences
NASA Astrophysics Data System (ADS)
Ruffenach, Alexis; Winter, Hugo; Lavraud, Benoit; Bernardara, Pietro
2017-04-01
Space weather events such as intense geomagnetic storms are major disturbances of the near-Earth environment that may lead to serious impacts on our modern society. As such, it is of great importance to estimate their probability, and in particular that of extreme events. One approach widely used in the statistical sciences for estimating the probability of extreme events is Extreme Value Analysis (EVA). Using this rigorous statistical framework, estimations of the occurrence of extreme geomagnetic storms are performed here based on the most relevant global parameters related to geomagnetic storms, such as ground parameters (e.g. the geomagnetic Dst and aa indices) and space parameters related to the characteristics of Coronal Mass Ejections (CMEs) (velocity, southward magnetic field component, electric field). Using our fitted model, we estimate the annual probability of a Carrington-type event (Dst = -850 nT) to be on the order of 10⁻³, with a lower limit of the uncertainties on the return period of ~500 years. Our estimate is significantly higher than that of most past studies, which typically had a return period of a few hundred years at most. Thus, precautions are required when extrapolating to intense values. Currently, the complexity of the processes and the length of available data inevitably lead to significant uncertainties in return period estimates for the occurrence of extreme geomagnetic storms. However, our application of extreme value models for extrapolating into the tail of the distribution provides a mathematically justified framework for the estimation of extreme return periods, thereby enabling more accurate estimates and reduced associated uncertainties.
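The return-period arithmetic behind the estimate above is simple: an annual exceedance probability p corresponds to a return period of 1/p years, so a probability on the order of 10⁻³ per year implies a return period on the order of 1000 years (the paper's lower uncertainty bound is ~500 years).

```python
def return_period(annual_prob):
    """Return period (years) for an annual exceedance probability."""
    return 1.0 / annual_prob

def prob_within(annual_prob, years):
    """P(at least one event in the horizon), assuming independent years."""
    return 1.0 - (1.0 - annual_prob) ** years

carrington = return_period(1e-3)       # 1000-year return period
risk_century = prob_within(1e-3, 100)  # roughly a 1-in-10 chance per century
```

The independence-of-years assumption is a simplification; clustered solar activity would change `prob_within` but not the definition of the return period.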
Baral, Sushil C; Aryal, Yeshoda; Bhattrai, Rekha; King, Rebecca; Newell, James N
2014-01-17
People with multi-drug resistant tuberculosis (MDR-TB) in low-income countries face many problems during treatment, and cure rates are low. The purpose of the study was (a) to identify and document the problems experienced by people receiving care for MDR-TB, and how they cope when support is not provided, to inform development of strategies; (b) to estimate the effectiveness of two resultant strategies, counselling alone, and joint counselling and financial support, in increasing DOTS-plus treatment success under routine programme conditions. A mixed-method study comprising a formative qualitative study, pilot intervention study and explanatory qualitative study to better understand barriers to completion of treatment for MDR-TB. Participants were all people starting MDR-TB treatment in seven DOTS-plus centres in the Kathmandu Valley, Nepal, during January to December 2008. The primary outcome measure was cure, as internationally defined. MDR-TB treatment caused extreme social, financial and employment hardship. Most patients had to move house and leave their job, and reported major stigmatisation. They were concerned about the long-term effects of their disease, and feared infecting others. In the resultant pilot intervention study, the two strategies appeared to improve treatment outcomes: cure rates for those receiving counselling, combined support and no support were 85%, 76% and 67% respectively. Compared with no support, the (adjusted) risk ratios of cure for those receiving counselling and receiving combined support were 1.2 (95% CI 1.0 to 1.6) and 1.2 (95% CI 0.9 to 1.6) respectively. The explanatory study demonstrated that patients valued both forms of support. MDR-TB patients are extremely vulnerable to stigma and extreme financial hardship. Provision of counselling and financial support may not only reduce their vulnerability, but also increase cure rates.
National Tuberculosis Programmes should consider incorporating financial support and counselling into MDR-TB care: costs are low, and benefits high, especially since costs to society of incomplete treatment and potential for incurable TB are extremely high.
Visual Analysis among Novices: Training and Trend Lines as Graphic Aids
ERIC Educational Resources Information Center
Nelson, Peter M.; Van Norman, Ethan R.; Christ, Theodore J.
2017-01-01
The current study evaluated the degree to which novice visual analysts could discern trends in simulated time-series data across differing levels of variability and extreme values. Forty-five novice visual analysts were trained in general principles of visual analysis. One group received brief training on how to identify and omit extreme values.…
Implementing Extreme Value Analysis in a Geospatial Workflow for Storm Surge Hazard Assessment
NASA Astrophysics Data System (ADS)
Catelli, J.; Nong, S.
2014-12-01
Gridded data of 100-yr (1%) and 500-yr (0.2%) storm surge flood elevations for the United States Gulf of Mexico and East Coasts are critical to understanding this natural hazard. Storm surge heights were calculated across the study area utilizing SLOSH (Sea, Lake, and Overland Surges from Hurricanes) model data for thousands of synthetic US landfalling hurricanes. Based on the results derived from SLOSH, a series of interpolations was performed using spatial analysis in a geographic information system (GIS) at both the SLOSH-basin and synthetic-event levels. The result was a single grid of maximum flood elevations for each synthetic event. This project addresses the need to apply extreme value theory in a geospatial environment to analyze coincident cells across multiple synthetic events. The results are 100-yr (1%) and 500-yr (0.2%) values for each grid cell in the study area. This talk details a geospatial approach that moves raster data into NumPy array structures using the Python programming language. The data are then connected through a Python library to an outside statistical package such as R, to fit cell values to extreme-value distributions and return values for specified recurrence intervals. While this is not a new process, the value of this work lies in the ability to keep the whole process in a single geospatial environment and to easily replicate it for other natural hazard applications and extreme event modeling.
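The per-cell fitting step can be sketched without leaving Python. In this minimal illustration a Gumbel (extreme value type I) fit by moments stands in for whatever distribution the outside statistical package would fit, and the event stack is synthetic rather than SLOSH output.

```python
import numpy as np

EULER = 0.5772156649  # Euler-Mascheroni constant

def gumbel_return_level(maxima, p):
    """Fit a Gumbel distribution by moments along axis 0 (events) and
    return, per grid cell, the level with annual exceedance probability p."""
    mean = maxima.mean(axis=0)
    std = maxima.std(axis=0, ddof=1)
    beta = std * np.sqrt(6.0) / np.pi          # scale parameter
    mu = mean - EULER * beta                   # location parameter
    return mu - beta * np.log(-np.log(1.0 - p))

# Hypothetical stack: 500 synthetic-event maxima over a 50 x 40 cell grid
rng = np.random.default_rng(0)
surge = rng.gumbel(loc=2.0, scale=0.5, size=(500, 50, 40))
level_100yr = gumbel_return_level(surge, p=0.01)   # 1% annual chance
level_500yr = gumbel_return_level(surge, p=0.002)  # 0.2% annual chance
```

Because the fit is vectorized over the cell axes, no per-cell loop (or round-trip to R) is needed for this simple case.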
On extreme points of the diffusion polytope
Hay, M. J.; Schiff, J.; Fisch, N. J.
2017-01-04
Here, we consider a class of diffusion problems defined on simple graphs in which the populations at any two vertices may be averaged if they are connected by an edge. The diffusion polytope is the convex hull of the set of population vectors attainable using finite sequences of these operations. A number of physical problems have linear programming solutions taking the diffusion polytope as the feasible region, e.g. the free energy that can be removed from plasma using waves, so there is a need to describe and enumerate its extreme points. We also review known results for the case of the complete graph K_n, and study a variety of problems for the path graph P_n and the cyclic graph C_n. Finally, we describe the different kinds of extreme points that arise, and identify the diffusion polytope in a number of simple cases. In the case of increasing initial populations on P_n the diffusion polytope is topologically an n-dimensional hypercube.
Value-at-risk estimation with wavelet-based extreme value theory: Evidence from emerging markets
NASA Astrophysics Data System (ADS)
Cifter, Atilla
2011-06-01
This paper introduces wavelet-based extreme value theory (EVT) for univariate value-at-risk estimation. Wavelets and EVT are combined for volatility forecasting to estimate a hybrid model. In the first stage, wavelets are used as a threshold in the generalized Pareto distribution, and in the second stage, EVT is applied with the wavelet-based threshold. This new model is applied to two major emerging stock markets: the Istanbul Stock Exchange (ISE) and the Budapest Stock Exchange (BUX). The relative performance of wavelet-based EVT is benchmarked against the Riskmetrics-EWMA, ARMA-GARCH, generalized Pareto distribution, and conditional generalized Pareto distribution models. The empirical results show that wavelet-based extreme value theory increases the predictive performance of financial forecasting according to the number of violations and tail-loss tests. The superior forecasting performance of the wavelet-based EVT model is also consistent with Basel II requirements, and this new model can be used by financial institutions as well.
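The peaks-over-threshold value-at-risk formula underlying such models can be sketched as follows. This is a generic GPD tail estimate with a fixed 95% quantile standing in for the wavelet-derived threshold of the paper, method-of-moments parameter estimates, and synthetic heavy-tailed losses; none of it reproduces the paper's actual model.

```python
import numpy as np

def var_pot(losses, threshold, q):
    """Value-at-risk at confidence level q from a peaks-over-threshold
    GPD tail fit (method-of-moments parameter estimates)."""
    exc = losses[losses > threshold] - threshold
    m, v = exc.mean(), exc.var(ddof=1)
    xi = 0.5 * (1.0 - m * m / v)            # GPD shape
    sigma = 0.5 * m * (m * m / v + 1.0)     # GPD scale
    tail_frac = exc.size / losses.size      # empirical P(loss > threshold)
    return threshold + sigma / xi * ((tail_frac / (1.0 - q)) ** xi - 1.0)

# Hypothetical daily-loss series with a heavy tail
rng = np.random.default_rng(7)
losses = rng.pareto(3.0, size=5000)
u = np.quantile(losses, 0.95)   # stand-in for the wavelet-based threshold
var99 = var_pot(losses, u, q=0.99)
```

The 99% VaR necessarily lies beyond the 95% threshold; backtesting (counting violations, as in the paper) would then assess the fit.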
Xiang, Yang; Lu, Kewei; James, Stephen L.; Borlawsky, Tara B.; Huang, Kun; Payne, Philip R.O.
2011-01-01
The Unified Medical Language System (UMLS) is the largest thesaurus in the biomedical informatics domain. Previous works have shown that knowledge constructs comprised of transitively-associated UMLS concepts are effective for discovering potentially novel biomedical hypotheses. However, the extremely large size of the UMLS becomes a major challenge for these applications. To address this problem, we designed a k-neighborhood Decentralization Labeling Scheme (kDLS) for the UMLS, and the corresponding method to effectively evaluate the kDLS indexing results. kDLS provides a comprehensive solution for indexing the UMLS for very efficient large scale knowledge discovery. We demonstrated that it is highly effective to use kDLS paths to prioritize disease-gene relations across the whole genome, with extremely high fold-enrichment values. To our knowledge, this is the first indexing scheme capable of supporting efficient large scale knowledge discovery on the UMLS as a whole. Our expectation is that kDLS will become a vital engine for retrieving information and generating hypotheses from the UMLS for future medical informatics applications. PMID:22154838
Ege, Tolga; Unlu, Aytekin; Tas, Huseyin; Bek, Dogan; Turkan, Selim; Cetinkaya, Aytac
2015-01-01
The decision between limb salvage and amputation is generally aided by trauma scoring systems such as the mangled extremity severity score (MESS). However, the reliability of such injury scores in the setting of open fractures due to explosives and missiles is challenging. Mortality and morbidity of extremity trauma due to firearms are generally associated with the time delay to revascularization, injury mechanism, anatomy of the injured site, associated injuries, age and environmental circumstances. The purpose of this retrospective study was to evaluate the extent of extremity injuries due to ballistic missiles and to assess the reliability of the MESS in both upper and lower extremities. Between 2004 and 2014, 139 Gustilo-Anderson type III open fractures of the upper and lower extremities were enrolled in the study. Data on patient age, firearm type, transport time from the field to the hospital (and the method of transport), injury severity scores, MESS scores, fracture types, amputation levels, bone fixation methods, and postoperative infections and complications were retrieved from the two level-2 trauma centers' databases. Sensitivity, specificity, and positive and negative predictive values of the MESS were calculated to assess its ability to guide the decision to amputate a mangled limb. Amputation was performed in 39 extremities and limb salvage was attempted in 100 extremities. The mean follow-up time was 14.6 months (range 6-32 months). In the amputated group, the mean MESS scores for the upper and lower extremity were 8.8 (range 6-11) and 9.24 (range 6-11), respectively. In the limb salvage group, the mean MESS scores for the upper and lower extremities were 5.29 (range 4-7) and 5.19 (range 3-8), respectively. Sensitivity of the MESS in the upper and lower extremities was 80% and 79.4%, and positive predictive values were 55.55% and 83.3%, respectively.
Specificity of the MESS score for the upper and lower extremities was 84% and 86.6%, and negative predictive values were 95.45% and 90.2%, respectively. The MESS is not predictive in combat-related extremity injuries, especially for scores between 6 and 8. Limb ischemia and the presence or absence of shock can be used in initial decision-making for amputation.
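The four diagnostic statistics reported above come straight from a 2x2 confusion table. A minimal sketch, using hypothetical counts rather than the study's data:

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 confusion table
    (here, test positive = MESS at or above the amputation cut-off)."""
    return {
        "sensitivity": tp / (tp + fn),   # amputations correctly flagged
        "specificity": tn / (tn + fp),   # salvaged limbs correctly cleared
        "ppv": tp / (tp + fp),           # flagged limbs truly needing amputation
        "npv": tn / (tn + fn),           # cleared limbs truly salvageable
    }

# Hypothetical counts for illustration only
m = diagnostic_metrics(tp=27, fp=6, tn=80, fn=7)
```

A high NPV with a modest PPV, as in the study, means a low score argues well for salvage while a high score alone is weak grounds for amputation.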
NASA Astrophysics Data System (ADS)
Staehelin, J.; Rieder, H. E.; Maeder, J. A.; Ribatet, M.; Davison, A. C.; Stübi, R.
2009-04-01
Atmospheric ozone protects the biota living at the Earth's surface from harmful solar UV-B and UV-C radiation. The global ozone shield is expected to gradually recover from the anthropogenic disturbance of ozone depleting substances (ODS) in the coming decades. The extratropical stratospheric ozone layer might significantly increase above the thickness of the chemically undisturbed atmosphere, which could enhance ozone concentrations at tropopause altitudes, where ozone is an important greenhouse gas. At Arosa, a resort village in the Swiss Alps, total ozone measurements started in 1926, yielding the longest total ozone series in the world. One Féry spectrograph and seven Dobson spectrophotometers were operated at Arosa, and the method used to homogenize the series will be presented. Due to its unique length, the series allows studying total ozone in the chemically undisturbed as well as in the ODS-loaded stratosphere. The series is particularly valuable for studying natural variability in the period prior to 1970, when ODS started to affect stratospheric ozone. Concepts developed in extreme value statistics allow objective definitions of "ozone extreme high" and "ozone extreme low" values by fitting the (daily mean) time series using the Generalized Pareto Distribution (GPD). Extreme high ozone events can be attributed to effects of El Niño and/or the NAO, whereas in the chemically disturbed stratosphere high frequencies of extreme low total ozone values occur simultaneously with periods of strong polar ozone depletion (identified by statistical modeling with Equivalent Stratospheric Chlorine times the Volume of Polar Stratospheric Clouds) and volcanic eruptions (such as El Chichón and Pinatubo).
Variability in winter climate and winter extremes reduces population growth of an alpine butterfly.
Roland, Jens; Matter, Stephen F
2013-01-01
We examined the long-term, 15-year pattern of population change in a network of 21 Rocky Mountain populations of Parnassius smintheus butterflies in response to climatic variation. We found that winter values of the broadscale climate variable, the Pacific Decadal Oscillation (PDO) index, were a strong predictor of annual population growth, much more so than were endogenous biotic factors related to population density. The relationship between PDO and population growth was nonlinear. Populations declined in years with extreme winter PDO values, when there were either extremely warm or extremely cold sea surface temperatures in the eastern Pacific relative to that in the western Pacific. Results suggest that more variable winters, and more frequent extremely cold or warm winters, will result in more frequent decline of these populations, a pattern exacerbated by the trend for increasingly variable winters seen over the past century.
Validation of extremes within the Perfect-Predictor Experiment of the COST Action VALUE
NASA Astrophysics Data System (ADS)
Hertig, Elke; Maraun, Douglas; Wibig, Joanna; Vrac, Mathieu; Soares, Pedro; Bartholy, Judith; Pongracz, Rita; Mares, Ileana; Gutierrez, Jose Manuel; Casanueva, Ana; Alzbutas, Robertas
2016-04-01
Extreme events are of widespread concern due to their damaging consequences on natural and anthropogenic systems. From science to applications the statistical attributes of rare and infrequent occurrence and low probability become connected with the socio-economic aspect of strong impact. Specific end-user needs regarding information about extreme events depend on the type of application, but as a joining element there is always the request for easily accessible climate change information with a clear description of their uncertainties and limitations. Within the Perfect-Predictor Experiment of the COST Action VALUE extreme indices modelled from a wide range of downscaling methods are compared to reference indices calculated from observational data. The experiment uses reference data from a selection of 86 weather stations representative of the different climates in Europe. Results are presented for temperature and precipitation extremes and include aspects of the marginal distribution as well as spell-length related aspects.
Climate Change Impact on Variability of Rainfall Intensity in Upper Blue Nile Basin, Ethiopia
NASA Astrophysics Data System (ADS)
Worku, L. Y.
2015-12-01
Extreme rainfall events are a major problem in Ethiopia: the resulting floods can cause significant damage to agriculture, ecology and infrastructure, disruption of human activities, loss of property and lives, and disease outbreaks. The aim of this study was to explore the likely changes of precipitation extremes due to future climate change, focusing specifically on the impact of climate change on the variability of rainfall intensity-duration-frequency (IDF) relationships in the Upper Blue Nile basin. Precipitation data from two Global Climate Models (GCMs), HadCM3 and CGCM3, were used in the study. Rainfall frequency analysis was carried out to estimate quantiles with different return periods. The probability weighted moments (PWM) method was used for parameter estimation, and L-moment ratio diagrams (LMRDs) were used to find the best parent distribution for each station. The parent distributions identified by the frequency analysis are the Generalized Logistic (GLOG), Generalized Extreme Value (GEV), and Gamma/Pearson III (P3) distributions. After the quantiles were estimated, a simple disaggregation model was applied to obtain sub-daily rainfall data. Finally, the disaggregated rainfall was fitted to derive IDF curves, and the results show that in most parts of the basin rainfall intensity is expected to increase in the future. Based on the two GCM outputs, the study indicates a likely increase of precipitation extremes over the Blue Nile basin due to the changing climate. These results should be interpreted with caution, as GCM outputs in this part of the world carry large uncertainty.
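The quantile-estimation step for a fitted GEV distribution can be sketched as below. The parameters are hypothetical stand-ins for values a PWM/L-moments fit would produce; they are not from the study.

```python
import numpy as np

def gev_quantile(mu, sigma, xi, T):
    """Rainfall depth with return period T years for a GEV distribution
    with location mu, scale sigma and shape xi."""
    y = -np.log(1.0 - 1.0 / T)      # reduced variate argument
    if abs(xi) < 1e-6:               # Gumbel limit as xi -> 0
        return mu - sigma * np.log(y)
    return mu + sigma / xi * (y ** (-xi) - 1.0)

# Hypothetical parameters, as if estimated by PWM/L-moments (mm of rain)
q10 = gev_quantile(mu=50.0, sigma=15.0, xi=0.1, T=10.0)
q100 = gev_quantile(mu=50.0, sigma=15.0, xi=0.1, T=100.0)
```

Quantiles like these, disaggregated to sub-daily durations, are what the IDF curves in the study are built from.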
Power laws and extreme values in antibody repertoires
NASA Astrophysics Data System (ADS)
Boyer, Sebastien; Biswas, Dipanwita; Scaramozzino, Natale; Kumar, Ananda Soshee; Nizak, Clément; Rivoire, Olivier
2015-03-01
Evolution by natural selection involves the succession of three steps: mutations, selection and proliferation. We are interested in describing and characterizing the result of selection over a population of many variants. After selection, this population will be dominated by the few best variants, with highest propensity to be selected, or highest ``selectivity.'' We ask the following question: how is the selectivity of the best variants distributed in the population? Extreme value theory, which characterizes the extreme tail of probability distributions in terms of a few universality classes, has been proposed to describe it. To test this proposition and identify the relevant universality class, we performed quantitative in vitro experimental selections of libraries of >10^5 antibodies using the technique of phage display. Data obtained by high-throughput sequencing allow us to fit the selectivity distribution over more than two decades. In most experiments, the results show a striking power law for the selectivity distribution of the top antibodies, consistent with extreme value theory.
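A power-law tail like the one reported here is commonly quantified with the Hill estimator, which averages log-spacings of the top order statistics. A minimal sketch on synthetic data with a known tail exponent (the abstract does not say which estimator the authors used; this is a generic illustration):

```python
import numpy as np

def hill_estimator(x, k):
    """Hill estimate of the power-law tail exponent alpha from the
    k largest observations of the sample x."""
    xs = np.sort(x)[::-1]                          # descending order statistics
    return 1.0 / np.mean(np.log(xs[:k] / xs[k]))

# Synthetic 'selectivities' with a Pareto tail of known exponent alpha = 2
rng = np.random.default_rng(3)
sample = 1.0 + rng.pareto(2.0, size=20000)
alpha_hat = hill_estimator(sample, k=1000)
```

The estimate stabilizes only over a suitable range of k; in practice one plots alpha_hat against k and reads off the plateau.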
NASA Astrophysics Data System (ADS)
Alahmadi, F.; Rahman, N. A.; Abdulrazzak, M.
2014-09-01
Rainfall frequency analysis is an essential tool for the design of water-related infrastructure; it can be used to predict future flood magnitudes for a given frequency of extreme rainfall events. This study analyses the application of rainfall partial duration series (PDS) in the fast-growing city of Madinah, located in the western part of Saudi Arabia. Different statistical distributions were applied (Normal, Log-Normal, Extreme Value type I, Generalized Extreme Value, Pearson Type III, Log-Pearson Type III) and their parameters were estimated using L-moments methods. Different model selection criteria were also applied: the Akaike Information Criterion (AIC), corrected Akaike Information Criterion (AICc), Bayesian Information Criterion (BIC) and Anderson-Darling Criterion (ADC). The analysis indicated that the Generalized Extreme Value distribution is the best-fitting statistical distribution for the Madinah partial duration daily rainfall series. The outcome of such an evaluation can contribute toward better design criteria for flood management, especially flood protection measures.
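The information-criterion comparison used for distribution selection can be sketched as follows. This minimal illustration compares only two candidates (a moment-fitted Gumbel, i.e. extreme value type I, against a Normal fit) on synthetic annual maxima; it is not the study's PDS data or its full candidate set.

```python
import numpy as np

def aic(loglik, k):
    """Akaike Information Criterion for a model with k parameters."""
    return 2 * k - 2 * loglik

def bic(loglik, k, n):
    """Bayesian Information Criterion for k parameters and n observations."""
    return k * np.log(n) - 2 * loglik

def gumbel_loglik(x, mu, beta):
    z = (x - mu) / beta
    return np.sum(-np.log(beta) - z - np.exp(-z))

def normal_loglik(x, mu, sd):
    return np.sum(-0.5 * np.log(2 * np.pi * sd ** 2)
                  - (x - mu) ** 2 / (2 * sd ** 2))

# Synthetic annual-maximum rainfall series (true law: Gumbel)
rng = np.random.default_rng(1)
x = rng.gumbel(loc=40.0, scale=12.0, size=1000)

# Moment-based parameter estimates for each candidate
beta = x.std(ddof=1) * np.sqrt(6) / np.pi
mu_g = x.mean() - 0.5772 * beta
aic_gumbel = aic(gumbel_loglik(x, mu_g, beta), k=2)
aic_normal = aic(normal_loglik(x, x.mean(), x.std(ddof=1)), k=2)
```

The candidate with the lowest criterion value is preferred; with data genuinely drawn from a Gumbel law, the Gumbel fit should win.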
Variance analysis of forecasted streamflow maxima in a wet temperate climate
NASA Astrophysics Data System (ADS)
Al Aamery, Nabil; Fox, James F.; Snyder, Mark; Chandramouli, Chandra V.
2018-05-01
Coupling global climate models, hydrologic models and extreme value analysis provides a method to forecast streamflow maxima; however, the elusive variance structure of the results hinders confidence in application. Directly correcting the bias of forecasts using the relative change between forecast and control simulations has been shown to marginalize hydrologic uncertainty, reduce model bias, and remove systematic variance when predicting mean monthly and mean annual streamflow, prompting our investigation for streamflow maxima. We assess the variance structure of streamflow maxima using realizations of emission scenario, global climate model type and project phase, downscaling methods, bias correction, extreme value methods, and hydrologic model inputs and parameterization. Results show that the relative change of streamflow maxima was not dependent on systematic variance from the annual maxima versus peaks-over-threshold method applied, although we stress that researchers must strictly adhere to the rules of extreme value theory when applying the peaks-over-threshold method. Regardless of which method is applied, extreme value model fitting does add variance to the projection, and the variance is an increasing function of the return period. Unlike the relative change of mean streamflow, results show that the variance of the maxima's relative change was dependent on all climate model factors tested as well as hydrologic model inputs and calibration. Ensemble projections forecast an increase of streamflow maxima for 2050 with pronounced forecast standard error, including increases of +30(±21), +38(±34) and +51(±85)% for 2, 20 and 100 year streamflow events for the wet temperate region studied. The variance of maxima projections was dominated by climate model factors and extreme value analyses.
Koyama, Tetsuo; Marumoto, Kohei; Miyake, Hiroji; Domen, Kazuhisa
2013-11-01
This study examined the relationship between fractional anisotropy (FA) values of magnetic resonance-diffusion tensor imaging (DTI) and motor outcome (1 month after onset) in 15 patients with hemiparesis after ischemic stroke of corona radiata lesions. DTI data were obtained on days 14-18. FA values within the cerebral peduncle were analyzed using a computer-automated method. Motor outcome of hemiparesis was evaluated according to Brunnstrom stage (BRS; 6-point scale: severe to normal) for separate shoulder/elbow/forearm, wrist/hand, and lower extremity functions. The ratio of FA values in the affected hemisphere to those in the unaffected hemisphere (rFA) was assessed in relation to the BRS data (Spearman rank correlation test, P<.05). rFA values ranged from .715 to 1.002 (median=.924). BRS ranged from 1 to 6 (median=4) for shoulder/elbow/forearm, from 1 to 6 (median=5) for wrist/hand, and from 2 to 6 (median=4) for the lower extremities. Analysis revealed statistically significant relationships between rFA and upper extremity functions (correlation coefficient=.679 for shoulder/elbow/forearm and .706 for wrist/hand). Although slightly less evident, the relationship between rFA and lower extremity function was also statistically significant (correlation coefficient=.641). FA values within the cerebral peduncle are moderately associated with the outcome of both upper and lower extremity functions, suggesting that DTI may be applicable for outcome prediction in stroke patients with corona radiata infarct.
Nearly extremal apparent horizons in simulations of merging black holes
NASA Astrophysics Data System (ADS)
Lovelace, Geoffrey; Scheel, Mark A.; Owen, Robert; Giesler, Matthew; Katebi, Reza; Szilágyi, Béla; Chu, Tony; Demos, Nicholas; Hemberger, Daniel A.; Kidder, Lawrence E.; Pfeiffer, Harald P.; Afshari, Nousha
2015-03-01
The spin angular momentum S of an isolated Kerr black hole is bounded by the surface area A of its apparent horizon: 8πS ≤ A, with equality for extremal black holes. In this paper, we explore the extremality of individual and common apparent horizons for merging, rapidly spinning binary black holes. We consider simulations of merging black holes with equal masses M and initial spin angular momenta aligned with the orbital angular momentum, including new simulations with spin magnitudes up to S/M^2 = 0.994. We measure the area and (using approximate Killing vectors) the spin on the individual and common apparent horizons, finding that the inequality 8πS < A is satisfied in all cases but is very close to equality on the common apparent horizon at the instant it first appears. We also evaluate the Booth-Fairhurst extremality, whose value for a given apparent horizon depends on the scaling of the horizon's null normal vectors. In particular, we introduce a gauge-invariant lower bound on the extremality by computing the smallest value that Booth and Fairhurst's extremality parameter can take for any scaling. Using this lower bound, we conclude that the common horizons are at least moderately close to extremal just after they appear. Finally, following Lovelace et al (2008 Phys. Rev. D 78 084017), we construct quasiequilibrium binary-black hole initial data with 'overspun' marginally trapped surfaces with 8πS > A. We show that the overspun surfaces are indeed superextremal: our lower bound on their Booth-Fairhurst extremality exceeds unity. However, we confirm that these superextremal surfaces are always surrounded by marginally outer trapped surfaces (i.e., by apparent horizons) with 8πS < A. The extremality lower bound on the enclosing apparent horizon is always less than unity but can exceed the value for an extremal Kerr black hole.
Lee, Hsin-Yi; Yeh, Wen-Yu; Chen, Chun-Wan; Wang, Jung-Der
2005-07-01
Prevalence of upper extremity disorders and their associations with psychosocial factors in the workplace have received more attention recently. A national survey of cross-sectional design was performed to determine the prevalence rates of upper extremity disorders among different industries. Trained interviewers administered questionnaires to 17,669 workers and data on musculoskeletal complaints were obtained along with information on risk factors. Overall the 1-year prevalence of neck (14.8%), shoulder (16.6%), and hand (12.4%) disorders were higher than those of the upper back (7.1%) and elbow (8.3%) among those who sought medical treatment due to the complaint. Workers in construction and agriculture-related industries showed a higher prevalence of upper extremity disorders. After multiple logistic regression adjusted for age, education, and employment duration, we found job content, physical working condition, a harmonious interpersonal relationship at the workplace and organizational problems were significant determinants of upper extremity disorders in manufacturing and service industries. Male workers in manufacturing industries showed more concern about physical working conditions while female workers in public administration emphasized problems of job content and interpersonal relationships. We concluded that these factors were major job stressors contributing to musculoskeletal pain of the upper extremity.
ERIC Educational Resources Information Center
Sharma, Kshitij; Chavez-Demoulin, Valérie; Dillenbourg, Pierre
2017-01-01
The statistics used in education research are based on central trends such as the mean or standard deviation, discarding outliers. This paper adopts another viewpoint that has emerged in statistics, called extreme value theory (EVT). EVT claims that the bulk of normal distribution is comprised mainly of uninteresting variations while the most…
NASA Astrophysics Data System (ADS)
Möller, Jens; Heinrich, Hartmut
2017-04-01
As a consequence of climate change, atmospheric and oceanographic extremes and their potential impacts on coastal regions are of growing concern for the governmental authorities responsible for transportation infrastructure. The highest risks for shipping, as well as for rail and road traffic, originate from the combined effects of extreme storm surges and heavy rainfall, which sometimes lead to insufficient dewatering of inland waterways. The German Federal Ministry of Transport and Digital Infrastructure has therefore tasked its Network of Experts to investigate the possible evolution of extreme threats for low-lying land, and especially for the Kiel Canal, an important shortcut for shipping between the North and Baltic Seas. In this study we present results of a comparison of an Extreme Value Analysis (EVA) carried out on gauge observations with values derived from a coupled Regional Ocean-Atmosphere Climate Model (MPI-OM). High water levels at the coasts of the North and Baltic Seas are among the most important hazards: they increase the risk of flooding of low-lying land and prevent such areas from adequate dewatering. In this study, changes in the intensity (magnitude of the extremes) and duration of extreme water levels (above a selected threshold) are investigated for several gauge stations, with data partly reaching back to 1843. Different methods are used for the extreme value statistics: (1) a stationary generalized Pareto distribution (GPD) model, and (2) a non-stationary statistical model, to better reproduce the impact of climate change. Most gauge stations show an increase of the mean water level of about 1-2 mm/year, with a stronger increase of the highest water levels and a decrease (or weaker increase) of the lowest water levels. The duration of possible dewatering time intervals for the Kiel Canal was also analysed.
The results for the historical gauge station observations are compared to the statistics of modelled water levels from the coupled atmosphere-ocean climate model MPI-OM for the time interval from 1951 to 2000. We demonstrate that for high water levels the observations and MPI-OM results are in good agreement, and we provide an estimate on the decreasing dewatering potential for Kiel Canal until the end of the 21st century.
Spatial variation of statistical properties of extreme water levels along the eastern Baltic Sea
NASA Astrophysics Data System (ADS)
Pindsoo, Katri; Soomere, Tarmo; Rocha, Eugénio
2016-04-01
Most existing projections of future extreme water levels rely on the use of classic generalised extreme value distributions. The choice of a particular distribution is often made based on the absolute value of the shape parameter of the Generalised Extreme Value (GEV) distribution: if this parameter is small, the Gumbel distribution is most appropriate, while in the opposite case the Weibull or Fréchet distribution could be used. We demonstrate that the alongshore variation in the statistical properties of numerically simulated high water levels along the eastern coast of the Baltic Sea is so large that the use of a single distribution for projections of extreme water levels is highly questionable. The analysis is based on two simulated data sets produced at the Swedish Meteorological and Hydrological Institute. The output of the Rossby Centre Ocean model is sampled with a resolution of 6 h and the output of the circulation model NEMO with a resolution of 1 h. As the maxima of water levels of subsequent years may be correlated in the Baltic Sea, we also employ maxima for stormy seasons. We provide a detailed analysis of the spatial variation of the parameters of the family of extreme value distributions along an approximately 600 km long coastal section from the north-western shore of Latvia in the Baltic Proper to the eastern Gulf of Finland. The parameters are evaluated using the maximum likelihood method and the method of moments. The analysis also covers the entire Gulf of Riga. The core parameter of this family of distributions, the shape parameter of the GEV distribution, exhibits extensive variation in the study area. Its values, evaluated using the Hydrognomon software and the maximum likelihood method, vary from about -0.1 near the north-western coast of Latvia in the Baltic Proper up to about 0.05 in the eastern Gulf of Finland. This parameter is very close to zero near Tallinn in the western Gulf of Finland.
Thus, it is natural that the Gumbel distribution gives adequate projections of extreme water levels for the vicinity of Tallinn. More importantly, this feature indicates that the use of a single distribution for projections of extreme water levels and their return periods for the entire Baltic Sea coast is inappropriate. The physical reason is the interplay between the complex shape of the sea's large subbasins (such as the Gulf of Riga and the Gulf of Finland) and the highly anisotropic wind regime. The 'impact' of this anisotropy on the water level statistics is amplified by the overall anisotropy of the distributions of the frequency of occurrence of high and low water levels. The most important conjecture is that the long-term behaviour of water level extremes in different coastal sections of the Baltic Sea may be fundamentally different.
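The central step described above, estimating the GEV shape parameter from block maxima by maximum likelihood, can be sketched with SciPy. This is an illustrative sketch on synthetic data, not the study's Hydrognomon workflow, and all numbers are hypothetical; note that SciPy's shape parameter c is the negative of the shape xi quoted in the abstract.

```python
import numpy as np
from scipy.stats import genextreme

# Hypothetical annual-maximum water levels (cm); a stand-in for the
# simulated Baltic Sea series used in the study.
maxima = genextreme.rvs(c=-0.05, loc=80, scale=15, size=60,
                        random_state=np.random.default_rng(42))

# Maximum-likelihood fit. SciPy's shape c equals minus the usual GEV xi
# (the study reports xi from about -0.1 to 0.05).
c, loc, scale = genextreme.fit(maxima)
xi = -c
print(f"shape xi = {xi:.3f}, location = {loc:.1f} cm, scale = {scale:.1f} cm")

# 100-year return level: exceeded with probability 1/100 in any one year
rl_100 = genextreme.ppf(1 - 1 / 100, c, loc=loc, scale=scale)
print(f"100-year return level ~ {rl_100:.1f} cm")
```

A shape near zero would support the Gumbel choice for that site; clearly negative or positive fitted values point to the Weibull or Fréchet cases instead.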
Asiamah, G; Mock, C; Blantari, J
2002-01-01
Objectives: The knowledge and attitudes of commercial drivers in Ghana as regards alcohol impaired driving were investigated. This was done in order to provide information that could subsequently be used to develop antidrunk driving social marketing messages built upon the intrinsic values and motivation of these drivers. Methods: Focus group discussions were held with 43 bus and minibus drivers in the capital city, Accra. A structured discussion guide was used to capture information related to values, risk perceptions, leisure time activities, and attitudes on alcohol impaired driving. Results: The majority of drivers expressed an understanding that drunk driving was a significant risk factor for crashes. There was a significant under-appreciation of the extent of the problem, however. Most believed that it was only rare, extremely intoxicated drivers who were the problem. The drivers also had a minimal understanding of the concept of blood alcohol concentration and related legal limits. Despite these factors, there was widespread support for increased enforcement of existing antidrunk driving laws. Conclusions: In Ghana, commercial drivers understand the basic danger of drunk driving and are motivated to assist in antidrunk driving measures. There are misconceptions and deficits in knowledge that need to be addressed in subsequent educational campaigns. PMID:11928975
Adaptive Online Sequential ELM for Concept Drift Tackling
Basaruddin, Chan
2016-01-01
A machine learning method needs to adapt to changes in the environment over time. Such changes are known as concept drift. In this paper, we propose a concept drift tackling method as an enhancement of the Online Sequential Extreme Learning Machine (OS-ELM) and the Constructive Enhancement OS-ELM (CEOS-ELM) by adding adaptive capability for classification and regression problems. The scheme is named adaptive OS-ELM (AOS-ELM). It is a single-classifier scheme that works well to handle real drift, virtual drift, and hybrid drift. The AOS-ELM also works well for sudden drift and recurrent context change types. The scheme is a simple unified method implemented in a few lines of code. We evaluated AOS-ELM on regression and classification problems using public concept drift data sets (SEA and STAGGER) and other public data sets such as MNIST, USPS, and IDS. Experiments show that our method gives a higher kappa value compared to the multiclassifier ELM ensemble. Even though AOS-ELM in practice does not need an increase in hidden nodes, we address some issues related to increasing the hidden nodes, such as the error condition and rank values. We propose taking the rank of the pseudoinverse matrix as an indicator parameter to detect an “underfitting” condition. PMID:27594879
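The OS-ELM that AOS-ELM builds on keeps a fixed random hidden layer and updates only the output weights by recursive least squares as data chunks arrive. A minimal sketch of that base update (the adaptive drift-handling additions of AOS-ELM are not shown; the layer sizes and the toy regression target are hypothetical):

```python
import numpy as np

# Minimal base OS-ELM: random tanh hidden layer, output weights beta
# updated chunk-by-chunk via recursive least squares (no retraining).
rng = np.random.default_rng(0)
n_in, n_hidden = 2, 40
W = rng.normal(size=(n_in, n_hidden))
b = rng.normal(size=n_hidden)
act = lambda X: np.tanh(X @ W + b)
target = lambda X: np.sin(X[:, :1]) + X[:, 1:]   # toy regression target

# Initialization phase: ridge-regularized least squares on a first batch
X0 = rng.normal(size=(100, n_in))
H0, T0 = act(X0), target(X0)
P = np.linalg.inv(H0.T @ H0 + 1e-3 * np.eye(n_hidden))
beta = P @ H0.T @ T0

# Sequential phase: each new chunk updates beta incrementally
for _ in range(20):
    X = rng.normal(size=(20, n_in))
    H, T = act(X), target(X)
    K = P @ H.T @ np.linalg.inv(np.eye(len(X)) + H @ P @ H.T)
    P = P - K @ H @ P
    beta = beta + P @ H.T @ (T - H @ beta)

Xt = rng.normal(size=(200, n_in))
err = float(np.mean((act(Xt) @ beta - target(Xt)) ** 2))
print(f"test MSE: {err:.4f}")
```

The rank of the matrix inverted here is what the abstract proposes to monitor as an "underfitting" indicator when hidden nodes are added.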
Trends in 1970-2010 southern California surface maximum temperatures: extremes and heat waves
NASA Astrophysics Data System (ADS)
Ghebreegziabher, Amanuel T.
Daily maximum temperatures from 1970-2010 were obtained from the National Climatic Data Center (NCDC) for 28 South Coast Air Basin (SoCAB) Cooperative Network (COOP) sites. Analyses were carried out on the entire data set, as well as on the 1970-1974 and 2006-2010 sub-periods, including construction of spatial distributions and time-series trends of both summer-average and annual-maximum values and of the frequency of two and four consecutive "daytime" heat wave events. Spatial patterns of average and extreme values showed three areas consistent with climatological SoCAB flow patterns: cold coastal, warm inland low-elevation, and cool further-inland mountain top. Difference (2006-2010 minus 1970-1974) distributions of both average and extreme-value trends were consistent with a previous study covering the shorter 1970-2005 period, as they showed the expected inland regional warming and a "reverse-reaction" cooling in low-elevation coastal and inland areas open to increasing sea breeze flows. Annual-extreme trends generally showed cooling at sites below 600 m and warming at higher elevations. As the warming trends of the extremes were larger than those of the averages, regional warming thus impacts extremes more than averages. Spatial distributions of hot-day frequencies showed the expected maximum at inland low-elevation sites. Regional warming again induced increases at elevated coastal areas, while low-elevation areas showed reverse-reaction decreases.
End-of-Century Projections of North American Atmospheric River Events in CMIP5 Climate Models
NASA Astrophysics Data System (ADS)
Warner, M.; Mass, C.; Salathe, E. P., Jr.
2014-12-01
Most extreme precipitation events that occur along the North American west coast are associated with narrow plumes of above-average water vapor concentration that stretch from the tropics or subtropics to the West Coast. These events generally occur during the wet season (October-March) and are referred to as atmospheric rivers (AR). ARs can cause major river management problems, damage from flooding or landslides, and loss of life. It is expected that anthropogenic global warming could lead to thermodynamic and dynamic changes in the atmosphere, such as increases in water vapor content and, thus, precipitation, and shifts in the climatological jet stream. Since AR events are associated with extreme values of integrated water vapor (IWV) near the West Coast, increases in IWV could impact the intensity of AR events intersecting the coast. Additionally, ARs are associated with cyclonic activity that originates near and propagates along the jet stream. The jet stream configuration influences the frequency and location of AR landfall along the North American west coast. It is probable that any changes in the general circulation of the atmosphere will result in changes in the frequency, orientation, and location of AR landfalls. Global climate models have sufficient resolution to simulate synoptic features associated with AR events, such as high values of vertically integrated vapor transport (IVT) approaching the coast. Ten Coupled Model Intercomparison Project (CMIP5) simulations are used to identify changes in ARs impacting the west coast of North America between historical (1970-1999) and end-of-century (2070-2099) runs, using representative concentration pathway (RCP) 8.5. The most extreme ARs are identified in both time periods by the 99th percentile of IVT days along a north-south transect offshore of the coast. Integrated water vapor (IWV) and IVT are predicted to increase, while lower-tropospheric winds change little. 
Winter-mean precipitation along the West Coast increases by 11-18% (4-6% per °C), while precipitation on extreme IVT days increases by 15-39% (5-19% per °C). The frequency of IVT days above the historical 99th percentile threshold increases by as much as 290% by the end of this century.
NASA Astrophysics Data System (ADS)
Zotos, Euaggelos E.
2018-06-01
The circular Sitnikov problem, where the two primary bodies are prolate or oblate spheroids, is numerically investigated. In particular, the basins of convergence on the complex plane are revealed by using a large collection of numerical methods of several orders. We consider four cases, regarding the value of the oblateness coefficient, which determines the nature of the roots (attractors) of the system. For all cases we use the iterative schemes for performing a thorough and systematic classification of the nodes on the complex plane. The distributions of the required iterations and of the corresponding probability, as well as their correlations with the basins of convergence, are also discussed. Our numerical computations indicate that most of the iterative schemes provide relatively similar convergence structures on the complex plane. However, for some numerical methods the corresponding basins of attraction are extremely complicated, with highly fractal basin boundaries. Moreover, the efficiency is shown to vary strongly between the numerical methods.
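The classification of complex-plane starting points by the root they converge to can be sketched with Newton's method applied to a simple cubic (a stand-in for the Sitnikov root-finding problem; the grid size and the 1e-8 tolerance are arbitrary choices):

```python
import numpy as np

# Newton's method for f(z) = z**3 - 1 over a grid of complex starting
# points; each point is classified by the root (attractor) it converges
# to, and the number of iterations needed is recorded.
roots = np.array([1.0, np.exp(2j * np.pi / 3), np.exp(-2j * np.pi / 3)])

x = np.linspace(-2, 2, 400)
z = x[None, :] + 1j * x[:, None]
basin = np.full(z.shape, -1)           # -1 = not yet converged
iters = np.zeros(z.shape, dtype=int)

for k in range(50):
    z = z - (z**3 - 1) / (3 * z**2)    # Newton update for all points
    for i, r in enumerate(roots):
        hit = (np.abs(z - r) < 1e-8) & (basin == -1)
        basin[hit] = i
        iters[hit] = k + 1

print("unclassified points:", int(np.sum(basin == -1)))
print("mean iterations to converge:", iters[basin >= 0].mean())
```

Coloring the grid by the `basin` array produces the familiar fractal basin boundaries; the `iters` array gives the distribution of required iterations discussed in the abstract.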
[Newborn with phocomelia and thrombocytopenia. Case report].
Maas, C; Arand, J; Orlikowsky, Th; Goelz, R
2002-01-01
Associated malformations and symptoms may be decisive in the differential diagnosis of neonatal phocomelia. We report on a neonate with phocomelia, petechiae and thrombocytopenia. This constellation is typical of the phocomelia-thrombocytopenia syndrome, a variant of the thrombocytopenia-absent radius (TAR) syndrome. During the neonatal period platelet transfusions were necessary. Relevant bleeding and developmental delays were not evident up to the age of seven months. Cardinal symptoms of the TAR syndrome are a bilaterally absent radius and neonatal thrombocytopenia. The patient presented with phocomelia of the upper extremities, which occurs in only 5-10% of patients with TAR syndrome. Further abnormalities include additional bone and joint disorders and haematopoietic problems, such as thrombocytopenia. Bleeding episodes mainly occur in the first year of life, hence platelet transfusions may be necessary during this period. A new experimental approach is the interleukin-6-mediated stimulation of thrombopoiesis. Platelet counts usually reach normal values in adulthood. The main problem remains a satisfactory management of the various limb defects.
A geometric viewpoint on generalized hydrodynamics
NASA Astrophysics Data System (ADS)
Doyon, Benjamin; Spohn, Herbert; Yoshimura, Takato
2018-01-01
Generalized hydrodynamics (GHD) is a large-scale theory for the dynamics of many-body integrable systems. It consists of an infinite set of conservation laws for quasi-particles traveling with effective ("dressed") velocities that depend on the local state. We show that these equations can be recast into a geometric dynamical problem. They are conservation equations with state-independent quasi-particle velocities, in a space equipped with a family of metrics, parametrized by the quasi-particles' type and speed, that depend on the local state. In the classical hard rod or soliton gas picture, these metrics measure the free length of space as perceived by quasi-particles; in the quantum picture, they weigh space with the density of states available to them. Using this geometric construction, we find a general solution to the initial value problem of GHD, in terms of a set of integral equations where time appears explicitly. These integral equations are solvable by iteration and provide an extremely efficient solution algorithm for GHD.
Batra, Romesh C.; Porfiri, Maurizio; Spinello, Davide
2008-01-01
We study the influence of von Kármán nonlinearity, van der Waals force, and thermal stresses on pull-in instability and small vibrations of electrostatically actuated microplates. We use the Galerkin method to develop a tractable reduced-order model for electrostatically actuated clamped rectangular microplates in the presence of van der Waals forces and thermal stresses. More specifically, we reduce the governing two-dimensional nonlinear transient boundary-value problem to a single nonlinear ordinary differential equation. For the static problem, the pull-in voltage and the pull-in displacement are determined by solving a pair of nonlinear algebraic equations. The fundamental vibration frequency corresponding to a deflected configuration of the microplate is determined by solving a linear algebraic equation. The proposed reduced-order model allows for accurately estimating the combined effects of van der Waals force and thermal stresses on the pull-in voltage and the pull-in deflection profile with an extremely limited computational effort. PMID:27879752
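The reduced-order idea, a static pull-in voltage obtained from a small algebraic problem, can be illustrated with the classic lumped parallel-plate model (a much simpler stand-in for the paper's Galerkin-reduced microplate model, with no van der Waals or thermal terms; all parameter values are hypothetical):

```python
import numpy as np
from scipy.optimize import brentq

# Lumped parallel-plate electrostatic actuator: spring force k*x balances
# the electrostatic force eps*A*V^2 / (2*(g - x)^2).
eps = 8.854e-12   # vacuum permittivity, F/m
A = 1e-7          # electrode area, m^2 (hypothetical)
g = 2e-6          # initial gap, m (hypothetical)
k = 10.0          # effective stiffness, N/m (hypothetical)

def equilibrium(x, V):
    return k * x - eps * A * V**2 / (2 * (g - x) ** 2)

# For this model pull-in occurs at x = g/3, with a closed-form voltage
V_pi = np.sqrt(8 * k * g**3 / (27 * eps * A))
print(f"pull-in voltage ~ {V_pi:.2f} V")

# Static deflection just below pull-in, via root finding on [0, g/3]
x_eq = brentq(equilibrium, 0, g / 3, args=(0.99 * V_pi,))
print(f"deflection at 0.99*V_pi: {x_eq / g:.2f} of the gap")
```

The paper's model replaces this single spring balance with a pair of nonlinear algebraic equations from the Galerkin reduction, but the computational pattern (a cheap algebraic solve instead of a PDE solve) is the same.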
Batra, Romesh C; Porfiri, Maurizio; Spinello, Davide
2008-02-15
We study the influence of von Kármán nonlinearity, van der Waals force, and thermal stresses on pull-in instability and small vibrations of electrostatically actuated microplates. We use the Galerkin method to develop a tractable reduced-order model for electrostatically actuated clamped rectangular microplates in the presence of van der Waals forces and thermal stresses. More specifically, we reduce the governing two-dimensional nonlinear transient boundary-value problem to a single nonlinear ordinary differential equation. For the static problem, the pull-in voltage and the pull-in displacement are determined by solving a pair of nonlinear algebraic equations. The fundamental vibration frequency corresponding to a deflected configuration of the microplate is determined by solving a linear algebraic equation. The proposed reduced-order model allows for accurately estimating the combined effects of van der Waals force and thermal stresses on the pull-in voltage and the pull-in deflection profile with an extremely limited computational effort.
NASA Astrophysics Data System (ADS)
Gao, Tao; Xie, Lian
2016-12-01
Precipitation extremes are the dominant cause of severe flood disasters at regional and local scales under global climate change. In the present study, five annual extreme precipitation indices, including the 1, 7 and 30 day annual maximum rainfall and the 95th and 97.5th percentile threshold levels, are analyzed for the reference period 1960-2011 using 140 meteorological stations over the Yangtze River basin (YRB). A generalized extreme value (GEV) distribution is fitted to the annual and percentile extreme precipitation events at each station, with return periods up to 200 years. The entire time period is divided into a preclimatic (preceding climatic) period, 1960-1980, and an aftclimatic (after climatic) period, 1981-2011, in view of the distinctly abrupt shift of the precipitation regime in the late 1970s across the YRB. The Mann-Kendall trend test is adopted for trend analysis of the pre- and aftclimatic periods, respectively, in order to explore possible increasing or decreasing patterns in precipitation extremes. The results indicate that the increasing trends in return values during the aftclimatic period vary significantly in time and space for different magnitudes of extreme precipitation. The stations with significantly positive trends are mainly distributed in the vicinity of the mainstream, the major tributaries, and large lakes; this would result in more severe flood disasters in the mid-lower reaches of the YRB, especially in the southeast coastal regions. Linear trends based on annual maximum precipitation are also investigated in the pre- and aftclimatic periods, but those changes do not closely resemble the variations of the return values in either subperiod. Moreover, spatiotemporal patterns of precipitation extremes become more uneven and unstable in the second half of the period over the YRB.
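The Mann-Kendall trend test used above can be sketched in a few lines (no tie correction; the synthetic series with an imposed upward trend is purely illustrative):

```python
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Two-sided Mann-Kendall trend test (no tie correction)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # S: concordant minus discordant pairs taken in time order
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    p = 2 * (1 - norm.cdf(abs(z)))  # two-sided p-value
    return z, p

# Synthetic annual precipitation extreme with an upward trend plus noise
rng = np.random.default_rng(0)
series = 0.5 * np.arange(31) + rng.normal(0, 2, size=31)
z, p = mann_kendall(series)
print(f"z = {z:.2f}, p = {p:.4f}")
```

Because the test is rank-based, it needs no distributional assumption about the extremes, which is why it pairs naturally with GEV-derived return values in studies like this one.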
Investigations into Gravitational Wave Emission from Compact Body Inspiral into Massive Black Holes
NASA Technical Reports Server (NTRS)
Hughes, Scott A.
2005-01-01
In contrast to year 1 (when much of the activity associated with this grant focused upon developing our group at MIT), year 2 was a period of very focused attention on research problems. We made significant progress developing relativistic waveforms for the extreme mass ratio inspiral problem; we have pushed forward a formalism our group developed for mapping the spacetimes of massive compact objects; and, in collaboration with the Caltech group, we began to develop a framework for addressing issues in LISA data analysis for extreme mass ratio systems.
Extreme marginalization: addiction and other mental health disorders, stigma, and imprisonment
Kreek, Mary Jeanne
2013-01-01
Major well-defined medical problems that are, in part, the unfortunate outcome of a negative social environment may include specific addictive diseases and other mental health disorders, in particular the affective disorders of anxiety, depression, social phobia, and post-traumatic stress syndrome. This overview touches on the topic of extreme marginalization associated with addiction and other mental health disorders, along with arrest, imprisonment, and parole. All of these are characterized by lasting stigma that hauntingly continues to impact upon each person suffering from any of these problems. PMID:21884162
The NASA Langley Multidisciplinary Uncertainty Quantification Challenge
NASA Technical Reports Server (NTRS)
Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.
2014-01-01
This paper presents the formulation of an uncertainty quantification challenge problem consisting of five subproblems. These problems focus on key aspects of uncertainty characterization, sensitivity analysis, uncertainty propagation, extreme-case analysis, and robust design.
NASA Astrophysics Data System (ADS)
Mentaschi, Lorenzo; Vousdoukas, Michalis; Voukouvalas, Evangelos; Sartini, Ludovica; Feyen, Luc; Besio, Giovanni; Alfieri, Lorenzo
2016-09-01
Statistical approaches to study extreme events require, by definition, long time series of data. In many scientific disciplines, these series are often subject to variations at different temporal scales that affect the frequency and intensity of their extremes. Therefore, the assumption of stationarity is violated and alternative methods to conventional stationary extreme value analysis (EVA) must be adopted. Using the example of environmental variables subject to climate change, in this study we introduce the transformed-stationary (TS) methodology for non-stationary EVA. This approach consists of (i) transforming a non-stationary time series into a stationary one, to which the stationary EVA theory can be applied, and (ii) reverse transforming the result into a non-stationary extreme value distribution. As a transformation, we propose and discuss a simple time-varying normalization of the signal and show that it enables a comprehensive formulation of non-stationary generalized extreme value (GEV) and generalized Pareto distribution (GPD) models with a constant shape parameter. A validation of the methodology is carried out on time series of significant wave height, residual water level, and river discharge, which show varying degrees of long-term and seasonal variability. The results from the proposed approach are comparable with the results from (a) a stationary EVA on quasi-stationary slices of non-stationary series and (b) the established method for non-stationary EVA. However, the proposed technique comes with advantages in both cases. For example, in contrast to (a), the proposed technique uses the whole time horizon of the series for the estimation of the extremes, allowing for a more accurate estimation of large return levels. Furthermore, with respect to (b), it decouples the detection of non-stationary patterns from the fitting of the extreme value distribution. 
As a result, the steps of the analysis are simplified and intermediate diagnostics are possible. In particular, the transformation can be carried out by means of simple statistical techniques such as low-pass filters based on the running mean and the standard deviation, and the fitting procedure is a stationary one with a few degrees of freedom and is easy to implement and control. An open-source MATLAB toolbox has been developed to cover this methodology, which is available at https://github.com/menta78/tsEva/ (Mentaschi et al., 2016).
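The two TS steps, normalizing the signal with running statistics and then applying a stationary fit, can be sketched as follows (a heavily simplified stand-in for the tsEva toolbox; the window length, block size, and synthetic trend are arbitrary):

```python
import numpy as np
from scipy.stats import genextreme

def transform_stationary(x, window):
    """Normalize by a running mean and running standard deviation
    (filter edge effects are ignored in this sketch)."""
    kernel = np.ones(window) / window
    mu = np.convolve(x, kernel, mode="same")
    sigma = np.sqrt(np.convolve((x - mu) ** 2, kernel, mode="same"))
    return (x - mu) / sigma

# Synthetic non-stationary series: Gumbel noise with a slow trend in location
rng = np.random.default_rng(1)
t = np.arange(6000)
x = rng.gumbel(loc=2 + 1e-3 * t, scale=1.0)

y = transform_stationary(x, window=501)

# Stationary GEV fit on block maxima of the transformed series; the result
# can then be reverse-transformed with the stored running mean and std
blocks = y.reshape(-1, 100).max(axis=1)
c, loc, scale = genextreme.fit(blocks)
print(f"fitted GEV shape: {-c:.3f}")
```

This mirrors the decoupling the abstract emphasizes: the non-stationarity lives entirely in the running mean and standard deviation, while the extreme value fit itself stays stationary with a constant shape parameter.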
ERIC Educational Resources Information Center
Williams, Lela Rankin; Degnan, Kathryn A.; Perez-Edgar, Koraly E.; Henderson, Heather A.; Rubin, Kenneth H.; Pine, Daniel S.; Steinberg, Laurence; Fox, Nathan A.
2009-01-01
Behavioral inhibition (BI) is characterized by a pattern of extreme social reticence, risk for internalizing behavior problems, and possible protection against externalizing behavior problems. Parenting style may also contribute to these associations between BI and behavior problems (BP). A sample of 113 children was assessed for BI in the…
Kondrup, S. V.; Bennett, P. C.; Forkman, B.; Meyer, I; Proschowsky, H. F.; Serpell, J. A.; Lund, T. B.
2017-01-01
A number of dog breeds suffer from welfare problems due to extreme phenotypes and high levels of inherited diseases but the popularity of such breeds is not declining. Using a survey of owners of two popular breeds with extreme physical features (French Bulldog and Chihuahua), one with a high load of inherited diseases not directly related to conformation (Cavalier King Charles Spaniel), and one representing the same size range but without extreme conformation and with the same level of disease as the overall dog population (Cairn Terrier), we investigated this seeming paradox. We examined planning and motivational factors behind acquisition of the dogs, and whether levels of experienced health and behavior problems were associated with the quality of the owner-dog relationship and the intention to re-procure a dog of the same breed. Owners of each of the four breeds (750/breed) were randomly drawn from a nationwide Danish dog registry and invited to participate. Of these, 911 responded, giving a final sample of 846. There were clear differences between owners of the four breeds with respect to degree of planning prior to purchase, with owners of Chihuahuas exhibiting less. Motivations behind choice of dog were also different. Health and other breed attributes were more important to owners of Cairn Terriers, whereas the dog’s personality was reported to be more important for owners of French Bulldogs and Cavalier King Charles Spaniels but less important for Chihuahua owners. Higher levels of health and behavior problems were positively associated with a closer owner-dog relationship for owners of Cavalier King Charles Spaniels and Chihuahuas but, for owners of French Bulldogs, high levels of problems were negatively associated with an intention to procure the same breed again. In light of these findings, it appears less paradoxical that people continue to buy dogs with welfare problems. PMID:28234931
Sandøe, P; Kondrup, S V; Bennett, P C; Forkman, B; Meyer, I; Proschowsky, H F; Serpell, J A; Lund, T B
2017-01-01
A number of dog breeds suffer from welfare problems due to extreme phenotypes and high levels of inherited diseases but the popularity of such breeds is not declining. Using a survey of owners of two popular breeds with extreme physical features (French Bulldog and Chihuahua), one with a high load of inherited diseases not directly related to conformation (Cavalier King Charles Spaniel), and one representing the same size range but without extreme conformation and with the same level of disease as the overall dog population (Cairn Terrier), we investigated this seeming paradox. We examined planning and motivational factors behind acquisition of the dogs, and whether levels of experienced health and behavior problems were associated with the quality of the owner-dog relationship and the intention to re-procure a dog of the same breed. Owners of each of the four breeds (750/breed) were randomly drawn from a nationwide Danish dog registry and invited to participate. Of these, 911 responded, giving a final sample of 846. There were clear differences between owners of the four breeds with respect to degree of planning prior to purchase, with owners of Chihuahuas exhibiting less. Motivations behind choice of dog were also different. Health and other breed attributes were more important to owners of Cairn Terriers, whereas the dog's personality was reported to be more important for owners of French Bulldogs and Cavalier King Charles Spaniels but less important for Chihuahua owners. Higher levels of health and behavior problems were positively associated with a closer owner-dog relationship for owners of Cavalier King Charles Spaniels and Chihuahuas but, for owners of French Bulldogs, high levels of problems were negatively associated with an intention to procure the same breed again. In light of these findings, it appears less paradoxical that people continue to buy dogs with welfare problems.
Analyzing phenological extreme events over the past five decades in Germany
NASA Astrophysics Data System (ADS)
Schleip, Christoph; Menzel, Annette; Estrella, Nicole; Graeser, Philipp
2010-05-01
As climate change may alter the frequency and intensity of extreme temperatures, we analysed whether the warming of the last five decades has already changed the statistics of phenological extreme events. In this context, two extreme value statistical concepts are discussed and applied to existing phenological datasets of the German Weather Service (DWD) in order to derive probabilities of occurrence for extremely early or late phenological events. We analyse four phenological groups: "begin of flowering", "leaf foliation", "fruit ripening" and "leaf colouring", as well as DWD indicator phases of the "phenological year". Additionally, we put an emphasis on a between-species analysis, comparing differences in extreme onsets between three common northern conifers, and we conducted a within-species analysis with different phases of horse chestnut throughout a year. The first statistical approach fits the data to a Gaussian model using traditional statistical techniques and then analyses the extreme quantile. The key point of this approach is fitting an appropriate probability density function (PDF) to the observed data and assessing the change of the PDF parameters in time. The full analytical description in terms of the estimated PDF for defined time steps of the observation period allows probability assessments of extreme values for, e.g., annual or decadal time steps. Related to this approach is the possibility of counting the onsets that fall within our defined extreme percentiles. The estimation of the probability of extreme events on the basis of the whole data set contrasts with analyses based on the generalized extreme value (GEV) distribution. The second approach deals with the extreme PDFs themselves and fits the GEV distribution to annual minima of phenological series to provide useful estimates of return levels.
For flowering and leaf unfolding phases, exceptionally early extremes are seen since the mid 1980s, especially in the single years 1961, 1990 and 2007, whereas exceptionally late events are seen in the year 1970. Summer phases such as fruit ripening exhibit stronger shifts to early extremes than spring phases. Leaf colouring phases reveal an increasing probability of late extremes. The GEV-estimated 100-year events for Picea, Pinus and Larix amount to extremely early events of about -27, -31.48 and -32.79 days, respectively. If we assume non-stationary minimum data, we get a more extreme 100-year event of about -35.40 days for Picea, but with wider confidence intervals. The GEV is simply another probability distribution, but for purposes of extreme analysis in phenology it should be considered as equally important as (if not more important than) the Gaussian PDF approach.
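Fitting a GEV to annual minima, as in the second approach above, is usually done by negating the data so that minima become maxima. A sketch on synthetic onset anomalies (all numbers are hypothetical, not the DWD series):

```python
import numpy as np
from scipy.stats import genextreme

# Hypothetical annual earliest-onset anomalies (days relative to the mean
# onset date): each year's minimum over 30 synthetic observations.
rng = np.random.default_rng(7)
onsets = rng.normal(0, 8, size=(50, 30)).min(axis=1)

# Minima of X are maxima of -X, so fit the GEV to the negated series
c, loc, scale = genextreme.fit(-onsets)

# 100-year early event: negate the 100-year return level of -X
rl = genextreme.ppf(1 - 1 / 100, c, loc=loc, scale=scale)
print(f"100-year early extreme: about {-rl:.1f} days")
```

Negative values here play the same role as the roughly -27 to -33 day estimates quoted for the three conifers.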
Kavlak, Erdoğan; Altuğ, Filiz; Büker, Nihal; Şenol, Hande
2015-01-01
The objective of this study is to investigate the musculoskeletal system problems and quality of life of mothers of children with cerebral palsy with different levels of disability. 100 children (37 girls and 63 boys) with cerebral palsy (CP) and their mothers were included in this study. Functional levels of the children with CP were assessed using the Gross Motor Function Classification System (GMFCS) and the Pediatric Functional Independence Measure (WeeFIM). The health-related quality of life of the mothers was assessed using the Nottingham Health Profile (NHP). Musculoskeletal system problems of the mothers were assessed using the Neck Disability Index (NDI) and the Roland-Morris Disability Questionnaire (RMDQ). No statistically significant difference was found when the GMFCS levels of the children with CP and the NHP, DASH-T, RMDQ, NDI and BAI values of the mothers were compared between groups (p > 0.05). When the NHP parameters and the existence of lower back and arm pain in the mothers were compared with their BAI, NDI, RMDQ and DASH-T scores, a statistically significant relationship was found among them (p < 0.05). As the functional levels of children with CP worsen, the upper extremity, lower back and neck problems and anxiety levels of the mothers increase, and this situation negatively affects the mothers' quality of life.
Do causes of stress differ in their association with problem drinking by sex in Korean adolescents?
Choi, Jae-Woo; Park, Eun-Cheol; Kim, Jae-Hyun; Park, So-Hee
2017-01-01
Previous studies have focused mainly on whether stress causes current drinking or excessive drinking. However, few studies have been conducted on the relationship between stress and problem drinking in adolescents. The objective of this study was to examine the stress level and the cause of stress related to problem drinking behavior according to sex among Korean youth. Data for this study were pooled from cross-sectional data collected annually from 2007 through 2012 for the Korea Youth Risk Behavior Web-based Survey. A representative sample of 442,113 students from 800 randomly selected middle and high schools in Korea was included. Multiple logistic regression models were used in the analysis. Both male and female students with extremely high stress were more likely to engage in problem drinking than were students with no stress (odds ratios [OR], 1.73 in males and 1.41 in females). The major causes of stress in male students that were associated with problem drinking were conflict with a teacher, trouble with parents, and peer relationships (ORs, 2.47, 1.72, and 1.71, respectively), whereas there was no statistically significant association between these causes of stress and problem drinking among female students. Considering the stress level, male students with an extremely high stress level were associated with problem drinking regardless of the cause of stress, while female students who felt extremely high levels of stress were more likely to engage in problem drinking due to stress from a conflict with parents, peer relationships, appearance, and financial difficulty (ORs, 1.53, 1.53, 1.46, and 1.47, respectively). Adolescents who engage in problem drinking may be affected by different causes of stress according to sex.
Thus, appropriate approaches that reflect sex differences will be helpful in alleviating problem drinking in adolescents, and educational authorities need to arrange more effective drinking-education programs, given the positive associations between drinking education and problem drinking. Copyright © 2016. Published by Elsevier Ltd.
Scaling Reward Value with Demand Curves versus Preference Tests
Schwartz, Lindsay P.; Silberberg, Alan; Casey, Anna H.; Paukner, Annika; Suomi, Stephen J.
2016-01-01
In Experiment 1, six capuchins lifted a weight during a 10-minute session to receive a food piece. Across conditions, the weight was increased across six different amounts for three different food types. The number of food pieces obtained as a function of the weight lifted was fitted by a demand equation that is hypothesized to quantify food value. For most subjects, this analysis showed that the three food types differed little in value. In Experiment 2, these monkeys were given pairwise choices among these food types. In 13 of 18 comparisons, preferences at least equaled a 3-to-1 ratio; in seven comparisons, preference was absolute. There was no relation between values based on degree of preference versus values based on the demand equation. When choices in the present report were compared to similar data with these subjects from another study, between-study lability in preference emerged. This outcome contrasts with the finding in demand analysis that test-retest reliability is high. We attribute the unreliability and extreme assignment of value based on preference tests to high substitutability between foods. We suggest use of demand analysis instead of preference tests for studies that compare the values of different foods. A better strategy might be to avoid manipulating value by using different foods. Where possible, value should be manipulated by varying amounts of a single food type because, over an appropriate range, more food is consistently more valuable than less. Such an approach would be immune to problems in between-food substitutability. PMID:26908005
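The "demand equation" referred to above is often the exponential demand equation of Hursh and Silberberg (2008); whether the study used exactly that form is an assumption here, and all data values below are hypothetical. A sketch of fitting it to weight-price versus consumption data:

```python
import numpy as np
from scipy.optimize import curve_fit

# Exponential demand equation (Hursh & Silberberg, 2008):
#   log10 Q = log10 Q0 + K * (exp(-alpha * Q0 * C) - 1)
# Q: consumption, C: price (here, grams lifted), Q0: consumption at zero
# price, alpha: rate of decline ("essential value"); K fixed as a range
# constant for comparability across foods.
K = 2.0

def log_demand(C, Q0, alpha):
    return np.log10(Q0) + K * (np.exp(-alpha * Q0 * C) - 1)

# Hypothetical data: food pieces earned at six weight requirements
C = np.array([100, 200, 400, 600, 800, 1000], dtype=float)
Q = np.array([120, 110, 85, 60, 35, 15], dtype=float)

(Q0, alpha), _ = curve_fit(log_demand, C, np.log10(Q), p0=[120.0, 1e-5])
print(f"Q0 = {Q0:.0f}, alpha = {alpha:.2e} (smaller alpha = higher value)")
```

Comparing fitted alpha across foods is what makes demand analysis a scale of value independent of pairwise preference, which is the contrast the abstract draws.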
Extreme sea storm in the Mediterranean Sea. Trends during the 2nd half of the 20th century.
NASA Astrophysics Data System (ADS)
Pino, C.; Lionello, P.; Galati, M. B.
2009-04-01
Extreme sea storm in the Mediterranean Sea. Trends during the 2nd half of the 20th century Piero Lionello, University of Salento, piero.lionello@unisalento.it Maria Barbara Galati, University of Salento, mariabarbara.galati@unisalento.it Cosimo Pino, University of Salento, pino@le.infn.it The analysis of extreme Significant Wave Height (SWH) values and their trends is crucial for planning and managing coastal defences and off-shore activities. The analysis provided by this study covers a 44-year period (1958-2001). First, the WW3 (Wave Watch 3) model, forced with the REMO-Hipocas regional model wind fields, was used to hindcast extreme SWH values over the Mediterranean basin at 0.25 deg lat-lon resolution. Subsequently, the model results were processed with ad hoc software to detect storms. GEV analysis has been performed and a set of indicators for extreme SWH has been computed, using the Mann-Kendall test to assess the statistical significance of trends in parameters such as the number of extreme events, their duration and their intensity. Results suggest a transition towards weaker extremes and a milder climate over most of the Mediterranean Sea.
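The Mann-Kendall test used above for trend significance is a nonparametric rank test and is simple to implement; a minimal Python sketch (omitting the tie correction that a full implementation includes) might look like:

```python
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Mann-Kendall trend test (no tie correction).
    Returns the S statistic, the normal score Z, and the two-sided p-value.
    S > 0 suggests an increasing trend, S < 0 a decreasing one."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # S counts concordant minus discordant pairs over all i < j
    s = sum(np.sign(x[i + 1:] - x[i]).sum() for i in range(n - 1))
    # Variance of S under the null hypothesis of no trend (no ties)
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    return s, z, 2.0 * norm.sf(abs(z))

# A monotonically increasing series yields a highly significant trend
s, z, p = mann_kendall(np.arange(30.0))
```

Applied to a yearly indicator series (e.g., annual storm counts), a small p-value flags a statistically significant monotonic trend.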
ERIC Educational Resources Information Center
Cawley, Robert
1978-01-01
Considers the problem of determining the force on an element of a finite length line of charge moving horizontally with extreme relativistic speed through an evacuated space above an infinite plane ideal conducting surface. (SL)
Fighting Illiteracy in the Arab World
ERIC Educational Resources Information Center
Hammud, Muwafaq Abu; Jarrar, Amani G.
2017-01-01
Fighting illiteracy in the Arab world is becoming an urgent necessity, particularly in the face of poverty, ignorance, and extremism, which impede the required economic, social, political and cultural development processes. Extremism, violence and terrorism in the Arab world can only be eliminated by the spreading of knowledge and the fighting of illiteracy. The study…
Research in Stochastic Processes
1988-08-31
stationary sequence, Stochastic Proc. Appl. 29, 1988, 155-169. T. Hsing, J. Hüsler and M.R. Leadbetter, On the exceedance point process for a stationary...Nandagopalan, On exceedance point processes for "regular" sample functions, Proc. Volume, Oberwolfach Conf. on Extreme Value Theory, J. Hüsler and R. Reiss...exceedance point processes for stationary sequences under mild oscillation restrictions, Apr. 88. Oberwolfach Conf. on Extreme Value Theory. Ed. J. Hüsler
Quinn, Terrance; Sinkala, Zachariah
2014-01-01
We develop a general method for computing extreme value distribution (Gumbel, 1958) parameters for gapped alignments. Our approach uses mixture distribution theory to obtain associated BLOSUM matrices for gapped alignments, which in turn are used for determining significance of gapped alignment scores for pairs of biological sequences. We compare our results with parameters already obtained in the literature.
Ensemble-based evaluation of extreme water levels for the eastern Baltic Sea
NASA Astrophysics Data System (ADS)
Eelsalu, Maris; Soomere, Tarmo
2016-04-01
The risks and damages associated with coastal flooding, which naturally increase with the magnitude of extreme storm surges, are one of the largest concerns of countries with extensive low-lying nearshore areas. The relevant risks are even more pronounced for semi-enclosed water bodies such as the Baltic Sea, where subtidal (weekly-scale) variations in the water volume of the sea substantially contribute to the water level and lead to a large spread in projections of future extreme water levels. We explore the options for using large ensembles of projections to more reliably evaluate return periods of extreme water levels. Single projections of the ensemble are constructed by fitting several sets of block maxima with various extreme value distributions. The ensemble is based on two simulated data sets produced at the Swedish Meteorological and Hydrological Institute: a hindcast by the Rossby Centre Ocean model sampled at a resolution of 6 h, and a similar hindcast by the circulation model NEMO at a resolution of 1 h. As the annual maxima of water levels in the Baltic Sea are not always uncorrelated, we employ maxima for both calendar years and stormy seasons. As the shape parameter of the Generalised Extreme Value distribution changes its sign and varies substantially in magnitude along the eastern coast of the Baltic Sea, the use of a single distribution for the entire coast is inappropriate. The ensemble involves projections based on the Generalised Extreme Value, Gumbel and Weibull distributions. The parameters of these distributions are evaluated in three ways: the maximum likelihood method, and the method of moments based on both biased and unbiased estimates. The total number of projections in the ensemble is 40. As some of the resulting estimates contain limited additional information, the members of pairs of projections that are highly correlated are assigned weights of 0.6.
A comparison of the ensemble-based projection of extreme water levels and their return periods with similar estimates derived from local observations reveals an interesting pattern of match and mismatch. The match is almost perfect in measurement sites where local effects (e.g., wave-induced set-up or local surge in very shallow areas that are not resolved by circulation models) do not contribute to the observed water levels. There is, however, substantial mismatch between projected and observed extreme values for most of the Estonian coast. The mismatch is largest for sections that are open to high waves and for several bays that are deeply cut into the mainland but open to the predominant strong wind directions. Detailed quantification of this mismatch eventually makes it possible to develop substantially improved estimates of extreme water levels in sections where local effects contribute considerably to the total water level.
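One member of an ensemble of this kind can be built in Python with scipy by fitting a block-maxima sample with several extreme value distributions and reading off return levels. The sketch below is illustrative only: the data are synthetic, standing in for the hindcast annual maxima, and only two of the distribution families are shown.

```python
import numpy as np
from scipy import stats

# Synthetic annual maxima standing in for the hindcast water levels (cm)
rng = np.random.default_rng(42)
annual_max = stats.gumbel_r.rvs(loc=100.0, scale=15.0, size=60, random_state=rng)

return_period = 50.0  # years
members = {}
for name, dist in [("GEV", stats.genextreme), ("Gumbel", stats.gumbel_r)]:
    params = dist.fit(annual_max)  # maximum likelihood fit
    # Return level: the quantile exceeded on average once per return period
    members[name] = dist.ppf(1.0 - 1.0 / return_period, *params)

# A simple (unweighted) ensemble estimate of the 50-year water level
ensemble_level = float(np.mean(list(members.values())))
```

The actual study additionally uses the Weibull family, method-of-moments estimators, and correlation-based down-weighting of near-duplicate members.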
Some Problems of Extremes in Geometry and Construction
ERIC Educational Resources Information Center
Yanovsky, Levi
2008-01-01
Two original problems in geometry are presented with solutions utilizing differential calculus: (a) a rectangle inscribed in a sector; (b) a point on the ray of an angle. The possibility of applying mathematics in general, and differential calculus in particular, to the solution of practical problems is discussed. (Contains 8 figures.)
Extreme ultraviolet index due to broken clouds at a midlatitude site, Granada (southeastern Spain)
NASA Astrophysics Data System (ADS)
Antón, M.; Piedehierro, A. A.; Alados-Arboledas, L.; Wolfran, E.; Olmo, F. J.
2012-11-01
Cloud cover usually attenuates ultraviolet (UV) solar radiation but, under certain sky conditions, clouds may produce an enhancement effect, increasing UV levels at the surface. The main objective of this paper is to analyze an extreme UV enhancement episode recorded on 16 June 2009 at Granada (southeastern Spain). This phenomenon was characterized by a quick and intense increase in surface UV radiation under broken cloud fields (5-7 oktas) in which the Sun was surrounded by cumulus clouds (confirmed with sky images). Thus, the UV index (UVI) showed an enhancement by a factor of 4 in the course of only 30 min around midday, varying from 2.6 to 10.4 (higher than the corresponding clear-sky UVI value). Additionally, the UVI presented values higher than 10 (extreme erythemal risk) for about 20 min running, with a maximum value around 11.5. The use of an empirical model and the total ozone column (TOC) derived from the Global Ozone Monitoring Experiment (GOME) for the period 1995-2011 showed that the value of UVI ~ 11.5 is substantially larger than the highest index that natural TOC variations over Granada could produce. Finally, the UV erythemal dose accumulated during the 20-min period with extreme UVI values under broken cloud fields was 350 J/m2, which surpasses the energy required to produce sunburn in most human skin types.
Isokinetic profile of elbow flexion and extension strength in elite junior tennis players.
Ellenbecker, Todd S; Roetert, E Paul
2003-02-01
Descriptive study. To determine whether bilateral differences exist in concentric elbow flexion and extension strength in elite junior tennis players. The repetitive nature of tennis frequently produces upper extremity overuse injuries. Prior research has identified tennis-specific strength adaptation in the dominant shoulder and distal upper extremity musculature of elite players. No previous study has addressed elbow flexion and extension strength. Thirty-eight elite junior tennis players were bilaterally tested for concentric elbow flexion and extension muscle performance on a Cybex 6000 isokinetic dynamometer at 90 degrees/s, 210 degrees/s, and 300 degrees/s. Repeated-measures ANOVAs were used to test for differences between extremities, muscle groups, and speed. Significantly greater (P<0.002) dominant-arm elbow extension peak torque values were measured at 90 degrees/s, 210 degrees/s, and 300 degrees/s for males. Significantly greater (P<0.002) dominant-arm single-repetition work values were also measured at 90 degrees/s and 210 degrees/s for males. No significant difference was measured between extremities in elbow flexion muscular performance in males and for elbow flexion or extension peak torque and single-repetition work values in females. No significant difference between extremities was measured in elbow flexion/extension strength ratios in females and significant differences between extremities in this ratio were only present at 210 degrees/s in males (P<0.002). These data indicate muscular adaptations around the dominant elbow in male elite junior tennis players but not females. These data have ramifications for clinicians rehabilitating upper extremity injuries in patients from this population.
Climatic extremes improve predictions of spatial patterns of tree species
Zimmermann, N.E.; Yoccoz, N.G.; Edwards, T.C.; Meier, E.S.; Thuiller, W.; Guisan, Antoine; Schmatz, D.R.; Pearman, P.B.
2009-01-01
Understanding niche evolution, dynamics, and the response of species to climate change requires knowledge of the determinants of the environmental niche and species range limits. Mean values of climatic variables are often used in such analyses. In contrast, the increasing frequency of climate extremes suggests the importance of understanding their additional influence on range limits. Here, we assess how measures representing climate extremes (i.e., interannual variability in climate parameters) explain and predict spatial patterns of 11 tree species in Switzerland. We find clear, although comparably small, improvement (+20% in adjusted D2, +8% and +3% in cross-validated True Skill Statistic and area under the receiver operating characteristics curve values) in models that use measures of extremes in addition to means. The primary effect of including information on climate extremes is a correction of local overprediction and underprediction. Our results demonstrate that measures of climate extremes are important for understanding the climatic limits of tree species and assessing species niche characteristics. The inclusion of climate variability likely will improve models of species range limits under future conditions, where changes in mean climate and increased variability are expected.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ghosh, Subimal; Das, Debasish; Kao, Shih-Chieh
Recent studies disagree on how rainfall extremes over India have changed in space and time over the past half century, as well as on whether the changes observed are due to global warming or regional urbanization. Although a uniform and consistent decrease in moderate rainfall has been reported, a lack of agreement about trends in heavy rainfall may be due in part to differences in the characterization and spatial averaging of extremes. Here we use extreme value theory to examine trends in Indian rainfall over the past half century in the context of long-term, low-frequency variability. We show that when generalized extreme value theory is applied to annual maximum rainfall over India, no statistically significant spatially uniform trends are observed, in agreement with previous studies using different approaches. Furthermore, our space-time regression analysis of the return levels points to increasing spatial variability of rainfall extremes over India. Our findings highlight the need for systematic examination of global versus regional drivers of trends in Indian rainfall extremes, and may help to inform flood hazard preparedness and water resource management in the region.
Sequences of extremal radially excited rotating black holes.
Blázquez-Salcedo, Jose Luis; Kunz, Jutta; Navarro-Lérida, Francisco; Radu, Eugen
2014-01-10
In the Einstein-Maxwell-Chern-Simons theory the extremal Reissner-Nordström solution is no longer the single extremal solution with vanishing angular momentum, when the Chern-Simons coupling constant reaches a critical value. Instead a whole sequence of rotating extremal J=0 solutions arises, labeled by the node number of the magnetic U(1) potential. Associated with the same near horizon solution, the mass of these radially excited extremal solutions converges to the mass of the extremal Reissner-Nordström solution. On the other hand, not all near horizon solutions are also realized as global solutions.
Cazelle, Elodie; Eskes, Chantra; Hermann, Martina; Jones, Penny; McNamee, Pauline; Prinsen, Menk; Taylor, Hannah; Wijnands, Marcel V W
2015-04-01
A.I.S.E. investigated the suitability of the regulatory adopted ICE in vitro test method (OECD TG 438) with or without histopathology to identify detergent and cleaning formulations having extreme pH that require classification as EU CLP/UN GHS Category 1. To this aim, 18 extreme pH detergent and cleaning formulations were tested covering both alkaline and acidic extreme pHs. The ICE standard test method following OECD Test Guideline 438 showed good concordance with in vivo classification (83%) and good and balanced specificity and sensitivity values (83%) which are in line with the performances of currently adopted in vitro test guidelines, confirming its suitability to identify Category 1 extreme pH detergent and cleaning products. In contrast to previous findings obtained with non-extreme pH formulations, the use of histopathology did not improve the sensitivity of the assay whilst it strongly decreased its specificity for the extreme pH formulations. Furthermore, use of non-testing prediction rules for classification showed poor concordance values (33% for the extreme pH rule and 61% for the EU CLP additivity approach) with high rates of over-prediction (100% for the extreme pH rule and 50% for the additivity approach), indicating that these non-testing prediction rules are not suitable to predict Category 1 hazards of extreme pH detergent and cleaning formulations. Copyright © 2015 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Pei, S.; Laws, E. A.; Ye, S.
2017-12-01
Fluvial inputs of nutrients and efficient nutrient recycling mechanisms make estuarine and coastal zones highly productive bodies of water. For the same reasons, they are susceptible to eutrophication problems. In China, eutrophication problems along coasts are becoming serious because of discharges of domestic sewage and industrial wastewater and runoff of agricultural fertilizer. Addressing these problems requires an informed assessment of the factors that control algal production. Our study aims at determining the factors controlling the patchiness of phytoplankton and primary production in Liaodong Bay, China, which receives large inputs of nutrients from human activities in its watershed, and at examining the variation patterns of phytoplankton photosynthesis under the combined stressors of climate change and human activities. Results of our field study suggest that nutrient concentrations were above growth-rate-saturating concentrations throughout Liaodong Bay, with the possible exception of phosphate at some stations. This assessment was consistent with the results of nutrient enrichment experiments and the values of light-saturated photosynthetic rates and areal photosynthetic rates. Two large patches of high biomass and production with dimensions on the order of 10 km reflect the effects of water temperature and the variation of light penetration restricted by water turbidity. To examine the effects of irradiance and temperature on light-saturated photosynthetic rates normalized to chlorophyll a concentrations (Popt), light-conditioned Popt values were modeled as a function of temperature with a satisfactory fit to our field data (R2 = 0.60, p = 0.003). In this model, light-conditioned Popt values increased with temperature from 22°C to roughly 25°C but declined precipitously at higher temperatures.
The relatively high Popt values and low ratios of light absorbed to photosynthesis at coastal stations suggest highly efficient use of absorbed light by phytoplankton under replete nutrient levels and favorable temperatures. By comparison, the low Popt values and high ratios of light absorbed to photosynthesis at estuarine stations suggest rather extreme light limitation and inefficient use of absorbed light in photosynthesis in the Liaohe River estuary.
Long-term statistics of extreme tsunami height at Crescent City
NASA Astrophysics Data System (ADS)
Dong, Sheng; Zhai, Jinjin; Tao, Shanshan
2017-06-01
Historically, Crescent City is one of the most vulnerable communities impacted by tsunamis along the west coast of the United States, largely owing to its offshore geography. Trans-ocean tsunamis usually produce large wave runup at Crescent Harbor, resulting in catastrophic damage, property loss and loss of life. Determining the return values of tsunami height from relatively short-term observation data is of great significance for assessing tsunami hazards and improving engineering design along the coast of Crescent City. In the present study, the extreme tsunami heights observed along the coast of Crescent City from 1938 to 2015 are fitted using six different probabilistic distributions, namely, the Gumbel distribution, the Weibull distribution, the maximum entropy distribution, the lognormal distribution, the generalized extreme value distribution and the generalized Pareto distribution. The maximum likelihood method is applied to estimate the parameters of all the above distributions. Both the Kolmogorov-Smirnov test and the root mean square error method are utilized for goodness-of-fit testing, and the best-fitting distribution is selected. Assuming that the occurrence frequency of tsunamis in each year follows the Poisson distribution, the Poisson compound extreme value distribution can be used to fit the annual maximum tsunami amplitude, and then point and interval estimates of return tsunami heights are calculated for structural design. The results show that the Poisson compound extreme value distribution fits tsunami heights very well and is suitable for determining return tsunami heights for coastal disaster prevention.
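The Poisson compound extreme value distribution mentioned above combines a Poisson model for the annual number of events with a per-event height distribution F: the annual-maximum CDF becomes G(h) = exp(-λ(1 − F(h))). The sketch below inverts this relation for a given return period; the event rate and the Gumbel parameters are invented for illustration, not fitted to the Crescent City record.

```python
import math
from scipy import stats

def compound_return_height(lam, dist, params, T):
    """Height with return period T years under the Poisson compound model
    G(h) = exp(-lam * (1 - F(h))), where lam is the mean number of
    events per year and F is the fitted per-event height CDF."""
    # Solve G(h) = 1 - 1/T for h, i.e. F(h) = 1 + ln(1 - 1/T) / lam
    p = 1.0 + math.log(1.0 - 1.0 / T) / lam
    return dist.ppf(p, *params)

# Illustrative only: lam = 2 events/year, Gumbel-distributed event heights (m)
gumbel_params = (1.0, 0.5)  # loc, scale
h50 = compound_return_height(2.0, stats.gumbel_r, gumbel_params, 50.0)
h100 = compound_return_height(2.0, stats.gumbel_r, gumbel_params, 100.0)
```

The same function works with any fitted scipy distribution (Weibull, lognormal, GEV, GPD), which is how the competing candidate distributions can be compared on an equal footing.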
Extreme geomagnetically induced currents
NASA Astrophysics Data System (ADS)
Kataoka, Ryuho; Ngwira, Chigomezyo
2016-12-01
We propose an emergency alert framework for geomagnetically induced currents (GICs), based on empirical extreme values and theoretical upper limits of the solar wind parameters and of dB/dt, the time derivative of magnetic field variations at ground. We expect this framework to be useful for preparing against extreme events. Our analysis is based on a review of various papers, including those presented during Extreme Space Weather Workshops held in Japan in 2011, 2012, 2013, and 2014. Large-amplitude dB/dt values are the major cause of hazards associated with three different types of GICs: (1) slow dB/dt with ring current evolution (RC-type), (2) fast dB/dt associated with auroral electrojet activity (AE-type), and (3) transient dB/dt of sudden commencements (SC-type). We set "caution," "warning," and "emergency" alert levels during the main phase of superstorms with a peak Dst index of less than -300 nT (once per 10 years), -600 nT (once per 60 years), or -900 nT (once per 100 years), respectively. The extreme dB/dt values of the AE-type GICs are 2000, 4000, and 6000 nT/min at the caution, warning, and emergency levels, respectively. For the SC-type GICs, a "transient alert" is also proposed for dB/dt values of 40 nT/s at low latitudes and 110 nT/s at high latitudes, especially when the solar energetic particle flux is unusually high.
Improving power and robustness for detecting genetic association with extreme-value sampling design.
Chen, Hua Yun; Li, Mingyao
2011-12-01
Extreme-value sampling design, which samples subjects with extremely large or small quantitative trait values, is commonly used in genetic association studies. Samples in such designs are often treated as "cases" and "controls" and analyzed using logistic regression. Such a case-control analysis ignores the potential dose-response relationship between the quantitative trait and the underlying trait locus and thus may lead to loss of power in detecting genetic association. An alternative approach to analyzing such data is to model the dose-response relationship by a linear regression model. However, parameter estimation from this model can be biased, which may lead to inflated type I errors. We propose a robust and efficient approach that takes into consideration both the biased sampling design and the potential dose-response relationship. Extensive simulations demonstrate that the proposed method is more powerful than the traditional logistic regression analysis and more robust than the linear regression analysis. We applied our method to the analysis of a candidate gene association study on high-density lipoprotein cholesterol (HDL-C), which includes study subjects with extremely high or low HDL-C levels. Using our method, we identified several SNPs showing stronger evidence of association with HDL-C than the traditional case-control logistic regression analysis. Our results suggest that it is important to appropriately model the quantitative trait and to adjust for the biased sampling when a dose-response relationship exists in extreme-value sampling designs. © 2011 Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Rieder, Harald E.; di Rocco, Stefania; Staehelin, Johannes; Maeder, Jörg A.; Ribatet, Mathieu; Peter, Thomas; Davison, Anthony C.
2010-05-01
Tools from geostatistics and extreme value theory are applied to analyze spatial correlations in total ozone for the southern mid-latitudes. The dataset used in this study is the NIWA-assimilated total ozone dataset (Bodeker et al., 2001; Müller et al., 2008). Recently, new tools from extreme value theory (Coles, 2001; Ribatet, 2007) have been applied to the world's longest total ozone record from Arosa, Switzerland (e.g. Staehelin 1998a,b) and 5 other long-term ground-based stations to describe extreme events in low and high total ozone (Rieder et al., 2010a,b,c). Excursions in the frequency of extreme events reveal "fingerprints" of dynamical factors, such as ENSO or NAO, and chemical factors, such as cold Arctic vortex ozone losses, as well as major volcanic eruptions of the 20th century (e.g. Gunung Agung, El Chichón, Mt. Pinatubo). Furthermore, atmospheric loading of ozone depleting substances leads to a continuous modification of column ozone in the northern hemisphere, also with respect to extreme values (partly again in connection with polar vortex contributions). It is shown that application of extreme value theory allows the identification of many more such fingerprints than conventional time series analysis on the basis of annual and seasonal mean values. In particular, the analysis shows the strong influence of dynamics, revealing that even moderate ENSO and NAO events have a discernible effect on total ozone (Rieder et al., 2010b,c). Within the current study, patterns in the spatial correlation and frequency distributions of extreme events (e.g. ELOs and EHOs) are studied for the southern mid-latitudes. It is analyzed whether the "fingerprints" found in the northern hemisphere also occur in the southern mid-latitudes. New insights into spatial patterns of total ozone for the southern mid-latitudes are presented. Within this study the influence of changes in atmospheric dynamics (e.g. 
tropospheric and lower stratospheric pressure systems, ENSO) as well as the influence of major volcanic eruptions (e.g. Mt. Pinatubo) and ozone depleting substances (ODS) on column ozone over the southern mid-latitudes is analyzed for the time period 1979-2007. References: Bodeker, G.E., J.C. Scott, K. Kreher, and R.L. McKenzie, Global ozone trends in potential vorticity coordinates using TOMS and GOME intercompared against the Dobson network: 1978-1998, J. Geophys. Res., 106 (D19), 23029-23042, 2001. Coles, S.: An Introduction to Statistical Modeling of Extreme Values, Springer Series in Statistics, ISBN:1852334592, Springer, Berlin, 2001. Müller, R., Grooß, J.-U., Lemmen, C., Heinze, D., Dameris, M., and Bodeker, G.: Simple measures of ozone depletion in the polar stratosphere, Atmos. Chem. Phys., 8, 251-264, 2008. Ribatet, M.: POT: Modelling peaks over a threshold, R News, 7, 34-36, 2007. Rieder, H.E., Staehelin, J., Maeder, J.A., Ribatet, M., Stübi, R., Weihs, P., Holawe, F., Peter, T., and Davison, A.D. (2010): Extreme events in total ozone over Arosa - Part I: Application of extreme value theory, to be submitted to ACPD. Rieder, H.E., Staehelin, J., Maeder, J.A., Ribatet, M., Stübi, R., Weihs, P., Holawe, F., Peter, T., and Davison, A.D. (2010): Extreme events in total ozone over Arosa - Part II: Fingerprints of atmospheric dynamics and chemistry and effects on mean values and long-term changes, to be submitted to ACPD. Rieder, H.E., Jancso, L.M., Staehelin, J., Maeder, J.A., Ribatet, M., Peter, T., and Davison, A.D. (2010): Extreme events in total ozone over the northern mid-latitudes: A case study based on long-term data sets from 5 ground-based stations, in preparation. Staehelin, J., Kegel, R., and Harris, N. R.: Trend analysis of the homogenized total ozone series of Arosa (Switzerland), 1929-1996, J. Geophys. Res., 103(D7), 8389-8400, doi:10.1029/97JD03650, 1998a. 
Staehelin, J., Renaud, A., Bader, J., McPeters, R., Viatte, P., Hoegger, B., Bugnion, V., Giroud, M., and Schill, H.: Total ozone series at Arosa (Switzerland): Homogenization and data comparison, J. Geophys. Res., 103(D5), 5827-5842, doi:10.1029/97JD02402, 1998b.
Decadal oscillations and extreme value distribution of river peak flows in the Meuse catchment
NASA Astrophysics Data System (ADS)
De Niel, Jan; Willems, Patrick
2017-04-01
In flood risk management, flood probabilities are often quantified through Generalized Pareto distributions of river peak flows. One of the main underlying assumptions is that all data points originate from one single underlying distribution (the i.i.d. assumption). However, this hypothesis, although generally assumed to hold for variables such as river peak flows, remains somewhat questionable: flooding might indeed be caused by different hydrological and/or meteorological conditions. This study confirms findings from previous research by showing a clear indication of the link between atmospheric conditions and flooding for the Meuse river in The Netherlands: decadal oscillations of river peak flows can (at least partially) be attributed to the occurrence of westerly weather types. The study further proposes a method to take this correlation between atmospheric conditions and river peak flows into account when calibrating an extreme value distribution for river peak flows. Rather than calibrating one single distribution to the data and potentially violating the i.i.d. assumption, weather-type-dependent extreme value distributions are derived and composed. The study shows that, for the Meuse river in The Netherlands, such an approach results in a more accurate extreme value distribution, especially with regard to extrapolations. Comparison of the proposed method with a traditional extreme value analysis approach and an alternative model-based approach for the same case study shows strong differences in the peak flow extrapolation. The design flood for a 1,250-year return period is estimated at 4,800 m3s-1 for the proposed method, compared with 3,450 m3s-1 and 3,900 m3s-1 for the traditional method and a previous study, respectively. The methods were validated based on instrumental and documentary flood information from the past 500 years.
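A weather-type-dependent extreme value model of the kind described can be sketched by giving each weather type its own peaks-over-threshold rate and Generalized Pareto tail and composing the annual exceedance rates. All numbers below are invented for illustration; they are not the Meuse fits.

```python
from scipy import stats
from scipy.optimize import brentq

u = 1000.0  # peaks-over-threshold level, m^3/s (illustrative)

# Hypothetical per-weather-type GPD fits: annual exceedance rate, shape, scale
weather_types = {
    "westerly": dict(rate=2.5, shape=0.10, scale=400.0),
    "other":    dict(rate=1.0, shape=-0.05, scale=250.0),
}

def annual_exceedances(x):
    """Expected number of peaks per year exceeding discharge x,
    composed over the weather-type-specific distributions."""
    return sum(t["rate"] * stats.genpareto.sf(x - u, t["shape"], scale=t["scale"])
               for t in weather_types.values())

# Design flood for a 1,250-year return period: solve annual exceedance rate = 1/T
T = 1250.0
design_flood = brentq(lambda x: annual_exceedances(x) - 1.0 / T, u, 1e6)
```

Because each weather type keeps its own tail behavior, the composed extrapolation can differ strongly from a single-distribution fit, which is the effect the study reports.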
I know why you voted for Trump: (Over)inferring motives based on choice.
Barasz, Kate; Kim, Tami; Evangelidis, Ioannis
2018-05-10
People often speculate about why others make the choices they do. This paper investigates how such inferences are formed as a function of what is chosen. Specifically, when observers encounter someone else's choice (e.g., of political candidate), they use the chosen option's attribute values (e.g., a candidate's specific stance on a policy issue) to infer the importance of that attribute (e.g., the policy issue) to the decision-maker. Consequently, when a chosen option has an attribute whose value is extreme (e.g., an extreme policy stance), observers infer-sometimes incorrectly-that this attribute disproportionately motivated the decision-maker's choice. Seven studies demonstrate how observers use an attribute's value to infer its weight-the value-weight heuristic-and identify the role of perceived diagnosticity: more extreme attribute values give observers the subjective sense that they know more about a decision-maker's preferences, and in turn, increase the attribute's perceived importance. The paper explores how this heuristic can produce erroneous inferences and influence broader beliefs about decision-makers. Copyright © 2018 Elsevier B.V. All rights reserved.
Tribal Colleges: The Original Extreme Makeover Experts
ERIC Educational Resources Information Center
Powless, Donna
2015-01-01
In this article, the author states "our experience with education is a prime example in proving we are experts at problem-solving and are the originators of the extreme makeover." Educational institutions were introduced to the Native people in an outrageous manner--often as a mask for assimilating American Indians, routinely resulting…
Lazaro, Lionel E; Cordasco, Frank A
2017-02-01
In the young athlete, the shoulder is one of the most frequently injured joints during sports activities. The injuries result either from an acute traumatic event or from overuse. Shoulder examination can present challenges, given the multiple joints involved, the difficulty of palpating the underlying structures, and the potential for both intra- and/or extra-articular problems. Many shoulder examination tests can be positive for multiple problems. They usually have high sensitivity but low specificity and therefore low predictive value. The medical history coupled with a detailed physical exam can usually provide the information necessary to obtain an accurate diagnosis. A proficient shoulder examination and the development of an adequate differential diagnosis are important before considering advanced imaging. The shoulder complex relies upon the integrity of multiple structures for normal function. A detailed history is of paramount importance when evaluating young athletes with shoulder problems. A systematic physical examination is extremely important in guiding an accurate diagnosis. The patient's age and activity level are very important when considering the differential diagnosis. Findings obtained through history and physical examination should dictate the decision to obtain advanced imaging of the shoulder.
Gronlund, Carina J; Sullivan, Kyle P; Kefelegn, Yonathan; Cameron, Lorraine; O'Neill, Marie S
2018-08-01
Cold and hot weather are associated with mortality and morbidity. Although the burden of temperature-associated mortality may shift towards high temperatures in the future, cold temperatures may represent a greater current-day problem in temperate cities. Hot and cold temperature vulnerabilities may coincide across several personal and neighborhood characteristics, suggesting opportunities for increasing present and future resilience to extreme temperatures. We present a narrative literature review encompassing the epidemiology of cold- and heat-related mortality and morbidity, related physiologic and environmental mechanisms, and municipal responses to hot and cold weather, illustrated by Detroit, Michigan, USA, a financially burdened city in an economically diverse metropolitan area. The Detroit area experiences sharp increases in mortality and hospitalizations with extreme heat, while cold temperatures are associated with more gradual increases in mortality, with no clear threshold. Interventions such as heating and cooling centers may reduce but not eliminate temperature-associated health problems. Furthermore, direct hemodynamic responses to cold, sudden exertion, poor indoor air quality and respiratory epidemics likely contribute to cold-related mortality. Short- and long-term interventions to enhance energy and housing security and housing quality may reduce temperature-related health problems. Extreme temperatures can increase morbidity and mortality in municipalities like Detroit that experience both extreme heat and prolonged cold seasons amidst large socioeconomic disparities. The similarities in physiologic and built-environment vulnerabilities to both hot and cold weather suggest prioritization of strategies that address both present-day cold and near-future heat concerns. Copyright © 2018. Published by Elsevier B.V.
Identification and characterization of extraordinary rainstorms in Italy
NASA Astrophysics Data System (ADS)
Libertino, Andrea; Ganora, Daniele; Claps, Pierluigi
2017-04-01
Despite its generally mild climate, Italy, like most of the Mediterranean region, is prone to the development of "super-extreme" events with extraordinary rainfall intensities. The main triggering mechanisms of these events are nowadays quite well known, but more research is needed to transform this knowledge into directions for building updated rainstorm hazard maps at the national scale. Moreover, a precise definition of "super-extremes" is still lacking, since the original suggestion of a second specific EV1 component was made with the TCEV distribution. The above considerations led us to consider Italy a peculiar and challenging case study, where the geographic and orographic settings, associated with recurring storm-induced disasters, require an updated assessment of the "super-extreme" rainfall hazard at the country scale. Until now, the lack of a unified dataset of rainfall extremes has made this task difficult to accomplish. In this work we report the results of an analysis of a comprehensive and uniform set of annual rainfall maxima, collected from the different authorities in charge, representing the reference dataset of extremes for durations from 1 to 24 hours. The database includes more than 6000 measuring points nationwide, spanning the period 1916-2014. Our analysis aims at identifying a meaningful population of records deviating from an "ordinary" definition of the extreme value distribution, and at assessing the stationarity in the timing of these events at the national scale. The first problems to overcome relate to the non-uniform distribution of data in time and space. Then the evaluation of meaningful relative thresholds, aimed at selecting significant samples for the trend assessment, has to be addressed. A first investigation considers the events exceeding a threshold that identifies an average of one occurrence per year over all of Italy, i.e. with a 1/1000 overall probability of exceedance.
Geographic representation of these "outliers", scaled on local averages, shows some prevailing clustering on the Tyrrhenian coastal areas. Subsequent application of quantile regressions, aimed at minimizing the temporal non-uniformity of the samples, shows significant increasing trends in the extremes of very short duration. Further efforts have been undertaken to explore the selection of a common national set of higher-order parameters for all of Italy, which would make it less arduous to identify the probability of occurrence of "super-extremes" in the country.
Financial Toxicity of Cancer Care: It's Time to Intervene.
Zafar, S Yousuf
2016-05-01
Evidence suggests that a considerable proportion of cancer patients are affected by treatment-related financial harm. As medical debt grows for some with cancer, the downstream effects can be catastrophic, with a recent study suggesting a link between extreme financial distress and worse mortality. At least three factors might explain the relationship between extreme financial distress and greater risk of mortality: 1) overall poorer well-being, 2) impaired health-related quality of life, and 3) sub-par quality of care. While research has described the financial harm associated with cancer treatment, little has been done to effectively intervene in the problem. Long-term solutions must focus on policy changes to reduce unsustainable drug prices and promote innovative insurance models. In the meantime, patients continue to struggle with high out-of-pocket costs. For more immediate solutions, we should look to the oncologist and patient. Oncologists should focus on the value of care delivered, encourage patient engagement on the topic of costs, and be better educated on financial resources available to patients. For their part, patients need improved cost-related health literacy so they are aware of potential costs and resources, and research should focus on how patients define high-value care. With a growing list of financial side effects induced by cancer treatment, the time has come to intervene on the "financial toxicity" of cancer care. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Turkish Version of Kolcaba's Immobilization Comfort Questionnaire: A Validity and Reliability Study.
Tosun, Betül; Aslan, Özlem; Tunay, Servet; Akyüz, Aygül; Özkan, Hüseyin; Bek, Doğan; Açıksöz, Semra
2015-12-01
The purpose of this study was to determine the validity and reliability of the Turkish version of the Immobilization Comfort Questionnaire (ICQ). The sample used in this methodological study consisted of 121 patients undergoing lower extremity arthroscopy in a training and research hospital. The validity study of the questionnaire assessed language validity, structural validity and criterion validity. Structural validity was evaluated via exploratory factor analysis. Criterion validity was evaluated by assessing the correlation between the visual analog scale (VAS) scores (i.e., the comfort and pain VAS scores) and the ICQ scores using Spearman's correlation test. The Kaiser-Meyer-Olkin coefficient and Bartlett's test of sphericity were used to determine the suitability of the data for factor analysis. Internal consistency was evaluated to determine reliability. The data were analyzed with SPSS version 15.00 for Windows. Descriptive statistics were presented as frequencies, percentages, means and standard deviations. A p value ≤ .05 was considered statistically significant. A moderate positive correlation was found between the ICQ scores and the VAS comfort scores; a moderate negative correlation was found between the ICQ and the VAS pain measures in the criterion validity analysis. Cronbach α values of .75 and .82 were found for the first and second measurements, respectively. The findings of this study reveal that the ICQ is a valid and reliable tool for assessing the comfort of patients in Turkey who are immobilized because of lower extremity orthopedic problems. Copyright © 2015. Published by Elsevier B.V.
Yang, Wu-Bin; Niu, He-Cai; Sun, Wei-Dong; Shan, Qiang; Zheng, Yong-Fei; Li, Ning-Bo; Li, Cong-Ying; Arndt, Nicholas T.; Xu, Xing; Jiang, Yu-Hang; Yu, Xue-Yuan
2013-01-01
The Cretaceous represents one of the hottest greenhouse periods in the Earth's history, but some recent studies suggest that small ice caps might have been present in non-polar regions during certain periods of the Early Cretaceous. Here we report extremely negative δ18O values of −18.12‰ to −13.19‰ for early Aptian hydrothermal zircon from an A-type granite at Baerzhe in northeastern China. Given that A-type granite is anhydrous and that magmatic zircon of the Baerzhe granite has δ18O values close to mantle values, the extremely negative δ18O values for hydrothermal zircon are attributed to the addition of meteoric water with extremely low δ18O, most likely transported by glaciers. Considering the paleoaltitude of the region, continental glaciation is suggested to have occurred in the early Aptian, indicating much larger temperature fluctuations than previously thought during the supergreenhouse Cretaceous. This may have had an impact on the evolution of major organisms in the Jehol Group during this period. PMID:24061068
NASA Astrophysics Data System (ADS)
Ghiaei, Farhad; Kankal, Murat; Anilan, Tugce; Yuksek, Omer
2018-01-01
The analysis of rainfall frequency is an important step in hydrology and water resources engineering. However, a lack of measuring stations, short statistical record periods, and unreliable outliers are among the most important problems when designing hydrology projects. In this study, regional rainfall analysis based on L-moments was used to overcome these problems in the Eastern Black Sea Basin (EBSB) of Turkey. The L-moments technique was applied at all stages of the regional analysis, including the determination of homogeneous regions and the fitting and estimation of parameters of appropriate distribution functions in each homogeneous region. We studied annual maximum rainfall height values of various durations (5 min to 24 h) from seven rain gauge stations located in the EBSB, with gauging periods of 39 to 70 years. Homogeneity of the region was evaluated using L-moments. The goodness-of-fit criterion for each distribution was the ZDIST statistic, computed for several candidate distributions: generalized logistic (GLO), generalized extreme value (GEV), generalized normal (GNO), Pearson type 3 (PE3), and generalized Pareto (GPA). GLO and GEV were determined to be the best distributions for the short-duration (5 to 30 min) and long-duration (1 to 24 h) data, respectively. Based on these distribution functions, governing equations were derived for the calculation of intensities for return periods (T) of 2, 5, 25, 50, 100, 250, and 500 years. Subsequently, the T values for different rainfall intensities were estimated using data quantifying the maximum amount of rainfall at different durations. Using these T values, duration, altitude, latitude, and longitude were used as independent variables in a regression model of the data.
The coefficient of determination (R²) indicated that the model yields suitable results for the regional intensity-duration-frequency (IDF) relationship, which is necessary for the design of hydraulic structures in small and medium-sized catchments.
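The fit-then-extrapolate step described above can be sketched in a few lines. This is a minimal illustration on synthetic annual maxima: the study estimates parameters per homogeneous region via L-moments, whereas this sketch uses scipy's maximum-likelihood GEV fit, and all numbers are invented for the example.

```python
import numpy as np
from scipy.stats import genextreme

# Hypothetical annual-maximum rainfall depths for one duration at one station
rng = np.random.default_rng(42)
annual_maxima = genextreme.rvs(c=-0.1, loc=50, scale=12, size=60,
                               random_state=rng)

# Fit the GEV by maximum likelihood (scipy's shape c is -xi in the
# usual GEV convention)
c, loc, scale = genextreme.fit(annual_maxima)

def return_level(T):
    # the T-year return level is the quantile exceeded with prob. 1/T per year
    return genextreme.ppf(1 - 1 / T, c, loc=loc, scale=scale)

for T in (2, 25, 100, 500):
    print(f"{T:4d}-year return level: {return_level(T):.1f} mm")
```

The same quantile computation underlies the IDF equations: one fitted distribution per duration yields intensities for each return period T.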
Inter-model variability in hydrological extremes projections for Amazonian sub-basins
NASA Astrophysics Data System (ADS)
Andres Rodriguez, Daniel; Garofolo, Lucas; Lázaro de Siqueira Júnior, José; Samprogna Mohor, Guilherme; Tomasella, Javier
2014-05-01
Irreducible uncertainties, due to the limitations of knowledge, the chaotic nature of the climate system, and the human decision-making process, drive uncertainties in climate change projections. Such uncertainties affect impact studies, especially those concerning extreme events, and hinder the decision-making processes aimed at mitigation and adaptation. However, these uncertainties also open the possibility of exploratory analyses of a system's vulnerability to different scenarios. Using projections from several climate models makes it possible to address these uncertainty issues, exploring a wide range of potential impacts and their implications for potential vulnerabilities. Statistical approaches for the analysis of extreme values are usually based on stationarity assumptions. However, nonstationarity is relevant at the time scales considered for extreme value analyses and can have great implications in dynamic complex systems, especially under climate change. Because of this, nonstationarity must be considered in the statistical distribution parameters. We carried out a study of the dispersion in hydrological extremes projections, using climate change projections from several climate models to feed the Distributed Hydrological Model of the National Institute for Space Research, MHD-INPE, applied to Amazonian sub-basins. This large-scale hydrological model uses a TopModel approach to solve runoff-generation processes at the grid-cell scale. The MHD-INPE model was calibrated for 1970-1990 using observed meteorological data, comparing observed and simulated discharges with several performance coefficients. Hydrological model integrations were performed for the historical period (1970-1990) and for the future period (2010-2100).
Because climate models simulate the variability of the climate system in statistical terms rather than reproducing the historical behavior of climate variables, the performance of the hydrological model during the historical period, when fed with climate model data, was tested using descriptors of the flow duration curves. The analyses of projected extreme values were carried out considering the nonstationarity of the GEV distribution parameters, and were compared with extreme events in the present time. Results show inter-model variability in a broad dispersion of projected extreme values. Such dispersion implies different degrees of socio-economic impact associated with extreme hydrological events. Although no single optimum result exists, this variability allows the analysis of adaptation strategies and their potential vulnerabilities.
A new framework for estimating return levels using regional frequency analysis
NASA Astrophysics Data System (ADS)
Winter, Hugo; Bernardara, Pietro; Clegg, Georgina
2017-04-01
We propose a new framework for incorporating more spatial and temporal information into the estimation of extreme return levels. Currently, most studies use extreme value models applied to data from a single site, an approach that is statistically inefficient and leads to return level estimates that are less physically realistic. We aim to highlight the benefits that could be obtained by using methodology based upon regional frequency analysis as opposed to classic single-site extreme value analysis. This motivates a shift in thinking, which permits the evaluation of local and regional effects and makes use of the wide variety of data that are now available at high temporal and spatial resolutions. The recent storms over the UK during the winters of 2013-14 and 2015-16, which caused wide-ranging disruption and damaged important infrastructure, provide the main motivation for the current work. One of the most impactful natural hazards is flooding, which is often initiated by extreme precipitation. In this presentation, we focus on extreme rainfall, but shall discuss other meteorological variables alongside potentially damaging hazard combinations. To understand the risks posed by extreme precipitation, we need reliable statistical models which can be used to estimate quantities such as the T-year return level, i.e. the level which is expected to be exceeded once every T years. Extreme value theory provides the main collection of statistical models that can be used to estimate the risks posed by extreme precipitation events. Broadly, at a single site, a statistical model is fitted to exceedances of a high threshold and the model is used to extrapolate to levels beyond the range of the observed data. However, when we have data at many sites over a spatial domain, fitting a separate model for each site makes little sense, and it would be better to incorporate all this information to improve the reliability of return level estimates.
Here, we use the regional frequency analysis approach to define homogeneous regions which are affected by the same storms. Extreme value models are then fitted to the data pooled from across a region. We find that this approach leads to more spatially consistent return level estimates with reduced uncertainty bounds.
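The pool-then-fit idea can be illustrated with a toy example. Everything below is an assumption made for the sketch: synthetic daily rainfall at a few hypothetical sites in one homogeneous region, an index-flood style rescaling before pooling, and scipy's GPD fit to threshold excesses; none of it comes from the study itself.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)
n_sites, n_years = 5, 30
n_days = n_years * 365
site_index = rng.uniform(0.8, 1.2, n_sites)            # site-specific scale
rain = rng.gamma(0.4, 8.0, size=(n_sites, n_days)) * site_index[:, None]

pooled = (rain / site_index[:, None]).ravel()          # normalise, then pool
u = np.quantile(pooled, 0.99)                          # high threshold
excess = pooled[pooled > u] - u

shape, _, scale = genpareto.fit(excess, floc=0)        # GPD fit to excesses
rate = excess.size / (n_sites * n_years)               # exceedances/site-year

def return_level(T, index=1.0):
    # level exceeded on average once every T years at a site with this index
    return index * (u + genpareto.ppf(1 - 1 / (rate * T), shape, scale=scale))

print(f"100-year return level at an average site: {return_level(100):.1f} mm")
```

Pooling multiplies the effective sample size by the number of sites in the region, which is what tightens the uncertainty bounds on the regional return level estimates.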
Hemodialysis Dose and Adequacy
... a patient's Kt/V is extremely low, the measurement should be repeated, unless a reason for the low Kt/V is obvious. Obvious reasons include treatment interruption, problems with blood or solution flow, and a problem in sampling either the pre- ...
Cannabis, motivation, and life satisfaction in an internet sample
Barnwell, Sara Smucker; Earleywine, Mitch; Wilcox, Rand
2006-01-01
Although little evidence supports cannabis-induced amotivational syndrome, sources continue to assert that the drug saps motivation [1], which may guide current prohibitions. A few studies report low motivation in chronic users; another reveals that they have higher subjective wellbeing. To assess differences in motivation and subjective wellbeing, we used a large sample (N = 487) and strict definitions of cannabis use (7 days/week) and abstinence (never). Standard statistical techniques showed no differences. Robust statistical methods controlling for heteroscedasticity, non-normality and extreme values found no differences in motivation but a small difference in subjective wellbeing. Medical users of cannabis reporting health problems tended to account for a significant portion of the subjective wellbeing differences, suggesting that illness decreased wellbeing. All p-values were above p = .05. Thus, daily use of cannabis does not impair motivation. Its impact on subjective wellbeing is small and may actually reflect lower wellbeing due to medical symptoms rather than actual consumption of the plant. PMID:16722561
Quantum centipedes: collective dynamics of interacting quantum walkers
NASA Astrophysics Data System (ADS)
Krapivsky, P. L.; Luck, J. M.; Mallick, K.
2016-08-01
We consider the quantum centipede made of N fermionic quantum walkers on the one-dimensional lattice interacting by means of the simplest of all hard-bound constraints: the distance between two consecutive fermions is either one or two lattice spacings. This composite quantum walker spreads ballistically, just as the simple quantum walk. However, because of the interactions between the internal degrees of freedom, the distribution of its center-of-mass velocity displays numerous ballistic fronts in the long-time limit, corresponding to singularities in the empirical velocity distribution. The spectrum of the centipede and the corresponding group velocities are analyzed by direct means for the first few values of N. Some analytical results are obtained for arbitrary N by exploiting an exact mapping of the problem onto a free-fermion system. We thus derive the maximal velocity describing the ballistic spreading of the two extremal fronts of the centipede wavefunction, including its non-trivial value in the large-N limit.
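For contrast with the N-walker centipede, the ballistic spreading of the simple single-walker quantum walk mentioned above can be reproduced in a few lines. This is a generic textbook Hadamard-walk sketch, not the authors' fermionic model: the position spread grows linearly in time, unlike the sqrt(t) growth of a classical random walk.

```python
import numpy as np

T, L = 100, 101                            # time steps; half-width of lattice
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard coin

psi = np.zeros((2 * L + 1, 2), dtype=complex)  # psi[site, coin]
psi[L] = np.array([1, 1j]) / np.sqrt(2)        # symmetric initial coin state

for _ in range(T):
    psi = psi @ H.T                        # coin toss at every site
    shifted = np.zeros_like(psi)
    shifted[1:, 0] = psi[:-1, 0]           # coin state 0 steps right
    shifted[:-1, 1] = psi[1:, 1]           # coin state 1 steps left
    psi = shifted

x = np.arange(-L, L + 1)
prob = (np.abs(psi) ** 2).sum(axis=1)
std = np.sqrt((prob * x ** 2).sum() - (prob * x).sum() ** 2)
print(f"position spread after {T} steps: std = {std:.1f} (grows linearly in T)")
```

The centipede's distinguishing feature is that interactions between the internal degrees of freedom split this single ballistic front into many, one per group velocity in the spectrum.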
Shaw, W S; Feuerstein, M; Lincoln, A E; Miller, V I; Wood, P M
2001-08-01
A case manager's ability to obtain worksite accommodations and engage workers in active problem solving may improve health and return to work outcomes for clients with work related upper extremity disorders (WRUEDs). This study examines the feasibility of a 2 day training seminar to help nurse case managers identify ergonomic risk factors, provide accommodation, and conduct problem solving skills training with workers' compensation claimants recovering from WRUEDs. Eight procedural steps to this case management approach were identified, translated into a training workshop format, and conveyed to 65 randomly selected case managers. Results indicate moderate to high self ratings of confidence to perform ergonomic assessments (mean = 7.5 of 10) and to provide problem solving skills training (mean = 7.2 of 10) after the seminar. This training format was suitable to experienced case managers and generated a moderate to high level of confidence to use this case management approach.
Droughts and governance impacts on water scarcity: an analysis in the Brazilian semi-arid
NASA Astrophysics Data System (ADS)
Silva, A. C. S.; Galvão, C. O.; Silva, G. N. S.
2015-06-01
Extreme events are part of climate variability. Dealing with variability is still a challenge, one that might grow due to climate change. However, the impacts of extreme events depend not only on their variability but also on management and governance. Brazil's semi-arid region has been vulnerable to extreme events, especially droughts, for centuries. Indeed, other Brazilian regions that have mostly been concerned with floods are currently also experiencing droughts. This article evaluates how a combination of climate variability and water governance might affect water scarcity and increase the impacts of extreme events on some regions. For this evaluation, Ostrom's framework for analyzing social-ecological systems (SES) was applied. Ostrom's framework is useful for understanding interactions between resource systems, governance systems and resource users. This study focuses on social-ecological systems located in a drought-prone region of Brazil. Two extreme events were selected, one in 1997-2000, when Brazil's new water policy was very young, and the other in 2012-2015. The analysis of the SES considering Ostrom's principle of "clearly defined boundaries" showed that deficiencies in water management intensify the impacts of droughts on water users. The reasons are related more to water management and governance problems than to drought magnitude or climate change. This problem holds up advances in dealing with extreme events.
Violent Extremism, Community-Based Violence Prevention, and Mental Health Professionals.
Weine, Stevan M; Stone, Andrew; Saeed, Aliya; Shanfield, Stephen; Beahrs, John; Gutman, Alisa; Mihajlovic, Aida
2017-01-01
New community-based initiatives being developed to address violent extremism in the United States are utilizing mental health services and leadership. This article reviews current approaches to preventing violent extremism, the contribution that mental illness and psychosocial problems can make to violent extremism, and the rationale for integrating mental health strategies into preventing violent extremism. The authors describe a community-based targeted violence prevention model and the potential roles of mental health professionals. This model consists of a multidisciplinary team that assesses at-risk individuals with comprehensive threat and behavioral evaluations, arranges for ongoing support and treatment, conducts follow-up evaluations, and offers outreach, education, and resources for communities. This model would enable mental health professionals in local communities to play key roles in preventing violent extremism through their practice and leadership.
Shaw, Daniel S.; Sitnick, Stephanie L.; Brennan, Lauretta M.; Choe, Daniel E.; Dishion, Thomas J.; Wilson, Melvin N.; Gardner, Frances
2016-01-01
Several studies suggest that neighborhood deprivation is a unique risk factor in child and adolescent development of problem behavior. We sought to examine whether previously established intervention effects of the Family Check-Up (FCU) on child conduct problems at age 7.5 would persist through age 9.5, and whether neighborhood deprivation would moderate these effects. In addition, we examined whether improvements in parent-child interaction during early childhood associated with the FCU would be related to later reductions in child aggression among families living in the highest-risk neighborhoods. Using a multisite cohort of at-risk children identified on the basis of family, child, and socioeconomic risk and randomly assigned to the FCU, intervention effects were found to be moderated by neighborhood deprivation, such that they were only directly present for those living at moderate versus extreme levels of neighborhood deprivation. Additionally, improvements in child aggression were evident for children living in extreme neighborhood deprivation when parents improved the quality of their parent-child interaction during the toddler period (i.e., moderated mediation). Implications of the findings are discussed in relation to the possibilities and possible limitations in prevention of early problem behavior for those children living in extreme and moderate levels of poverty. PMID:26646197
Research in Stochastic Processes
1988-10-10
To appear in Proceedings Volume, Oberwolfach Conf. on Extreme Value Theory, Ed. J. Hüsler and R. Reiss, Springer. 4. M.R. Leadbetter, The exceedance... Hsing, J. Hüsler and M.R. Leadbetter, On the exceedance point process for a stationary sequence, Probability Theor. Rel. Fields, 20, 1988, 97-112. Z.J... Oberwolfach Conf. on Extreme Value Theory, J. Hüsler and R. Reiss, eds., Springer, to appear. V. Mandrekar, On a limit theorem and invariance
2009-03-01
transition fatigue regimes; however, microplasticity (i.e., heterogeneous plasticity at the scale of microstructure) is relevant to understanding fatigue... and Socie [57] considered the effect of microplastic... Microstructure-Sensitive Extreme Value Probabilities for High Cycle Fatigue of Ni-Base... considers the local stress state as affected by intergranular interactions and microplasticity. For the calculations given below, the volumes over which
NASA Astrophysics Data System (ADS)
Hasan, Husna; Salam, Norfatin; Kassim, Suraiya
2013-04-01
Extreme temperatures at several stations in Malaysia are modeled by fitting the annual maxima to the generalized extreme value (GEV) distribution. The augmented Dickey-Fuller (ADF) and Phillips-Perron (PP) tests are used to detect stochastic trends among the stations. The Mann-Kendall (MK) test suggests a non-stationary model. Three models are considered for stations with a trend, and the likelihood ratio test is used to determine the best-fitting model. The results show that the Subang and Bayan Lepas stations favour a model that is linear in the location parameter, while the Kota Kinabalu and Sibu stations suit a model with a trend in the logarithm of the scale parameter. The return level, i.e. the level of events (maximum temperature) expected to be exceeded once, on average, in a given number of years, is then obtained.
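The comparison between a stationary GEV and one whose location parameter varies linearly in time can be sketched via maximum likelihood and a likelihood ratio test. The data, starting values, and the log-scale parametrisation below are all assumptions made for the illustration, not the study's data.

```python
import numpy as np
from scipy.stats import genextreme
from scipy.optimize import minimize

# Synthetic annual-maximum temperatures with a weak linear warming trend
rng = np.random.default_rng(1)
t = np.arange(40.0)
x = genextreme.rvs(c=0.1, loc=30 + 0.03 * t, scale=1.2, random_state=rng)

# Negative log-likelihoods; scale is parametrised as exp(ls) to stay
# positive, and scipy's shape c corresponds to -xi in the usual convention.
def nll_stationary(p):
    mu, ls, xi = p
    return -genextreme.logpdf(x, -xi, loc=mu, scale=np.exp(ls)).sum()

def nll_trend(p):
    mu0, mu1, ls, xi = p
    return -genextreme.logpdf(x, -xi, loc=mu0 + mu1 * t,
                              scale=np.exp(ls)).sum()

r0 = minimize(nll_stationary, [x.mean(), np.log(x.std()), 0.0],
              method="Nelder-Mead")
# warm-start the trend model from the stationary fit (mu1 = 0)
r1 = minimize(nll_trend, [r0.x[0], 0.0, r0.x[1], r0.x[2]],
              method="Nelder-Mead")

# Deviance statistic, compared with the chi-squared(1) 5% critical value
D = 2 * (r0.fun - r1.fun)
print(f"deviance D = {D:.2f}; trend model preferred at 5% level: {D > 3.84}")
```

Because the stationary model is nested in the trend model (mu1 = 0), the deviance D is compared against a chi-squared distribution with one degree of freedom, which is the likelihood ratio test the abstract describes.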
Spatial extreme value analysis to project extremes of large-scale indicators for severe weather
Gilleland, Eric; Brown, Barbara G; Ammann, Caspar M
2013-01-01
Concurrently high values of the maximum potential wind speed of updrafts (Wmax) and 0–6 km wind shear (Shear) have been found to represent conducive environments for severe weather, which subsequently provides a way to study severe weather in future climates. Here, we employ a model for the product of these variables (WmSh) from the National Center for Atmospheric Research/United States National Centers for Environmental Prediction reanalysis over North America, conditioned on their having extreme energy in the spatial field, in order to project the predominant spatial patterns of WmSh. The approach is based on the Heffernan and Tawn conditional extreme value model. Results suggest that this technique estimates the spatial behavior of WmSh well, which allows for exploring possible changes in the patterns over time. While the model enables a method for inferring the uncertainty in the patterns, such analysis is difficult with the currently available inference approach. A variation of the method is also explored to investigate how this type of model might be used to qualitatively understand how the spatial patterns of WmSh correspond to extreme river flow events. A case study of river flows from three rivers in northwestern Tennessee is presented, and it is found that during such extreme events advection of WmSh from the Gulf of Mexico prevails, while elsewhere WmSh is generally very low. © 2013 The Authors. Environmetrics published by John Wiley & Sons, Ltd. PMID:24223482
The perceived problem-solving ability of nurse managers.
Terzioglu, Fusun
2006-07-01
The development of a problem-solving approach to nursing has been one of the more important changes in nursing during the last decade. Nurse managers need effective problem-solving and management skills to decrease the cost of health care and to increase the quality of care. This descriptive study was conducted to determine the perceived problem-solving ability of nurse managers. From a population of 87 nurse managers, 71 were selected using the stratified random sampling method; 62 nurse managers agreed to participate. Data were collected through a questionnaire including demographic information and a problem-solving inventory. The problem-solving inventory was developed by Heppner and Petersen in 1982, and validity and reliability studies were done. It was adapted to Turkish by Sahin et al (1993). The data were analyzed with SPSS 10.0, using percentages, mean values, one-way ANOVA and the independent-samples t-test. Most of the nurses had 11 or more years of working experience (71%) and worked as charge nurses in the clinics. It was determined that 69.4% of the nurse managers did not have any educational training in administration. The most frequently encountered problems were issues related to management (30.6%) and professional staff (25.8%). Nurse managers who had received education about management, followed scientific publications and scientific meetings, and had followed management models perceived their problem-solving skills as more adequate than the others (P > 0.05). In this study, it was determined that nurses do not perceive that they have problem-solving skills at the desired level. In this context, it is extremely important that this subject be given an important place in both the nursing education curriculum and continuing education programmes.
Uncertainty estimation of water levels for the Mitch flood event in Tegucigalpa
NASA Astrophysics Data System (ADS)
Fuentes Andino, D. C.; Halldin, S.; Lundin, L.; Xu, C.
2012-12-01
Hurricane Mitch in 1998 left a devastating flood in Tegucigalpa, the capital city of Honduras. Simulation of elevated water surfaces provides a good way to understand the hydraulic mechanism of large flood events. In this study the one-dimensional HEC-RAS model for steady flow conditions, together with the two-dimensional Lisflood-fp model, was used to estimate the water levels for the Mitch event in the river reaches at Tegucigalpa. Parameter uncertainty of the models was investigated using the generalized likelihood uncertainty estimation (GLUE) framework. Because of the extremely large magnitude of the Mitch flood, no hydrometric measurements were taken during the event. However, post-event indirect measurements of discharge and observed water levels were obtained in previous work by JICA and the USGS. To overcome the lack of direct hydrometric measurement data, uncertainty in the discharge was estimated. Both models could well constrain the value of channel roughness, though more dispersion resulted for the floodplain value. Analysis of the data interaction showed a tradeoff between discharge at the outlet and floodplain roughness for the 1D model. The estimated discharge range at the outlet of the study area encompassed the value indirectly estimated by JICA; the indirect method used by the USGS, however, overestimated the value. If behavioral parameter sets can reproduce water surface levels well for past events such as Mitch, more reliable predictions for future events can be expected. The results acquired in this research will provide guidelines for dealing with the problem of modeling past floods when no direct data were measured during the event, and for predicting future large events while taking uncertainty into account. The obtained range of the uncertain flood extent will be an outcome useful for decision makers.
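The GLUE procedure can be illustrated with a toy stage-discharge relation standing in for HEC-RAS/Lisflood-fp. The model form, parameter ranges, and the informal likelihood measure below are all assumptions made purely for the sketch.

```python
import numpy as np

rng = np.random.default_rng(7)
a = np.array([1.0, 0.9, 1.1])              # fixed cross-section factors

def toy_model(q, n):
    # hypothetical water levels at three sections for discharge q, roughness n
    return a * (q * n) ** 0.6

observed = toy_model(1400.0, 0.035)        # synthetic "observed" levels

n_samples = 5000
q_s = rng.uniform(500.0, 2500.0, n_samples)     # uncertain peak discharge
n_s = rng.uniform(0.01, 0.08, n_samples)        # uncertain channel roughness
sims = np.array([toy_model(q, n) for q, n in zip(q_s, n_s)])

# Informal likelihood: inverse sum of squared errors; keep the top 5 percent
# of parameter sets as "behavioral", then report the discharge they span.
sse = ((sims - observed) ** 2).sum(axis=1)
likelihood = 1.0 / sse
behavioral = likelihood > np.quantile(likelihood, 0.95)

print(f"behavioral sets: {behavioral.sum()} of {n_samples}")
print(f"behavioral discharge range: "
      f"{q_s[behavioral].min():.0f}-{q_s[behavioral].max():.0f} m3/s")
```

Because the simulated level depends on the product of discharge and roughness, many (q, n) pairs fit the observations equally well; this equifinality is exactly the discharge-roughness tradeoff reported for the 1D model, and it is why GLUE reports an ensemble of behavioral parameter sets rather than a single optimum.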
Surface atmospheric extremes (Launch and transportation areas)
NASA Technical Reports Server (NTRS)
1972-01-01
The effects of extreme values of surface and low altitude atmospheric parameters on space vehicle design, tests, and operations are discussed. Atmospheric extremes from the surface to 150 meters for geographic locations of interest to NASA are given. Thermal parameters (temperature and solar radiation), humidity, pressure, and atmospheric electricity (lightning and static) are presented. Weather charts and tables are included.
Cooperative Efforts in Fuels Management
Gerald L. Adams
1995-01-01
Our forests have been neglected or protected to death, creating an extreme wildfire risk in wildland urban intermix communities. We as agencies and organizations are just now beginning to understand that the fuel problems we have across the western states are not a single agency problem, but "our problem." Wildfires do not respect boundaries, be they...
Fevang, Silje Katrine Elgen; Hysing, Mari; Sommerfelt, Kristian; Elgen, Irene
2017-12-01
The aims were to investigate mental health problems with the Strength and Difficulties Questionnaire (SDQ) in children born extremely preterm/extremely low birth weight (EP/ELBW) without severe disabilities compared to controls, and to identify peri-, or neonatal factors possibly predicting later mental health problems. A national Norwegian cohort of 11-year-old EP/ELBW children, excluding those with intellectual disabilities, non-ambulatory cerebral palsy, blindness and/or deafness, was assessed. Parents and teachers completed the SDQ. Mean scores and scores ≥90th percentile for the control group, combined (parent and/or teacher reporting the child ≥90th percentile), and pervasive ratings (both parent and teacher reporting the child ≥90th percentile) were presented. The controls consisted of an unselected population of all 11-year-old children born in 1995 who attended public or private schools in Bergen. Of the eligible children, 216 (64%) EP/ELBW and 1882 (61%) control children participated. The EP/ELBW children had significantly higher scores and/or increased risk of parent, teacher, combined, and pervasive rated hyperactivity/inattention, emotional-, and peer problems (OR 2.1-6.3). Only parents reported the EP/ELBW children to be at an increased risk of conduct problems (OR 1.6, 95% CI 1.1-2.6). Only low maternal education at birth was significantly associated with mental health problems at 11 years of age (OR 2.5, 95% CI 1.2-5.4). EP/ELBW children without severe disabilities had increased risk of symptoms of hyperactivity/inattention, emotional-, and peer problems. None of the peri- or neonatal factors were significantly associated with later mental health problems, except for low maternal education.
Extreme Events: low and high total ozone over Arosa, Switzerland
NASA Astrophysics Data System (ADS)
Rieder, H. E.; Staehelin, J.; Maeder, J. A.; Ribatet, M.; Stübi, R.; Weihs, P.; Holawe, F.; Peter, T.; Davison, A. C.
2009-04-01
The frequency distribution of days with extremely low (termed ELOs) and high (termed EHOs) total ozone is analyzed for the world's longest total ozone record (Arosa, Switzerland; for details see Staehelin et al., 1998a,b), with new tools from extreme value theory (e.g. Coles, 2001; Ribatet, 2007). A heavy-tail focused approach is used through the fitting of the Generalized Pareto Distribution (GPD) to the Arosa time series. Asymptotic arguments (Pickands, 1975) justify the use of the GPD for modeling exceedances over a high (or below a low) enough threshold (Coles, 2001). The analysis shows that the GPD is appropriate for modeling the frequency distribution in total ozone above or below a mathematically well-defined threshold. While previous studies focused on so-termed ozone mini-holes and mini-highs (e.g. Bojkov and Balis, 2001; Koch et al., 2005), this study is the first to present a mathematical description of extreme events in low and high total ozone for a northern mid-latitude site (Rieder et al., 2009). The results show (a) an increase in days with extremely low (ELOs) and (b) a decrease in days with extremely high total ozone (EHOs) during the last decades, (c) that the general trend in total ozone is strongly determined by these extreme events, and (d) that fitting the GPD is an appropriate method for estimating the frequency distribution of so-called ozone mini-holes. Furthermore, this concept allows one to separate the effect of Arctic ozone depletion from that of in situ mid-latitude ozone loss. As shown by this study, ELOs and EHOs have a strong influence on mean values in total ozone, and the "extremes concept" could further be used for validation of Chemistry-Climate Models (CCMs) within the scientific community. References:
Bojkov, R. D., and Balis, D. S.: Characteristics of episodes with extremely low ozone values in the northern middle latitudes 1975-2000, Ann. Geophys., 19, 797-807, 2001.
Coles, S.: An Introduction to Statistical Modeling of Extreme Values, Springer Series in Statistics, ISBN 1852334592, Springer, Berlin, 2001.
Koch, G., Wernli, H., Schwierz, C., Staehelin, J., and Peter, T.: A composite study on the structure and formation of ozone miniholes and minihighs over central Europe, Geophys. Res. Lett., 32, L12810, doi:10.1029/2004GL022062, 2005.
Pickands, J.: Statistical inference using extreme order statistics, Ann. Stat., 3(1), 119-131, 1975.
Ribatet, M.: POT: Modelling peaks over a threshold, R News, 7, 34-36, 2007.
Rieder, H. E., Staehelin, J., Maeder, J. A., Ribatet, M., Stübi, R., Weihs, P., Holawe, F., Peter, T., and Davison, A. C.: From ozone mini holes and mini highs towards extreme value theory: New insights from extreme events and non-stationarity, submitted to J. Geophys. Res., 2009.
Staehelin, J., Kegel, R., and Harris, N. R.: Trend analysis of the homogenized total ozone series of Arosa (Switzerland), 1929-1996, J. Geophys. Res., 103(D7), 8389-8400, doi:10.1029/97JD03650, 1998a.
Staehelin, J., Renaud, A., Bader, J., McPeters, R., Viatte, P., Hoegger, B., Bugnion, V., Giroud, M., and Schill, H.: Total ozone series at Arosa (Switzerland): Homogenization and data comparison, J. Geophys. Res., 103(D5), 5827-5842, doi:10.1029/97JD02402, 1998b.
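The peaks-over-threshold approach with a GPD fit, as applied to the Arosa record above, can be sketched with scipy on synthetic data; the Gaussian "total ozone" series and the threshold choice here are illustrative, not the Arosa data.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)

# Synthetic daily "total ozone" values (illustrative, not Arosa data).
data = rng.normal(loc=330.0, scale=30.0, size=20_000)

# Peaks-over-threshold: model exceedances above a high quantile with a GPD.
threshold = np.quantile(data, 0.98)
exceedances = data[data > threshold] - threshold

# Fit the GPD to the exceedances (location fixed at 0).
shape, loc, scale = genpareto.fit(exceedances, floc=0.0)
print(f"threshold={threshold:.1f}, shape={shape:.3f}, scale={scale:.2f}")

# Return level: value exceeded on average once per 1000 observations.
p_exceed = exceedances.size / data.size
m = 1000
return_level = threshold + (scale / shape) * ((m * p_exceed) ** shape - 1.0)
print(f"1000-observation return level: {return_level:.1f}")
```

The same fit below a low threshold (on the negated series) gives the ELO side; the fitted shape parameter determines how heavy the tail is and hence how far the return level sits above the threshold.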
Making Energy-Water Nexus Scenarios more Fit-for-Purpose through Better Characterization of Extremes
NASA Astrophysics Data System (ADS)
Yetman, G.; Levy, M. A.; Chen, R. S.; Schnarr, E.
2017-12-01
Quantitative scenarios of future trends often exhibit less variability than the historic data upon which the models that generate them are based. The problem of dampened variability, which typically also entails dampened extremes, manifests both temporally and spatially. As a result, risk assessments that rely on such scenarios are in danger of producing misleading results. This danger is pronounced in nexus issues because of the multiple dimensions of change that are relevant. We illustrate this problem by developing alternative joint distributions of the probability of drought and of human population totals across U.S. counties over the period 2010-2030. For the dampened-extremes case we use drought frequencies derived from climate models used in the U.S. National Climate Assessment and the Environmental Protection Agency's population and land use projections contained in its Integrated Climate and Land Use Scenarios (ICLUS). For the elevated-extremes case we use an alternative spatial drought frequency estimate based on tree-ring data covering a 555-year period (Ho et al., 2017), and we introduce greater temporal and spatial extremes in the ICLUS socioeconomic projections so that they conform to observed extremes in the historical U.S. spatial census data from 1790 to the present (National Historical Geographic Information System). We use spatial and temporal coincidence of high population and extreme drought as a proxy for energy-water nexus risk. We compare the representation of risk in the dampened-extremes and elevated-extremes scenario analyses, identify areas of the country where using more realistic portrayals of extremes makes the biggest difference in estimated risk, and suggest implications for future risk assessments. References: Ho, M., Lall, U., Sun, X., and Cook, E. R.: Multiscale temporal variability and regional patterns in 555 years of conterminous U.S. streamflow, Water Resources Research, doi:10.1002/2016WR019632, 2017.
Dynamics of Nuclear Regions of Galaxies
NASA Technical Reports Server (NTRS)
Miller, Richard H.
1996-01-01
Current research, carried out with the help of the ASEE-NASA Summer Faculty Program at NASA-Ames, is concentrated on the dynamics of nuclear regions of galaxies. From a dynamical point of view a galaxy is a collection of around 10(sup 11) stars like our Sun, each of which moves in the summed gravitational field of all the remaining stars. Thus galaxy dynamics becomes a self-consistent n-body problem with forces given by Newtonian gravitation. Strong nonlinearity in the gravitational force and the inherent nonlinearity of self-consistent problems both argue for a numerical approach. The technique of numerical experiments consists of constructing an environment in the computer that is as close as possible to the physical conditions in a real galaxy, and then carrying out experiments in this environment much like laboratory experiments in physics or engineering. Computationally, an experiment is an initial value problem, and a good deal of thought and effort goes into the design of the starting conditions that serve as initial values. Experiments are run at Ames because all the 'equipment' is in place: the programs, the necessary computational power, and good facilities for post-run analysis. Our goal for this research program is to study the nuclear regions in detail, and this means replacing most of the galaxy by a suitable boundary condition to allow the full capability of numerical experiments to be brought to bear on a small region, perhaps 1/1000 of the linear dimensions of an entire galaxy. This is an extremely delicate numerical problem, one in which some small overlooked feature can easily lead to a collapse or blow-up of the entire system. All particles attract each other in gravitational problems, and the 1/r(sup 2) force is (1) nonlinear, (2) strong at short range, (3) long-range, and (4) unscreened at any distance.
NASA Astrophysics Data System (ADS)
Otto, F. E. L.; Mitchell, D.; Sippel, S.; Black, M. T.; Dittus, A. J.; Harrington, L. J.; Mohd Saleh, N. H.
2014-12-01
A shift in the distribution of socially-relevant climate variables, such as daily minimum winter temperatures and daily precipitation extremes, has been attributed to anthropogenic climate change for various mid-latitude regions. However, while there are many process-based arguments also suggesting a change in the shape of these distributions, attribution studies demonstrating this have not yet been undertaken. Here we use a very large initial-condition ensemble of ~40,000 members simulating the European winter 2013/2014 using the distributed computing infrastructure of the weather@home project. Two separate scenarios are used: (1) current climate conditions, and (2) a counterfactual scenario of a "world that might have been" without anthropogenic forcing. Specifically focusing on extreme events, we assess how the estimated parameters of the Generalized Extreme Value (GEV) distribution vary depending on variable type, sampling frequency (daily, monthly, ...) and geographical region. We find that the location parameter changes for most variables but, depending on the region and variable, we also find significant changes in the scale and shape parameters. The very large ensemble furthermore allows us to assess whether such findings in the fitted GEV distributions are consistent with an empirical analysis of the model data, and whether the most extreme data still follow a known underlying distribution that in a small sample might otherwise be thought of as an outlier. The ~40,000-member ensemble is simulated using 12 different SST patterns (1 'observed', and 11 best guesses of SSTs with no anthropogenic warming). The range in SSTs, along with the corresponding changes in the NAO and high-latitude blocking, informs on the dynamics governing some of these extreme events.
While strong tele-connection patterns are not found in this particular experiment, the high number of simulated extreme events allows for a more thorough analysis of the dynamics than has been performed before. Combining extreme value theory with very large ensemble simulations therefore allows us to understand the dynamics of changes in extreme events in a way that extreme value theory alone cannot, and also shows in which cases statistics combined with smaller ensembles yield results as valid as those from very large initial-condition ensembles.
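Fitting a GEV distribution to block maxima and tracking how its parameters respond to a change in climate, as described above, can be sketched with scipy; the synthetic "winter temperature" field and the imposed 1.5-degree shift below are purely illustrative.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(1)

# Synthetic daily winter field: 60 winters x 90 days (illustrative numbers).
daily = rng.normal(loc=0.0, scale=4.0, size=(60, 90))
block_maxima = daily.max(axis=1)   # one seasonal maximum per winter

# Fit the GEV; note scipy's shape-sign convention (c = -xi).
c, loc, scale = genextreme.fit(block_maxima)
print(f"location={loc:.2f}, scale={scale:.2f}, shape(c)={c:.3f}")

# A uniformly warmed "counterfactual" climate: the fitted location
# parameter should track the shift while scale and shape stay similar.
c2, loc2, scale2 = genextreme.fit(block_maxima + 1.5)
print(f"location shift recovered: {loc2 - loc:.2f}")
```

With a very large ensemble one can fit the GEV per scenario and region and then test, as the abstract describes, whether changes appear only in the location parameter or also in scale and shape.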
DOT National Transportation Integrated Search
1965-08-01
The presence of unstable soils in many areas of Louisiana results in numerous problems in design and construction in these areas. These problem soils are primarily of two categories, the first of which consists of the high clay contents and extreme p...
Bidirectional extreme learning machine for regression problem and its learning effectiveness.
Yang, Yimin; Wang, Yaonan; Yuan, Xiaofang
2012-09-01
The learning speed and effectiveness of neural networks are in general far below what many applications require, which has been a major bottleneck. Recently, a simple and efficient learning method, referred to as the extreme learning machine (ELM), was proposed by Huang, which has shown that, compared to some conventional methods, the training time of neural networks can be reduced by a thousand times. However, one of the open problems in ELM research is whether the number of hidden nodes can be further reduced without affecting learning effectiveness. This brief proposes a new learning algorithm, called the bidirectional extreme learning machine (B-ELM), in which some hidden nodes are not randomly selected. In theory, this algorithm tends to reduce the network output error to 0 at an extremely early learning stage. Furthermore, we find a relationship between the network output error and the network output weights in the proposed B-ELM. Simulation results demonstrate that the proposed method can be tens to hundreds of times faster than other incremental ELM algorithms.
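The standard ELM recipe that B-ELM builds on is short enough to sketch directly: hidden-layer weights are drawn at random and never trained, and only the output weights are solved in a single least-squares step. The network size and toy regression target below are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(7)

def elm_train(X, y, n_hidden=50):
    """Basic extreme learning machine for regression: random input
    weights, least-squares output weights (a minimal sketch)."""
    W = rng.normal(size=(X.shape[1], n_hidden))   # random, never trained
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)                        # hidden-layer activations
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # output weights, one solve
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy regression target.
X = rng.uniform(-1, 1, size=(500, 1))
y = np.sin(3 * X[:, 0]) + 0.05 * rng.normal(size=500)

W, b, beta = elm_train(X, y)
pred = elm_predict(X, W, b, beta)
rmse = np.sqrt(np.mean((pred - y) ** 2))
print(f"training RMSE: {rmse:.3f}")
```

Because the only "training" is one linear solve, fitting is extremely fast; B-ELM's contribution is to choose some hidden nodes non-randomly so that fewer of them are needed for the same error.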
NASA Astrophysics Data System (ADS)
Reinstorf, F.; Kramer, S.; Koch, T.; Pfützner, B.
2017-12-01
Extreme weather conditions during the years 2009-2011, in combination with changes in the regional water management, led to maximum groundwater levels in large areas of Germany in 2011. This resulted in extensive waterlogging, especially in urban areas near rivers, where it caused huge problems for buildings and infrastructure. The acute situation still exists in many areas and requires the development of solution concepts. Taking the example of the Elbe-Saale region in the Federal State of Saxony-Anhalt, where a pilot research project was carried out, the analytical situation, the development of a management tool and the implementation of a groundwater management concept are shown. The central tool is a coupled water budget - groundwater flow model. In combination with sophisticated multi-scale parameter estimation, a high-resolution groundwater level simulation was carried out. A decision support process with intensive stakeholder interaction, combined with high-resolution simulations, enables the development of a management concept for extreme groundwater situations that takes account of sustainable and environmentally sound solutions, mainly on the basis of passive measures.
[Outlier cases in surgical disciplines. Micro-economic and macro-economic problems].
Tecklenburg, A; Liebeneiner, J; Schaefer, O
2009-09-01
Postoperative complications will always occur, and their negative impact puts strain on patients, relatives and the attending physicians. The conversion to a remuneration system based on flat rates (diagnosis-related groups) presents additional economic problems for hospitals in some resource-intensive treatments. This particularly pertains to extremely cost-intensive cases, often surgical procedures, in which costs exceed revenue by a factor of 2. Here the economic risk increases with the number of interventions performed. Despite improvements in the remuneration system this problem persists. An improved payment for these treatments is desirable. To achieve this it is necessary to systematically analyze the extremely cost-intensive cases with experts from different medical disciplines in order to create a data basis for a proposal of a cost-covering payment.
Neural architecture design based on extreme learning machine.
Bueno-Crespo, Andrés; García-Laencina, Pedro J; Sancho-Gómez, José-Luis
2013-12-01
Selection of the optimal neural architecture to solve a pattern classification problem entails choosing the relevant input units, the number of hidden neurons and their corresponding interconnection weights. This problem has been widely studied in many research works, but the proposed solutions usually involve excessive computational cost and do not provide a unique answer. This paper proposes a new technique to efficiently design the MultiLayer Perceptron (MLP) architecture for classification using the Extreme Learning Machine (ELM) algorithm. The proposed method provides high generalization capability and a unique solution for the architecture design. Moreover, the selected final network only retains those input connections that are relevant for the classification task. Experimental results show these advantages. Copyright © 2013 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Bárdossy, András; Pegram, Geoffrey
2017-01-01
The use of radar measurements for the space-time estimation of precipitation has for many decades been a central topic in hydro-meteorology. In this paper we are interested specifically in daily and sub-daily extreme values of precipitation at gauged or ungauged locations, which are important for design. The purpose of the paper is to develop a methodology to combine daily precipitation observations and radar measurements to estimate sub-daily extremes at point locations. Radar data corrected using precipitation-reflectivity relationships lead to biased estimations of extremes. Different possibilities of correcting systematic errors using the daily observations are investigated. Observed gauged daily amounts are interpolated to unsampled points and subsequently disaggregated using the sub-daily values obtained by the radar. Different corrections based on the spatial variability and the sub-daily entropy of scaled rainfall distributions are used to provide unbiased corrections of short-duration extremes. Additionally, a statistical procedure not based on a day-by-day matching correction is tested. In this last procedure, as we are only interested in rare extremes, low to medium values of rainfall depth were neglected, leaving a small number of L days of ranked daily maxima in each set per year, whose sum typically comprises about 50% of each annual rainfall total. The sum of these L day maxima is first interpolated using a Kriging procedure. Subsequently this sum is disaggregated to daily values using a nearest-neighbour procedure. The daily sums are then disaggregated by using the relative values of the biggest L radar-based days. Of course, the timings of radar and gauge maxima can be different, so the method presented here uses radar for disaggregating daily gauge totals down to 15 min intervals in order to extract the maxima of sub-hourly through to daily rainfall.
The methodologies were tested in South Africa, where an S-band radar operated relatively continuously at Bethlehem from 1998 to 2003, whose scan at 1.5 km above ground [CAPPI] overlapped a dense (10 km spacing) set of 45 pluviometers recording in the same 6-year period. This valuable set of data was obtained from each of 37 selected radar pixels [1 km square in plan] which contained a pluviometer not masked out by the radar footprint. The pluviometer data were also aggregated to daily totals for the same purpose. The extremes obtained using the disaggregation methods were compared to the observed extremes in a cross-validation procedure. The unusual and novel goal was not to reproduce the precipitation matching in space and time, but to obtain frequency distributions of the point extremes, which we found to be stable.
Pituitary, gonadal and adrenal hormones after prolonged residence at extreme altitude in man.
Basu, M; Pal, K; Prasad, R; Malhotra, A S; Rao, K S; Sawhney, R C
1997-06-01
High altitude-induced alterations in pituitary, gonadal and adrenal hormones were studied in (i) eugonadal men from the armed forces who were resident at sea level (SL), (ii) SL residents staying at an altitude of 3542 m for periods ranging from 3 to 12 months (acclimatized lowlanders, ALL), (iii) ALL who stayed at 6300 m for 6 months, (iv) ALL who trekked from 3542 to 5080 m and stayed at an altitude of more than 6300 m in the glacier region for 6 months, and (v) high-altitude natives (HAN) resident at an altitude of 3300-3700 m. Circulating levels of LH, FSH, prolactin, cortisol, testosterone, dihydrotestosterone (DHT) and progesterone in ALL at 3542 m and in HAN were not significantly different (p > 0.05) from the SL control values. When the ALL living at 3542 m trekked to an extreme altitude of 5080 m, their testosterone levels showed a significant decrease (p < 0.01) compared to the preceding altitude values but had returned to SL values when measured after 6 months' continuous stay at 6300 m. As with testosterone, the levels of DHT and oestradiol-17 beta (E2) after prolonged stay at extreme altitude were also not significantly different (p > 0.05) from the SL values. The LH levels after trekking to 5080 m were significantly higher (p < 0.01) than at an altitude of 3542 m, but decreased to levels found at 3542 m or SL after prolonged residence at extreme altitude. Plasma levels of ACTH, prolactin, FSH and cortisol on arrival at 5080 m, and after a 6-month stay at extreme altitude, were not significantly different (p > 0.05) from the SL values. Plasma progesterone levels tended to increase on arrival at 5080 m but a significant increase (p < 0.001) was evident only after a 6-month stay at extreme altitude. These observations suggest that prolonged residence at lower as well as at extreme altitude does not appreciably alter blood levels of pituitary, gonadal or adrenal hormones except for plasma levels of progesterone. 
The exact mechanism and significance of this increase remains unknown, but may be important in increasing the sensitivity of the hypoxic ventilatory response and activation of haemoglobin synthesis.
A variational approach to probing extreme events in turbulent dynamical systems
Farazmand, Mohammad; Sapsis, Themistoklis P.
2017-01-01
Extreme events are ubiquitous in a wide range of dynamical systems, including turbulent fluid flows, nonlinear waves, large-scale networks, and biological systems. We propose a variational framework for probing conditions that trigger intermittent extreme events in high-dimensional nonlinear dynamical systems. We seek the triggers as the probabilistically feasible solutions of an appropriately constrained optimization problem, where the function to be maximized is a system observable exhibiting intermittent extreme bursts. The constraints are imposed to ensure the physical admissibility of the optimal solutions, that is, significant probability for their occurrence under the natural flow of the dynamical system. We apply the method to a body-forced incompressible Navier-Stokes equation, known as the Kolmogorov flow. We find that the intermittent bursts of the energy dissipation are independent of the external forcing and are instead caused by the spontaneous transfer of energy from large scales to the mean flow via nonlinear triad interactions. The global maximizer of the corresponding variational problem identifies the responsible triad, hence providing a precursor for the occurrence of extreme dissipation events. Specifically, monitoring the energy transfers within this triad allows us to develop a data-driven short-term predictor for the intermittent bursts of energy dissipation. We assess the performance of this predictor through direct numerical simulations. PMID:28948226
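The core idea of seeking extreme-event triggers as solutions of a constrained optimization problem can be illustrated on a toy quadratic observable: maximize f(u) over states u subject to a fixed "energy" constraint that keeps the optimizer on plausible states. The matrix, constraint and variable names below are illustrative stand-ins for the paper's Navier-Stokes setting, not its actual formulation.

```python
import numpy as np
from scipy.optimize import minimize

# Toy observable f(u) = u^T A u; its constrained maximizer on the unit
# sphere is the top eigenvector of A (here the first coordinate axis).
A = np.diag([3.0, 1.0, 0.5])

def neg_observable(u):
    return -u @ A @ u                  # minimize the negative to maximize

# "Physical admissibility" stand-in: fixed energy ||u||^2 = 1.
constraint = {"type": "eq", "fun": lambda u: u @ u - 1.0}
result = minimize(neg_observable, x0=np.array([0.5, 0.5, 0.5]),
                  constraints=[constraint])

u_star = result.x
print(f"maximizer: {np.round(u_star, 3)}, observable value: {-result.fun:.2f}")
```

In the paper the observable is the energy dissipation of the Kolmogorov flow and the constraints encode probabilistic feasibility; the constrained maximizer then identifies the triad of modes whose energy transfer serves as a precursor of extreme bursts.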
Labouliere, Christa D; Kleinman, Marjorie; Gould, Madelyn S
2015-04-01
The majority of suicidal adolescents have no contact with mental health services, and reduced help-seeking in this population further lessens the likelihood of accessing treatment. A commonly-reported reason for not seeking help is youths' perception that they should solve problems on their own. In this study, we explore associations between extreme self-reliance behavior (i.e., solving problems on your own all of the time), help-seeking behavior, and mental health symptoms in a community sample of adolescents. Approximately 2150 adolescents, across six schools, participated in a school-based suicide prevention screening program, and a subset of at-risk youth completed a follow-up interview two years later. Extreme self-reliance was associated with reduced help-seeking, clinically-significant depressive symptoms, and serious suicidal ideation at the baseline screening. Furthermore, in a subset of youth identified as at-risk at the baseline screening, extreme self-reliance predicted level of suicidal ideation and depressive symptoms two years later even after controlling for baseline symptoms. Given these findings, attitudes that reinforce extreme self-reliance behavior may be an important target for youth suicide prevention programs. Reducing extreme self-reliance in youth with suicidality may increase their likelihood of appropriate help-seeking and concomitant reductions in symptoms.
Pediatric lower extremity mower injuries.
Hill, Sean M; Elwood, Eric T
2011-09-01
Lawn mower injuries in children represent an unfortunately common problem for the plastic reconstructive surgeon; approximately 68,000 per year are reported in the United States. Compounding this problem is the fact that a standard treatment algorithm does not exist. This study follows a series of 7 pediatric patients treated for lower extremity mower injuries by a single plastic surgeon. The extent of soft tissue injury varied. All patients were treated with negative pressure wound therapy as a bridge to definitive closure. Of the 7 patients, 4 required skin grafts, 1 required primary closure, 1 underwent a lower extremity amputation secondary to wounds, and 1 was repaired using a cross-leg flap. Functional limitations were minimal for all of our patients after reconstruction. Our basic treatment algorithm is presented: initial debridement followed by the simplest method possible for wound closure, using negative pressure wound therapy if necessary.
Visual Tracking Based on Extreme Learning Machine and Sparse Representation
Wang, Baoxian; Tang, Linbo; Yang, Jinglin; Zhao, Baojun; Wang, Shuigen
2015-01-01
Existing sparse representation-based visual trackers mostly suffer from being time-consuming and from poor robustness. To address these issues, a novel tracking method is presented that combines sparse representation with an emerging learning technique, namely the extreme learning machine (ELM). Specifically, visual tracking can be divided into two consecutive processes. Firstly, ELM is utilized to find the optimal separating hyperplane between the target observations and background ones. Thus, the trained ELM classification function is able to efficiently remove most of the candidate samples related to background contents, thereby reducing the total computational cost of the following sparse representation. Secondly, to further combine ELM and sparse representation, the resultant confidence values (i.e., probabilities of being a target) of samples on the ELM classification function are used to construct a new manifold learning constraint term of the sparse representation framework, which tends to achieve more robust results. Moreover, the accelerated proximal gradient method is used for deriving the optimal solution (in matrix form) of the constrained sparse tracking model. Additionally, the matrix-form solution allows the candidate samples to be calculated in parallel, thereby leading to higher efficiency. Experiments demonstrate the effectiveness of the proposed tracker. PMID:26506359
NASA Astrophysics Data System (ADS)
Brown, J. C.; Mallik, P. C. V.; Badnell, N. R.
2010-06-01
Brown and Mallik (BM) recently claimed that non-thermal recombination (NTR) can be a dominant source of flare hard X-rays (HXRs) from hot coronal and chromospheric sources. However, major discrepancies between the thermal continua predicted by BM and by the Chianti database, as well as RHESSI flare data, led us to discover substantial errors in the heuristic expression used by BM to extend the Kramers expressions beyond the hydrogenic case. Here we present the relevant corrected expressions and show the key modified results. We conclude that, in most cases, NTR emission was overestimated by a factor of 1-8 by BM but is typically still large enough (as much as 20-30% of the total emission) to be very important for electron spectral inference and for the detection of electron spectral features such as low-energy cut-offs, since the recombination spectra contain sharp edges. For extreme temperature regimes, and/or if the Fe abundance were as high as some values claimed, NTR could even be the dominant source of flare HXRs, reducing the electron number and energy budget problems, such as in the extreme coronal HXR source cases reported by e.g. Krucker et al.
Probability distribution of extreme share returns in Malaysia
NASA Astrophysics Data System (ADS)
Zin, Wan Zawiah Wan; Safari, Muhammad Aslam Mohd; Jaaman, Saiful Hafizah; Yie, Wendy Ling Shin
2014-09-01
The objective of this study is to investigate suitable probability distributions to model extreme share returns in Malaysia. To achieve this, weekly and monthly maximum daily share returns are derived from share price data obtained from Bursa Malaysia over the period 2000 to 2012. The study starts with summary statistics of the data, which provide a clue to the likely candidates for the best-fitting distribution. Next, the suitability of six distributions, namely the Gumbel, Generalized Extreme Value (GEV), Generalized Logistic (GLO), Generalized Pareto (GPA), Lognormal (GNO) and Pearson Type III (PE3) distributions, is evaluated. The method of L-moments is used in parameter estimation. Based on several goodness-of-fit tests and the L-moment diagram test, the Generalized Pareto distribution and the Pearson distribution are found to be the best-fitting distributions to represent the weekly and monthly maximum share returns, respectively, in the Malaysian stock market during the studied period.
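The sample L-moments used for parameter estimation in studies like this one can be computed directly from order statistics via probability-weighted moments. The Gumbel sample below is a synthetic stand-in for the share-return maxima; for that distribution the L-scale is ln 2 and the L-skewness is about 0.17, which the estimates should recover.

```python
import numpy as np

def sample_l_moments(x):
    """First two sample L-moments and the L-skewness ratio, computed
    from probability-weighted moments (a minimal sketch)."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    i = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((i - 1) / (n - 1) * x) / n
    b2 = np.sum((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x) / n
    l1 = b0                      # L-location (the mean)
    l2 = 2 * b1 - b0             # L-scale
    l3 = 6 * b2 - 6 * b1 + b0    # third L-moment
    return l1, l2, l3 / l2       # (l1, l2, L-skewness t3)

rng = np.random.default_rng(3)
maxima = rng.gumbel(loc=0.0, scale=1.0, size=100_000)  # synthetic maxima
l1, l2, t3 = sample_l_moments(maxima)
print(f"l1={l1:.3f}, l2={l2:.3f}, t3={t3:.3f}")
```

Plotting the (t3, t4) pairs of the data against each candidate family's theoretical curve is exactly the "L-moment diagram test" the abstract mentions for picking the best-fitting distribution.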
Trukhmanov, I M; Suslova, G A; Ponomarenko, G N
This paper characterizes the informative value of a functional step test, performed with heel cushions, for the differential diagnosis of anatomic versus functional differences in lower-extremity length in children. A total of 85 schoolchildren with unequal lower-extremity lengths were examined, and the results of clinical and instrumental examinations were compared. The data obtained with the functional step test give evidence of its very high sensitivity, specificity, and clinical significance as a tool for examining children with unequal lower-extremity lengths. It is concluded that the test is one of the most informative predictors of the effectiveness of rehabilitation in these children.
Extreme value laws for fractal intensity functions in dynamical systems: Minkowski analysis
NASA Astrophysics Data System (ADS)
Mantica, Giorgio; Perotti, Luca
2016-09-01
Typically, in the dynamical theory of extremal events, the function that gauges the intensity of a phenomenon is assumed to be convex and maximal, or singular, at a single point, or at most a finite collection of points, in phase space. In this paper we generalize this situation to fractal landscapes, i.e. intensity functions characterized by an uncountable set of singularities located on a Cantor set. This reveals the dynamical rôle of classical quantities like the Minkowski dimension and content, whose definitions we extend to account for singular continuous invariant measures. We also introduce the concept of an extremely rare event, quantified by non-standard Minkowski constants, and we study its consequences for extreme value statistics. Limit laws are derived from formal calculations and are verified by numerical experiments. Dedicated to the memory of Joseph Ford, on the twentieth anniversary of his departure.
Momentum broadening in unstable quark-gluon plasma
Carrington, M. E.; Mrówczyński, St.; Schenke, B.
2017-02-01
Quark-gluon plasma produced at the early stage of ultrarelativistic heavy-ion collisions is unstable, if weakly coupled, due to the anisotropy of its momentum distribution. Chromomagnetic fields are spontaneously generated and can reach magnitudes much exceeding typical values of the fields in equilibrated plasma. We consider a high-energy test parton traversing an unstable plasma populated with strong fields, and we study the momentum broadening parameter $\hat{q}$, which determines the radiative energy loss of the test parton. We develop a formalism that gives $\hat{q}$ as the solution of an initial value problem, and we focus on extremely oblate plasmas, which are physically relevant for relativistic heavy-ion collisions. The parameter $\hat{q}$ is found to be strongly time dependent: for short times it is of the order of the equilibrium value, but at later times it grows exponentially due to the interaction of the test parton with unstable modes and becomes much bigger than the value in equilibrium. The momentum broadening is also strongly directionally dependent and is largest when the test parton velocity is transverse to the beam axis. Finally, consequences of our findings for the phenomenology of jet quenching in relativistic heavy-ion collisions are briefly discussed.
A dependence modelling study of extreme rainfall in Madeira Island
NASA Astrophysics Data System (ADS)
Gouveia-Reis, Délia; Guerreiro Lopes, Luiz; Mendonça, Sandra
2016-08-01
The dependence between variables plays a central role in multivariate extremes. In this paper, spatial dependence of Madeira Island's rainfall data is addressed within an extreme value copula approach through an analysis of maximum annual data. The impact of altitude, slope orientation, distance between rain gauge stations and distance from the stations to the sea are investigated for two different periods of time. The results obtained highlight the influence of the island's complex topography on the spatial distribution of extreme rainfall in Madeira Island.
Novel Atmospheric and Sea State Modeling in Ocean Energy Applications
NASA Astrophysics Data System (ADS)
Kallos, George; Galanis, George; Kalogeri, Christina; Larsen, Xiaoli Guo
2013-04-01
The rapidly increasing use of renewable energy sources poses new challenges for the research and technological community. Among them are the integration of the usually highly variable wind and wave energy output into the general grid, the optimization of energy transition, and the forecasting of extreme values that could lead to instabilities and failures of the system. In the present work, novel methodologies addressing such problems, based on state-of-the-art numerical wind/wave simulation systems and advanced statistical techniques, are discussed. In particular, extremely high-resolution modeling systems, simulating atmospheric and sea state conditions with a spatial resolution of 100 meters or less and a temporal discretization of a few seconds, are used to simulate in detail the combined wind-wave energy potential at offshore sites. In addition, a statistical analysis based on a variety of mean and variation measures, as well as univariate and bivariate probability distributions, is used to estimate the variability of the power potential, revealing the advantages of combined forms of energy from offshore platforms able to produce wind and wave power simultaneously. The estimation and prediction of extreme wind/wave conditions, a critical issue both for site assessment and infrastructure maintenance, is also studied by means of the 50-year return period over areas with increased power potential. This work has been carried out within the framework of the FP7 project MARINA Platform (http://www.marina-platform.info/index.aspx).
ERIC Educational Resources Information Center
Mahan, Luther A.
1970-01-01
Compares the effects of two problem-solving teaching approaches. Lower ability students in an activity group demonstrated superior growth in basic science understanding, problem-solving skills, science interests, personal adjustment, and school attitudes. Neither method favored cognitive learning by higher ability students. (PR)
Resource and Information Maintenance of Foreign Citizens in Russia: Statement of a Problem
ERIC Educational Resources Information Center
Dorozhkin, Evgenij M.; Leontyeva, Tatyana V.; Shchetynina, Anna V.; Krivtsov, Artem I.
2016-01-01
The relevance of the problem studied stems from the fact that, in a multiethnic country, the problem of the ethno-cultural specificity of different groups of people is extremely severe, and the intensity of intercultural communication in the modern world requires knowledge and understanding of other cultures. The aim of the…
A New Approach to Extreme Value Estimation Applicable to a Wide Variety of Random Variables
NASA Technical Reports Server (NTRS)
Holland, Frederic A., Jr.
1997-01-01
Designing reliable structures requires an estimate of the maximum and minimum values (i.e., strength and load) that may be encountered in service. Yet designs based on very extreme values (to ensure safety) can result in extra material usage and, hence, uneconomic systems. In aerospace applications, severe over-design cannot be tolerated, making it almost mandatory to design closer to the assumed limits of the design random variables. The issue, then, is predicting extreme values that are practical, i.e., neither too conservative nor unconservative. Obtaining design values by employing safety factors is well known to often result in overly conservative designs, and safety factor values have historically been selected rather arbitrarily, often lacking a sound rational basis. The question of how safe a design needs to be has led design theorists to probabilistic and statistical methods. The so-called three-sigma approach is one such method and has been described as the first step in utilizing information about the data dispersion. However, this method is based on the assumption that the random variable is dispersed symmetrically about the mean and is essentially limited to normally distributed random variables. Use of this method can therefore result in unsafe or overly conservative design allowables if the common assumption of normality is incorrect.
Eum, Hyung-Il; Gachon, Philippe; Laprise, René
2016-01-01
This study examined the impact of model biases on climate change signals for daily precipitation and for minimum and maximum temperatures. Through the use of multiple climate scenarios from 12 regional climate model simulations, the ensemble mean, and three synthetic simulations generated by a weighting procedure, we investigated intermodel seasonal climate change signals between current and future periods, for both median and extreme precipitation/temperature values. A significant dependence of seasonal climate change signals on the model biases over southern Québec in Canada was detected for temperatures, but not for precipitation. This suggests that the regional temperature change signal is affected by local processes. Seasonally, model bias affects future mean and extreme values in winter and summer. In addition, potentially large increases in future extremes of temperature and precipitation values were projected. For three synthetic scenarios, systematically less bias and a narrow range of mean change for all variables were projected compared to those of climate model simulations. In addition, synthetic scenarios were found to better capture the spatial variability of extreme cold temperatures than the ensemble mean scenario. Finally, these results indicate that the synthetic scenarios have greater potential to reduce the uncertainty of future climate projections and capture the spatial variability of extreme climate events.
NASA Astrophysics Data System (ADS)
Rieder, H. E.; Staehelin, J.; Maeder, J. A.; Ribatet, M.; Stübi, R.; Weihs, P.; Holawe, F.; Peter, T.; Davison, A. C.
2009-04-01
Over the last few decades, negative trends in stratospheric ozone have been studied because of the direct link between decreasing stratospheric ozone and increasing surface UV radiation. Recently, a discussion on ozone recovery has begun. Long-term measurements of total ozone extending back earlier than 1958 are limited and only available from a few stations in the northern hemisphere. The world's longest total ozone record is available from Arosa, Switzerland (Staehelin et al., 1998a,b), where total ozone measurements have been made since late 1926 through the present day. Within this study (Rieder et al., 2009), new tools from extreme value theory (e.g. Coles, 2001; Ribatet, 2007) are applied to select mathematically well-defined thresholds for extremely low and extremely high total ozone. A heavy-tail focused approach is used by fitting the Generalized Pareto Distribution (GPD) to the Arosa time series. Asymptotic arguments (Pickands, 1975) justify the use of the GPD for modeling exceedances over a sufficiently high (or below a sufficiently low) threshold (Coles, 2001). More precisely, the GPD is the limiting distribution of normalized excesses over a threshold, as the threshold approaches the endpoint of the distribution. In practice, GPD parameters are fitted to exceedances by maximum likelihood or other methods, such as probability-weighted moments. A preliminary step consists of defining an appropriate threshold for which the asymptotic GPD approximation holds. Suitable tools for threshold selection, such as the MRL-plot (mean residual life plot) and the TC-plot (stability plot) from the POT package (Ribatet, 2007), are presented. The frequency distribution of extremes in low (termed ELOs) and high (termed EHOs) total ozone and their influence on the long-term changes in total ozone are analyzed. Further, it is shown that from the GPD model the distribution of so-called ozone mini holes (e.g. 
Bojkov and Balis, 2001) can be precisely estimated and that the "extremes concept" provides new information on the data distribution and variability within the Arosa record as well as on the influence of ELOs and EHOs on the long-term trends of the ozone time series. References: Bojkov, R. D., and Balis, D.S.: Characteristics of episodes with extremely low ozone values in the northern middle latitudes 1975-2000, Ann. Geophys., 19, 797-807, 2001. Coles, S.: An Introduction to Statistical Modeling of Extreme Values, Springer Series in Statistics, ISBN:1852334592, Springer, Berlin, 2001. Pickands, J.: Statistical inference using extreme order statistics, Ann. Stat., 3, 1, 119-131, 1975. Ribatet, M.: POT: Modelling peaks over a threshold, R News, 7, 34-36, 2007. Rieder, H.E., Staehelin, J., Maeder, J.A., Stübi, R., Weihs, P., Holawe, F., and M. Ribatet: From ozone mini holes and mini highs towards extreme value theory: New insights from extreme events and non stationarity, submitted to J. Geophys. Res., 2009. Staehelin, J., Kegel, R., and Harris, N. R.: Trend analysis of the homogenized total ozone series of Arosa (Switzerland), 1929-1996, J. Geophys. Res., 103(D7), 8389-8400, doi:10.1029/97JD03650, 1998a. Staehelin, J., Renaud, A., Bader, J., McPeters, R., Viatte, P., Hoegger, B., Bugnion, V., Giroud, M., and Schill, H.: Total ozone series at Arosa (Switzerland): Homogenization and data comparison, J. Geophys. Res., 103(D5), 5827-5842, doi:10.1029/97JD02402, 1998b.
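A minimal peaks-over-threshold sketch in the spirit of the procedure described above. Synthetic data replace the Arosa record, and a fixed 95th-percentile threshold replaces the MRL/TC-plot selection, so all numbers are illustrative:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Synthetic total ozone series (Dobson units); the real analysis uses Arosa data.
ozone = rng.normal(330.0, 30.0, size=20_000)

# Choose a high threshold (here the 95th percentile; the study instead selects
# it via MRL- and TC-plots from the POT package).
u = np.quantile(ozone, 0.95)
excesses = ozone[ozone > u] - u

# Fit the GPD to the excesses by maximum likelihood, with location fixed at 0.
xi, loc, sigma = stats.genpareto.fit(excesses, floc=0.0)

# Probability of exceeding a level z > u, combining the empirical exceedance
# rate p_u with the fitted GPD tail: P(X > z) = p_u * (1 + xi*(z-u)/sigma)^(-1/xi).
p_u = excesses.size / ozone.size
z = u + 10.0
p_exceed = p_u * stats.genpareto.sf(z - u, xi, loc=0.0, scale=sigma)
```

The same machinery applied below a low threshold (to the negated series) gives the ELO side of the analysis.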
400 Years of summer hydroclimate from stable isotopes in Iberian trees
NASA Astrophysics Data System (ADS)
Andreu-Hayles, Laia; Ummenhofer, Caroline C.; Barriendos, Mariano; Schleser, Gerhard H.; Helle, Gerhard; Leuenberger, Markus; Gutiérrez, Emilia; Cook, Edward R.
2017-07-01
Tree rings are natural archives that annually record distinct types of past climate variability depending on the parameters measured. Here, we use ring-width and stable isotopes in cellulose of trees from the northwestern Iberian Peninsula (IP) to understand regional summer hydroclimate over the last 400 years and the associated atmospheric patterns. Correlations between tree rings and climate data demonstrate that isotope signatures in the targeted Iberian pine forests are very sensitive to water availability during the summer period, and are mainly controlled by stomatal conductance. Non-linear methods based on extreme events analysis allow for capturing distinct seasonal climatic variability recorded by tree-ring parameters and asymmetric signals of the associated atmospheric features. Moreover, years with extreme high (low) values in the tree-ring records were characterised by coherent large-scale atmospheric circulation patterns with reduced (enhanced) moisture transport onto the northwestern IP. These analyses of extremes revealed that high/low proxy values do not necessarily correspond to mirror images in the atmospheric anomaly patterns, suggesting different drivers of these patterns and the corresponding signature recorded in the proxies. Regional hydroclimate features across the broader IP and western Europe during extreme wet/dry summers detected by the northwestern IP trees compare favourably to independent multicentury sea level pressure and drought reconstructions for Europe. Historical records also validate our findings that attribute non-linear moisture signals recorded by extreme tree-ring values to distinct large-scale atmospheric patterns and allow for 400-year reconstructions of the frequency of occurrence of extreme conditions in late spring and summer hydroclimate.
400 years of summer hydroclimate from stable isotopes in Iberian trees
NASA Astrophysics Data System (ADS)
Andreu-Hayles, Laia; Ummenhofer, Caroline C.; Barriendos, Mariano; Schleser, Gerhard H.; Helle, Gerhard; Leuenberger, Markus; Gutierrez, Emilia; Cook, Edward R.
2017-04-01
Tree rings are natural archives that annually record distinct types of past climate variability depending on the parameters measured. Here, we use ring-width and stable isotopes in cellulose of trees from the northwestern Iberian Peninsula (IP) to understand regional summer hydroclimate over the last 400 years and the associated atmospheric patterns. Correlations between tree rings and climate data demonstrate that isotope signatures in the targeted Iberian pine forests are very sensitive to water availability during the summer period, and are mainly controlled by stomatal conductance. Non-linear methods based on extreme events analysis allow for capturing distinct seasonal climatic variability recorded by tree-ring parameters and asymmetric signals of the associated atmospheric features. Moreover, years with extreme high (low) values in the tree-ring records were characterised by coherent large-scale atmospheric circulation patterns with reduced (enhanced) moisture transport onto the northwestern IP. These analyses of extremes revealed that high/low proxy values do not necessarily correspond to mirror images in the atmospheric anomaly patterns, suggesting different drivers of these patterns and the corresponding signature recorded in the proxies. Regional hydroclimate features across the broader IP and western Europe during extreme wet/dry summers detected by the northwestern IP trees compare favourably to an independent multicentury sea level pressure and drought reconstruction for Europe. Historical records also validate our findings that attribute non-linear moisture signals recorded by extreme tree-ring values to distinct large-scale atmospheric patterns and allow for 400-yr reconstructions of the frequency of occurrence of extreme conditions in summer hydroclimate. We will discuss how the results for Lillo compare with other records.
NASA Astrophysics Data System (ADS)
García-Cueto, O. Rafael; Cavazos, M. Tereza; de Grau, Pamela; Santillán-Soto, Néstor
2014-04-01
The generalized extreme value distribution is applied in this article to model the statistical behavior of the maximum and minimum temperature distribution tails in four cities of Baja California in northwestern Mexico, using data from 1950-2010. The block maxima approach over annual time blocks was used. Temporal trends were included as covariates in the location parameter (μ), which resulted in significant improvements to the proposed models, particularly for the extreme maximum temperature values in the cities of Mexicali, Tijuana, and Tecate, and the extreme minimum temperature values in Mexicali and Ensenada. These models were used to estimate future probabilities over the next 100 years (2015-2110) for different time periods, and they were compared with changes in the extreme (90th and 10th) percentiles of maximum and minimum temperature scenarios for a set of six general circulation models under low (RCP4.5) and high (RCP8.5) radiative forcings. By the end of the twenty-first century, the projected changes in extreme maximum summer temperature are of the same order in the statistical model and the high radiative scenario (increases of 4-5 °C); the low radiative scenario is more conservative (increases of 2-3 °C). The winter scenario shows that minimum temperatures could be less severe; the temperature increases suggested by the probabilistic model are greater than those projected for the end of the century by the set of global models under the RCP4.5 and RCP8.5 scenarios. The likely impacts on the region are discussed.
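The nonstationary fit described above, a GEV whose location parameter carries a temporal trend, can be sketched by direct likelihood maximization. The data and parameter values here are synthetic, and note that scipy's shape parameter `c` follows the opposite sign convention to the usual GEV ξ:

```python
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(2)

# Synthetic annual maximum temperatures (deg C) with a warming trend.
years = np.arange(1950, 2011)
t = (years - years[0]) / 10.0                # decades since 1950
tmax = 42.0 + 0.15 * t + stats.genextreme.rvs(c=0.1, scale=1.2,
                                              size=years.size, random_state=rng)

def nll(params):
    """Negative log-likelihood of a GEV with mu(t) = mu0 + mu1 * t."""
    mu0, mu1, log_sigma, c = params
    return -np.sum(stats.genextreme.logpdf(tmax, c=c, loc=mu0 + mu1 * t,
                                           scale=np.exp(log_sigma)))

x0 = [tmax.mean(), 0.0, 0.0, 0.1]
res = optimize.minimize(nll, x0, method="Nelder-Mead")
mu0_hat, mu1_hat, log_sigma_hat, c_hat = res.x
```

A likelihood-ratio test of mu1 = 0 against this model is the usual way to judge whether the trend covariate yields the "significant improvement" the abstract reports.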
Food in health security in North East Asia.
Moon, Hyun-Kyung
2009-01-01
Food and health security in North East Asia, including South Korea, North Korea, China and Japan, is compared. Because this region contains countries with many complex problems, it is worthwhile to study the current situation. With about 24% of the world's population, the North East Asian countries all supply between 2400 and 3000 kcal of energy. Regarding health status, two extreme problems exist: malnutrition in North Korea and China, and chronic degenerative disease in Japan, South Korea and China. Because the quality, quantity and safety of the food supply have to be secured for health security, several topics are selected and discussed. 1) World food prices can affect food security in countries with a low food self-sufficiency rate, such as Japan and Korea, especially for the urban poor. 2) Population aging can increase the number of aged people without food security; an aged population with less income and no support from their offspring, because of disappearing traditional values, may face food insecurity. 3) Population growth and economic growth in this region may worsen food problems: since a quarter of the world's population resides in this region, populations will continue to increase, and with economic growth people will consume more animal products. 4) Climate change creates food production problems; as industry progresses, there will be less land for food and more pollutants in the environment. 5) Political instability will cause food insecurity, and conflict will cause problems with regard to food aid.
NASA Technical Reports Server (NTRS)
Kuwata, Yoshiaki; Pavone, Marco; Balaram, J. (Bob)
2012-01-01
This paper presents a novel risk-constrained multi-stage decision-making approach to the architectural analysis of planetary rover missions. In particular, focusing on a 2018 Mars rover concept, which was considered as part of a potential Mars Sample Return campaign, we model the entry, descent, and landing (EDL) phase and the rover traverse phase as four sequential decision-making stages. The problem is to find a sequence of divert and driving maneuvers such that the rover drive distance is minimized and the probability of mission failure (e.g., due to a failed landing) is below a user-specified bound. By solving this problem for several different values of the model parameters (e.g., divert authority), this approach enables rigorous, accurate, and systematic trade-offs for the EDL system vs. the mobility system and, more generally, cross-domain trade-offs for the different phases of a space mission. The overall optimization problem can be seen as a chance-constrained dynamic programming problem, with the additional complexity that 1) in some stages the disturbances do not have any probabilistic characterization, and 2) the state space is extremely large (i.e., hundreds of millions of states for trade-offs with high-resolution Martian maps). We therefore solve the problem by performing an unconventional combination of average and minimax cost analysis and by leveraging highly efficient computational tools from the image processing community. Preliminary trade-off results are presented.
Kodejska, Milos; Mokry, Pavel; Linhart, Vaclav; Vaclavik, Jan; Sluka, Tomas
2012-12-01
An adaptive system for the suppression of vibration transmission using a single piezoelectric actuator shunted by a negative capacitance circuit is presented. It is known that by using a negative-capacitance shunt, the spring constant of a piezoelectric actuator can be controlled to extreme values of zero or infinity. Because the value of spring constant controls a force transmitted through an elastic element, it is possible to achieve a reduction of transmissibility of vibrations through the use of a piezoelectric actuator by reducing its effective spring constant. Narrow frequency range and broad frequency range vibration isolation systems are analyzed, modeled, and experimentally investigated. The problem of high sensitivity of the vibration control system to varying operational conditions is resolved by applying an adaptive control to the circuit parameters of the negative capacitor. A control law that is based on the estimation of the value of the effective spring constant of a shunted piezoelectric actuator is presented. An adaptive system which achieves a self-adjustment of the negative capacitor parameters is presented. It is shown that such an arrangement allows the design of a simple electronic system which offers a great vibration isolation efficiency under variable vibration conditions.
Study of eating attitudes and behaviours in junior college students in Mumbai, India.
Tendulkar, Prajakta; Krishnadas, Rajeev; Durge, Vijay; Sharma, Sumit; Nayak, Sapna; Kamat, Sanjeev; Dhavale, Hemangee
2006-10-01
Eating disorders have been described as possible 'culture-bound syndromes', with roots in Western cultural values and conflicts. They may, in fact, be more prevalent within various non-Western cultural groups than previously recognised, as Western values become more widely accepted. Cross-cultural experiences suggest that cultural change itself may be associated with increased vulnerability to eating disorders, especially when Western values about physical aesthetics are involved. To assess eating attitudes and behaviours among adolescents in the urban ethnic city of Mumbai, a survey was conducted amongst 451 college students. The study, based in four junior colleges, comprised 451 subjects who completed a semi-structured questionnaire, the 26-item Eating Attitudes Test (EAT-26) and the Personal Assessment Inventory (IPAT). The results revealed faulty eating habits in 13.3% of the subjects. A statistically significant proportion perceived themselves to have problems with eating, substance use, dieting and exercise practices, resorting to extreme measures to achieve weight loss. A high rate of faulty eating habits was observed in males. Higher scores on depression and suicidal ideation were reported in the population with faulty eating habits. A significant percentage of college-going populations in urban settings probably have faulty eating habits.
Realistic anomaly-mediated supersymmetry breaking
NASA Astrophysics Data System (ADS)
Chacko, Zacharia; Luty, Markus A.; Maksymyk, Ivan; Pontón, Eduardo
2000-03-01
We consider supersymmetry breaking communicated entirely by the superconformal anomaly in supergravity. This scenario is naturally realized if supersymmetry is broken in a hidden sector whose couplings to the observable sector are suppressed by more than powers of the Planck scale, as occurs if supersymmetry is broken in a parallel universe living in extra dimensions. This scenario is extremely predictive: soft supersymmetry breaking couplings are completely determined by anomalous dimensions in the effective theory at the weak scale. Gaugino and scalar masses are naturally of the same order, and flavor-changing neutral currents are automatically suppressed. The most glaring problem with this scenario is that slepton masses are negative in the minimal supersymmetric standard model. We point out that this problem can be simply solved by coupling extra Higgs doublets to the leptons. Lepton flavor-changing neutral currents can be naturally avoided by approximate symmetries. We also describe more speculative solutions involving compositeness near the weak scale. We then turn to electroweak symmetry breaking. Adding an explicit μ term gives a value for Bμ that is too large by a factor of ~ 100. We construct a realistic model in which the μ term arises from the vacuum expectation value of a singlet field, so all weak-scale masses are directly related to m3/2. We show that fully realistic electroweak symmetry breaking can occur in this model with moderate fine-tuning.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kansa, E.J.; Axelrod, M.C.; Kercher, J.R.
1994-05-01
Our current research into the response of natural ecosystems to a hypothesized climatic change requires estimates of various meteorological variables on a regularly spaced grid of points on the surface of the earth. Unfortunately, the bulk of the world's meteorological measurement stations are located at airports, which tend to be concentrated on the coastlines of the world or near populated areas. The spatial density of the station locations is extremely non-uniform, with the greatest density in the USA, followed by Western Europe. Furthermore, the density of airports is rather sparse in desert regions such as the Sahara, the Arabian, Gobi, and Australian deserts; likewise, the density is quite sparse in cold regions such as Antarctica, northern Canada, and interior northern Russia. The Amazon Basin in Brazil has few airports. The frequency of airports is obviously related to the population centers and the degree of industrial development of a country. We address the following problem here: given values of meteorological variables, such as maximum monthly temperature, measured at more than 5,500 airport stations, interpolate these values onto a regular grid of terrestrial points spaced by one degree in both latitude and longitude. This is known as the scattered data problem.
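A sketch of the scattered data problem as stated: interpolate irregular station values onto a regular one-degree grid. The stations are synthetic, and the thin-plate-spline radial basis function interpolant is an assumption here (RBF collocation being the family of methods associated with Kansa), not necessarily the scheme the report uses:

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(3)

# Hypothetical station locations (lon, lat) and monthly maximum temperatures;
# the real problem uses ~5,500 airport stations worldwide.
stations = rng.uniform([-120.0, 25.0], [-70.0, 50.0], size=(500, 2))
temps = 40.0 - 0.6 * (stations[:, 1] - 25.0) + rng.normal(0.0, 0.5, size=500)

# Thin-plate-spline RBF interpolant with mild smoothing (kernel choice is an
# illustrative assumption, not taken from the report).
interp = RBFInterpolator(stations, temps, kernel="thin_plate_spline", smoothing=1.0)

# Regular one-degree grid over the same region.
lon, lat = np.meshgrid(np.arange(-120.0, -69.0), np.arange(25.0, 51.0))
grid_pts = np.column_stack([lon.ravel(), lat.ravel()])
grid_temps = interp(grid_pts).reshape(lon.shape)
```

For the global problem, great-circle distances (or 3-D Cartesian coordinates on the sphere) would replace the flat lon/lat treatment of this toy example.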
Extreme Mean and Its Applications
NASA Technical Reports Server (NTRS)
Swaroop, R.; Brownlow, J. D.
1979-01-01
Extreme value statistics obtained from normally distributed data are considered. An extreme mean is defined as the mean of the normal distribution truncated at the p-th quantile. An unbiased estimate of this extreme mean and its large-sample distribution are derived. The distribution of this estimate, even for very large samples, is found to be non-normal. Further, as the sample size increases, the variance of the unbiased estimate converges to the Cramer-Rao lower bound. The computer program used to obtain the density and distribution functions of the standardized unbiased estimate, and the confidence intervals of the extreme mean for any data, is included for ready application. An example is included to demonstrate the usefulness of the extreme mean in application.
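For the upper tail of a normal distribution, the extreme mean described above has a closed form through the inverse Mills ratio; the sketch below checks it against scipy's truncated normal. The numerical values are illustrative, not from the report:

```python
import numpy as np
from scipy import stats

def extreme_mean(mu, sigma, p):
    """Mean of a N(mu, sigma^2) population beyond its p-th quantile.
    For a standard normal truncated at z_p this is the inverse Mills
    ratio phi(z_p) / (1 - Phi(z_p))."""
    z_p = stats.norm.ppf(p)
    mills = stats.norm.pdf(z_p) / stats.norm.sf(z_p)
    return mu + sigma * mills

# Example: mean of the top 5% of a N(100, 15^2) population.
em = extreme_mean(100.0, 15.0, 0.95)   # approx 130.9

# Cross-check against scipy's truncated normal (truncated below at z_p).
z_p = stats.norm.ppf(0.95)
em_check = stats.truncnorm.mean(z_p, np.inf, loc=100.0, scale=15.0)
```

Note that the extreme mean always exceeds the truncation point itself, which is what makes it a stricter design value than a simple quantile.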
Parallel Computation of Flow in Heterogeneous Media Modelled by Mixed Finite Elements
NASA Astrophysics Data System (ADS)
Cliffe, K. A.; Graham, I. G.; Scheichl, R.; Stals, L.
2000-11-01
In this paper we describe a fast parallel method for solving highly ill-conditioned saddle-point systems arising from mixed finite element simulations of stochastic partial differential equations (PDEs) modelling flow in heterogeneous media. Each realisation of these stochastic PDEs requires the solution of the linear first-order velocity-pressure system comprising Darcy's law coupled with an incompressibility constraint. The chief difficulty is that the permeability may be highly variable, especially when the statistical model has a large variance and a small correlation length. For reasonable accuracy, the discretisation has to be extremely fine. We solve these problems by first reducing the saddle-point formulation to a symmetric positive definite (SPD) problem using a suitable basis for the space of divergence-free velocities. The reduced problem is solved using parallel conjugate gradients preconditioned with an algebraically determined additive Schwarz domain decomposition preconditioner. The result is a solver which exhibits a good degree of robustness with respect to the mesh size as well as to the variance and to physically relevant values of the correlation length of the underlying permeability field. Numerical experiments exhibit almost optimal levels of parallel efficiency. The domain decomposition solver (DOUG, http://www.maths.bath.ac.uk/~parsoft) used here not only is applicable to this problem but can be used to solve general unstructured finite element systems on a wide range of parallel architectures.
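The inner solver described above, conjugate gradients on the reduced SPD system, can be sketched generically. A plain Jacobi preconditioner stands in for the paper's additive Schwarz domain decomposition, and the synthetic matrix mimics a permeability field whose coefficients vary over six orders of magnitude:

```python
import numpy as np

def pcg(A, b, M_inv, tol=1e-10, maxiter=500):
    """Preconditioned conjugate gradients for SPD A; M_inv applies the
    preconditioner. Returns the solution and the iteration count."""
    x = np.zeros_like(b)
    r = b - A @ x
    z = M_inv(r)
    p = z.copy()
    rz = r @ z
    for k in range(maxiter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol * np.linalg.norm(b):
            return x, k + 1
        z = M_inv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x, maxiter

rng = np.random.default_rng(4)
n = 200
# SPD matrix A = D^(1/2) (I + S) D^(1/2) with ||S||_2 = 0.5, so A stays positive
# definite while its diagonal scale d varies over six orders of magnitude.
d = 10.0 ** rng.uniform(-3.0, 3.0, size=n)
S = rng.normal(size=(n, n))
S = 0.5 * (S + S.T)
S *= 0.5 / np.abs(np.linalg.eigvalsh(S)).max()
Dh = np.sqrt(d)
A = Dh[:, None] * (np.eye(n) + S) * Dh[None, :]
b = rng.normal(size=n)

# Jacobi (diagonal) preconditioning removes most of the ill-conditioning here.
diag = np.diag(A).copy()
x, iters = pcg(A, b, lambda r: r / diag)
```

With the preconditioner removed (`M_inv = lambda r: r`), the same routine needs far more iterations; robustness of the iteration count against coefficient variability is exactly what the paper's Schwarz preconditioner delivers at scale.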
2010-04-01
…the response of damage dependent processes like fatigue crack formation, a framework is needed that accounts for the extreme value life…many different damage processes (e.g. fatigue, creep, fracture). In this work, multiple material volumes for both IN100 and Ti-6Al-4V are simulated via…polycrystalline P/M Ni-base superalloy IN100. Typically, fatigue damage formation in polycrystalline superalloys has been linked to the existence of…
Impact of possible climate changes on river runoff under different natural conditions
NASA Astrophysics Data System (ADS)
Gusev, Yeugeniy M.; Nasonova, Olga N.; Kovalev, Evgeny E.; Ayzel, Georgy V.
2018-06-01
The present study was carried out within the framework of the International Inter-Sectoral Impact Model Intercomparison Project (ISI-MIP) for 11 large river basins located in different continents of the globe under a wide variety of natural conditions. The aim of the study was to investigate possible changes in various characteristics of annual river runoff (mean values, standard deviations, frequency of extreme annual runoff) up to 2100 on the basis of the land surface model SWAP and meteorological projections simulated by five General Circulation Models (GCMs) according to four RCP scenarios. Analysis of the obtained results has shown that changes in climatic runoff differ (both in magnitude and sign) for river basins located in different regions of the planet due to differences in natural (primarily climatic) conditions. The climatic elasticities of river runoff to changes in air temperature and precipitation were estimated, which makes it possible, as a first approximation, to project changes in climatic values of annual runoff using the projected changes in mean annual air temperature and annual precipitation for the river basins. It was found that for most rivers under study, the frequency of occurrence of extreme runoff values increases. This is true both for extremely high runoff (when the projected climatic runoff increases) and for extremely low values (when the projected climatic runoff decreases).
Natural Hazards characterisation in industrial practice
NASA Astrophysics Data System (ADS)
Bernardara, Pietro
2017-04-01
The definition of rare hydroclimatic extremes (with annual probabilities of occurrence as low as 10^-4) is of the utmost importance for the design of high-value industrial infrastructure, such as grids, power plants and offshore platforms. Underestimation as well as overestimation of the risk may lead to huge costs (e.g. expensive mid-life works, or overdesign) which may even prevent a project from happening. Nevertheless, the uncertainties associated with extrapolation towards rare frequencies are huge and manifold. They are mainly due to the scarcity of observations, the lack of quality of extreme value records, and the arbitrary choice of the models used for extrapolation. This often puts design engineers in uncomfortable situations when they must choose the design values to use. Providentially, recent progress in earth observation techniques, information technology, historical data collection, and weather and ocean modelling is making huge datasets available. Careful use of big datasets of observations and modelled data is leading towards a better understanding of the physics of the underlying phenomena and the complex interactions between them, and thus of extreme event frequency extrapolations. This will move engineering practice from single-site, small-sample application of statistical analysis to spatially coherent, physically driven extrapolation of extreme values. A few examples from EDF industrial practice are given to illustrate this progress and its potential impact on design approaches.
Upper Extremity Artificial Limb Control as an Issue Related to Movement and Mobility in Daily Living
ERIC Educational Resources Information Center
Wallace, Steve; Anderson, David I.; Trujillo, Michael; Weeks, Douglas L.
2005-01-01
The 1992 NIH Research Planning Conference on Prosthetic and Orthotic Research for the 21st Century (Childress, 1992) recognized that the field of prosthetics lacks theoretical understanding and empirical studies on learning to control an upper-extremity prosthesis. We have addressed this problem using a novel approach in which persons without…
Greedy algorithms in disordered systems
NASA Astrophysics Data System (ADS)
Duxbury, P. M.; Dobrin, R.
1999-08-01
We discuss search, minimal path and minimal spanning tree algorithms and their applications to disordered systems. Greedy algorithms solve these problems exactly, and are related to extremal dynamics in physics. Minimal cost path (Dijkstra) and minimal cost spanning tree (Prim) algorithms provide extremal dynamics for a polymer in a random medium (the KPZ universality class) and invasion percolation (without trapping) respectively.
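As a concrete illustration of the greedy algorithms mentioned above, here is a minimal, self-contained Python sketch of Dijkstra's minimal-cost-path and Prim's minimal-spanning-tree algorithms on a toy weighted graph. The graph and function names are illustrative, not taken from the paper.

```python
import heapq

def dijkstra(graph, source):
    """Minimal-cost path lengths from `source` (Dijkstra's greedy algorithm)."""
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, w in graph[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

def prim(graph, source):
    """Total weight of the minimal spanning tree grown greedily from `source` (Prim)."""
    visited = {source}
    heap = [(w, v) for v, w in graph[source]]
    heapq.heapify(heap)
    total = 0.0
    while heap and len(visited) < len(graph):
        w, v = heapq.heappop(heap)
        if v in visited:
            continue
        visited.add(v)
        total += w
        for x, wx in graph[v]:
            if x not in visited:
                heapq.heappush(heap, (wx, x))
    return total

# toy undirected weighted graph as adjacency lists
g = {
    "a": [("b", 1.0), ("c", 4.0)],
    "b": [("a", 1.0), ("c", 2.0), ("d", 5.0)],
    "c": [("a", 4.0), ("b", 2.0), ("d", 1.0)],
    "d": [("b", 5.0), ("c", 1.0)],
}
print(dijkstra(g, "a"))   # shortest-path costs from "a"
print(prim(g, "a"))       # MST weight
```

Both algorithms make locally optimal (greedy) choices via a priority queue, which is exactly the "extremal dynamics" connection the abstract draws.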
Interception in three dimensions - An energy formulation
NASA Technical Reports Server (NTRS)
Rajan, N.; Ardema, M. D.
1983-01-01
The problem of minimum-time interception of a target flying in three dimensional space is analyzed with the interceptor aircraft modeled through energy-state approximation. A coordinate transformation that uncouples the interceptor's extremals from the target motion in an open-loop sense is introduced, and the necessary conditions for optimality and the optimal controls are derived. Example extremals are shown.
NASA Astrophysics Data System (ADS)
Masud, M. B.; Khaliq, M. N.; Wheater, H. S.
2017-09-01
The effects of climate change on April-October short- and long-duration precipitation extremes over the Canadian Prairie Provinces were evaluated using a multi-Regional Climate Model (RCM) ensemble available through the North American Regional Climate Change Assessment Program. Simulations considered include those performed with six RCMs driven by the National Centre for Environmental Prediction (NCEP) reanalysis II product for the 1981-2000 period and those driven by four Atmosphere-Ocean General Circulation Models (AOGCMs) for the current 1971-2000 and future 2041-2070 periods (i.e. a total of 11 current-to-future period simulation pairs). A regional frequency analysis approach was used to develop 2-, 5-, 10-, 25-, and 50-year return values of precipitation extremes from NCEP and AOGCM-driven current and future period simulations that respectively were used to study the performance of RCMs and projected changes for selected return values at regional, grid-cell and local scales. Performance errors due to internal dynamics and physics of RCMs studied for the 1981-2000 period reveal considerable variation in the performance of the RCMs. However, the performance errors were found to be much smaller for RCM ensemble averages than for individual RCMs. Projected changes in future climate to selected regional return values of short-duration (e.g. 15- and 30-min) precipitation extremes and for longer return periods (e.g. 50-year) were found to be mostly larger than those to the longer duration (e.g. 24- and 48-h) extremes and short return periods (e.g. 2-year). Overall, projected changes in precipitation extremes were larger for southeastern regions followed by southern and northern regions and smaller for southwestern and western regions of the study area. The changes to return values were also found to be statistically significant for the majority of the RCM-AOGCM simulation pairs. 
These projections might be useful as a key input for the future planning of urban drainage infrastructure and development of strategic climate change adaptation measures.
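The return values discussed above come from fitting an extreme value distribution to block (e.g. annual or seasonal) maxima. As a generic sketch only (not the regional frequency analysis used in the study), the following Python code fits a Gumbel distribution by the method of moments to synthetic annual maxima and evaluates T-year return levels; all data and names here are illustrative.

```python
import math, random

def gumbel_fit_moments(sample):
    """Method-of-moments fit of a Gumbel distribution to block maxima:
    beta = s * sqrt(6) / pi,  mu = mean - Euler_gamma * beta."""
    n = len(sample)
    mean = sum(sample) / n
    var = sum((x - mean) ** 2 for x in sample) / (n - 1)
    beta = math.sqrt(6.0 * var) / math.pi
    mu = mean - 0.5772156649 * beta
    return mu, beta

def return_level(mu, beta, T):
    """T-year return value: the level exceeded with probability 1/T per year."""
    return mu - beta * math.log(-math.log(1.0 - 1.0 / T))

random.seed(42)
# synthetic "annual maxima" drawn from a Gumbel(mu=30, beta=8)
maxima = [30.0 - 8.0 * math.log(-math.log(random.random())) for _ in range(100)]
mu, beta = gumbel_fit_moments(maxima)
for T in (2, 5, 10, 25, 50):
    print(T, round(return_level(mu, beta, T), 1))
```

The fitted parameters land near the true (mu=30, beta=8), and the return level grows with the return period, as expected of any extreme value fit.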
NASA Astrophysics Data System (ADS)
Korytárová, J.; Vaňková, L.
2017-10-01
This paper builds on the authors' previous research into evaluating the economic efficiency of transport infrastructure projects by means of the economic efficiency ratios NPV, IRR and BCR. The values of these indicators, and the subsequent outputs of the sensitivity analysis, show extremely favourable values in some cases. The authors analysed these indicators down to the level of the input variables and examined which inputs contribute most to these extreme values. The NCF used to calculate the above ratios is formed by the benefits that arise as the difference between the zero and investment options of the project (savings in travel and operating costs, savings in travel time costs, reduction in accident costs and savings in exogenous costs) as well as total agency costs. Savings in travel time costs, which contribute more than 70% of the overall utility of the projects, appear to be the most important benefit over the long-term horizon, which is why this benefit is emphasized. The outcome of the article shows how the particular basic variables contribute to the overall robustness of the economic efficiency of these projects.
NASA Technical Reports Server (NTRS)
Ashouri, Hamed; Sorooshian, Soroosh; Hsu, Kuo-Lin; Bosilovich, Michael G.; Lee, Jaechoul; Wehner, Michael F.; Collow, Allison
2016-01-01
This study evaluates the performance of NASA's Modern-Era Retrospective Analysis for Research and Applications (MERRA) precipitation product in reproducing the trend and distribution of extreme precipitation events. Utilizing extreme value theory, time-invariant and time-variant extreme value distributions are developed to model the trends and changes in the patterns of extreme precipitation events over the contiguous United States during 1979-2010. The Climate Prediction Center (CPC) U.S. Unified gridded observation data are used as the observational dataset. The CPC analysis shows that the eastern and western parts of the United States are experiencing positive and negative trends in annual maxima, respectively. The continental-scale patterns of change found in MERRA seem to reasonably mirror the observed patterns of change found in CPC. This was not necessarily expected, given the difficulty of constraining precipitation in reanalysis products. MERRA tends to overestimate the frequency at which the 99th percentile of precipitation is exceeded because this threshold tends to be lower in MERRA, making it easier to exceed. This feature is dominant during the summer months. MERRA tends to reproduce the spatial patterns of the scale and location parameters of the generalized extreme value and generalized Pareto distributions. However, MERRA underestimates these parameters, particularly over the Gulf Coast states, leading to lower magnitudes of extreme precipitation events. Two issues in MERRA are identified: 1) MERRA shows a spurious negative trend in Nebraska and Kansas, most likely related to changes in the satellite observing system over time that have apparently affected the water cycle in the central United States, and 2) the patterns of positive trend over the Gulf Coast states and along the East Coast seem to be correlated with tropical cyclones in these regions. 
The analysis of the trends in the seasonal precipitation extremes indicates that the hurricane and winter seasons are contributing the most to these trend patterns in the southeastern United States. In addition, the increasing annual trend simulated by MERRA in the Gulf Coast region is due to an incorrect trend in winter precipitation extremes.
Are behaviour problems in extremely low-birthweight children related to their motor ability?
Danks, Marcella; Cherry, Kate; Burns, Yvonne R; Gray, Peter H
2017-04-01
To investigate whether behaviour problems are independently related to mild motor impairment in 11-13-year-old children born preterm with extremely low birthweight (ELBW). The cross-sectional study included 48 (27 males) non-disabled, otherwise healthy ELBW children (<1000 g) and 55 (28 males) term-born peers. Parents reported behaviour using the Child Behaviour Checklist (CBCL). Children completed the Movement Assessment Battery for Children (Movement ABC). Extremely low birthweight children had poorer behaviour scores (CBCL Total Problem T score: mean difference = 5.89, 95% confidence interval = 10.29, 1.49, p = 0.009) and Movement ABC Total Motor Impairment Scores (ELBW group median = 17.5, IQR = 12.3; term-born group median = 7.5, IQR = 9, p < 0.01) than term-born peers. Behaviour was related to motor score (regression coefficient 2.16; 95% confidence interval 0.34, 3.97, p = 0.02) independent of gender, socio-economic factors or birthweight. Motor score had the strongest association with attention (ρ = 0.51; p < 0.01) and social behaviours (ρ = 0.50; p < 0.01). Behaviour problems of otherwise healthy 11- to 13-year-old ELBW children are not related to prematurity independent of their motor difficulties. Supporting improved motor competence in ELBW preteen children may support improved behaviour, particularly attention and social behaviours. ©2016 Foundation Acta Paediatrica. Published by John Wiley & Sons Ltd.
Reaction-diffusion on the fully-connected lattice: A+A\\rightarrow A
NASA Astrophysics Data System (ADS)
Turban, Loïc; Fortin, Jean-Yves
2018-04-01
Diffusion-coagulation can be simply described by a dynamic where particles perform a random walk on a lattice and coalesce with probability unity when meeting on the same site. Such processes display non-equilibrium properties with strong fluctuations in low dimensions. In this work we study this problem on the fully-connected lattice, an infinite-dimensional system in the thermodynamic limit, for which mean-field behaviour is expected. Exact expressions for the particle density distribution at a given time and survival time distribution for a given number of particles are obtained. In particular, we show that the time needed to reach a finite number of surviving particles (vanishing density in the scaling limit) displays strong fluctuations and extreme value statistics, characterized by a universal class of non-Gaussian distributions with singular behaviour.
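A minimal Monte Carlo sketch can make the A+A→A dynamics on the fully connected lattice (complete graph) concrete: each move sends a random particle to a uniformly chosen site, and landing on an occupied site coalesces the pair. This is an illustrative simulation under those assumptions, not the authors' exact analysis.

```python
import random

def coalescence_times(n_sites, seed=0):
    """A+A -> A on the complete graph. Returns, for each surviving particle
    count s < n_sites, the number of moves needed to first reach s particles."""
    rng = random.Random(seed)
    occupied = set(range(n_sites))   # start with every site occupied
    t = 0
    times = {}
    while len(occupied) > 1:
        p = rng.choice(tuple(occupied))
        target = rng.randrange(n_sites)   # fully connected: any site reachable
        t += 1
        if target in occupied and target != p:
            occupied.remove(p)            # the two particles coalesce
            times[len(occupied)] = t
        else:
            occupied.discard(p)
            occupied.add(target)
    return times

times = coalescence_times(200, seed=1)
# mean-field picture: the density decays smoothly, but the final stage
# (reaching a finite number of survivors) shows strong run-to-run fluctuations
print(times[100], times[10], times[1])
```

Repeating the run with different seeds shows that the early-time counts are nearly deterministic while the time to reach the last few particles fluctuates strongly, the regime where the abstract's extreme value statistics appear.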
Study of mesoscale phenomena, winter monsoon clouds and snow area based on LANDSAT data
NASA Technical Reports Server (NTRS)
Tsuchiya, K. (Principal Investigator)
1976-01-01
The author has identified the following significant results. Most longitudinal clouds which appear as continuous linear clouds are composed of small transversal clouds. There are mountain waves of different wavelengths in a comparatively narrow area, indicating complicated orographic effects on wind and temperature distribution or on both dynamic and static stability conditions. There is a particular shape of cirrus cloud suggestive of turbulence in the vicinity of CAT in the upper troposphere near jet stream level and on its cold air side. Thin cirrus under overcast conditions can be distinguished by MSS; however, extremely thin cirrus under partly cloudy conditions cannot be detected even in LANDSAT data. This presents a serious problem in the interpretation of satellite thermal infrared radiation data, since such undetected clouds affect the measured values.
Optimal thrust level for orbit insertion
NASA Astrophysics Data System (ADS)
Cerf, Max
2017-07-01
The minimum-fuel orbital transfer is analyzed in the case of a launcher upper stage using a constantly thrusting engine. The thrust level is assumed to be constant and its value is optimized together with the thrust direction. A closed-loop solution for the thrust direction is derived from the extremal analysis for a planar orbital transfer. The optimal control problem reduces to two unknowns, namely the thrust level and the final time. Guessing and propagating the costates is no longer necessary and the optimal trajectory is easily found from a rough initialization. On the other hand the initial costates are assessed analytically from the initial conditions and they can be used as initial guess for transfers at different thrust levels. The method is exemplified on a launcher upper stage targeting a geostationary transfer orbit.
Pediatric Major Head Injury: Not a Minor Problem.
Leetch, Aaron N; Wilson, Bryan
2018-05-01
Traumatic brain injury is a highly prevalent and devastating cause of morbidity and mortality in children. A rapid, stepwise approach to the traumatized child should proceed, addressing life-threatening problems first. Management focuses on preventing secondary injury from physiologic extremes such as hypoxemia, hypotension, prolonged hyperventilation, temperature extremes, and rapid changes in cerebral blood flow. Initial Glasgow Coma Score, hyperglycemia, and imaging are often prognostic of outcome. Surgically amenable lesions should be evacuated promptly. Reduction of intracranial pressure through hyperosmolar therapy, decompressive craniotomy, and seizure prophylaxis may be considered after stabilization. Nonaccidental trauma should be considered when evaluating pediatric trauma patients. Copyright © 2017 Elsevier Inc. All rights reserved.
Minimum Propellant Low-Thrust Maneuvers near the Libration Points
NASA Astrophysics Data System (ADS)
Marinescu, A.; Dumitrache, M.
The impulse technique can certainly bring a vehicle onto orbits around the libration points or close to them. The question that arises is: by what means can the vehicle then reach the libration points themselves? A first investigation carried out in this paper gives an answer: the use of low-thrust propulsion, which, in addition, can transfer the vehicle from a libration point into nearby orbits around it. For the applications we consider transfers to orbits about the equidistant point L4 and the collinear point L2 of the Earth-moon system; this transfer manoeuvre can be used to insert a satellite into a libration-point orbit. In the Earth-moon system the points L4 and L5 are of potential interest for the establishment of transponder satellites for interplanetary tracking, because a vehicle at one of the equidistant points is quite stable and, if perturbed, remains in its vicinity. In contrast, a vehicle at one of the collinear points is quite unstable: it will oscillate along the Earth-moon axis with increasing amplitude and gradually escape from the libration point. Let us assume that a space vehicle equipped with low-thrust propulsion is near a libration point L. We consider planar motion in the restricted three-body frame in the rotating system centered at L, with the Earth-moon distance D = 1. The unit of time T is the period of the moon's orbit divided by 2π and multiplied by the square root of one plus the moon/Earth mass ratio, and the unit of mass is the Earth's mass. With these conventions, the equations of motion of the low-thrust vehicle in the linear approximation near the libration point have been established. Since the motion parameters at the beginning and end of the manoeuvre are known, the variational problem is formulated as a Lagrange-type problem with fixed end points. 
By establishing and integrating the differential equations of the extremals, we obtain the extremals that characterize the minimum-propellant optimal transfer manoeuvres from the libration points to orbits around them. By means of the Legendre condition for a weak minimum and the Weierstrass condition for a strong minimum, it is demonstrated that the variational problem so formulated is well posed and is indeed a minimum problem. Since the system of extremal differential equations does not admit easily obtained analytical solutions, we have resorted to numerical integration. The problem is a two-point boundary-value problem: the motion parameters are prescribed at the beginning and end of the manoeuvre (whose duration coincides with the burn duration), while the values of the Lagrange multipliers are unspecified at both ends. To determine the velocities at any point of the orbits about L4 and L2, a program was developed that integrates the equations of motion without thrust acceleration over one revolution period, requiring the coordinates and velocities to repeat; with this, the velocities at the apoapses A and A' were calculated. With these specifications, the final conditions (at the end of the manoeuvre) could be established and the optimal transfer parameters at the specified points determined. The calculations for the transfers from L4 and L2 to orbits around them show that the orbital velocities are in general small, with velocities on L2 orbits greater than those on L4 orbits of the same semimajor axis. This is explained by the fact that the period of revolution on orbits about L4 is greater than on orbits about L2. For the transfer to the apoapsis of both orbits (the points A and A'), it can be remarked that the thrust accelerations are greater for orbits around L2 than for orbits of the same semimajor axis around L4 (manoeuvre duration = 10^6 s = 11.574 days for L4 and 10^5 s = 1.157 days for L2). Considering orbits around L4 and L2 with semimajor axes between 150 and 15000 km, the components of the thrust acceleration take values between 10^-2 and 10^-5 m/s^2, which lies within the performance range of low-thrust propulsion installations (the D and T units have been converted to m and s).
Zhang, Mi; Wen, Xue Fa; Zhang, Lei Ming; Wang, Hui Min; Guo, Yi Wen; Yu, Gui Rui
2018-02-01
Extreme high temperature is one of the important extreme weather events that impact the forest ecosystem carbon cycle. In this study, applying CO2 flux and routine meteorological data measured during 2003-2012, we examined the impacts of extreme high temperature and extreme high temperature events on the net carbon uptake of a subtropical coniferous plantation in Qianyanzhou. Combining with wavelet analysis, we analyzed the environmental controls on net carbon uptake at different temporal scales when extreme high temperature and extreme high temperature events occurred. The results showed that mean daily cumulative NEE decreased by 51% in days with daily maximum air temperature between 35 ℃ and 40 ℃, compared with days in the range 30 ℃ to 34 ℃. The effects of extreme high temperature and extreme high temperature events on monthly and annual NEE were related to the strength and duration of the event. In 2003, when a strong extreme high temperature event occurred, the sum of monthly cumulative NEE in July and August was only -11.64 g C·m⁻²·(2 month)⁻¹, a decrease of 90% compared with the multi-year average value; at the same time, the relative variation of annual NEE reached -6.7%. In July and August, when extreme high temperature and extreme high temperature events occurred, air temperature (Ta) and vapor pressure deficit (VPD) were the dominant controllers of the daily variation of NEE. The coherency between NEE and Ta and between NEE and VPD was 0.97 and 0.95, respectively. At 8-, 16-, and 32-day periods, Ta, VPD, soil water content at 5 cm depth (SWC), and precipitation (P) controlled NEE. The coherency between NEE and SWC and between NEE and P was higher than 0.8 at the monthly scale. 
The results indicated that atmospheric water deficit impacted NEE at short temporal scales when extreme high temperature and extreme high temperature events occurred, while both atmospheric water deficit and soil drought stress impacted NEE at long temporal scales in this ecosystem.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Jiali; Han, Yuefeng; Stein, Michael L.
2016-02-10
The Weather Research and Forecast (WRF) model downscaling skill in extreme maximum daily temperature is evaluated by using the generalized extreme value (GEV) distribution. While the GEV distribution has been used extensively in climatology and meteorology for estimating probabilities of extreme events, accurately estimating GEV parameters based on data from a single pixel can be difficult, even with fairly long data records. This work proposes a simple method assuming that the shape parameter, the most difficult of the three parameters to estimate, does not vary over a relatively large region. This approach is applied to evaluate 31-year WRF-downscaled extreme maximum temperature through comparison with North American Regional Reanalysis (NARR) data. Uncertainty in GEV parameter estimates and the statistical significance of the differences in estimates between WRF and NARR are accounted for by conducting bootstrap resampling. Despite certain biases over parts of the United States, overall, WRF shows good agreement with NARR in the spatial pattern and magnitudes of GEV parameter estimates. Both WRF and NARR show a significant increase in extreme maximum temperature over the southern Great Plains and southeastern United States in January and over the western United States in July. The GEV model shows clear benefits from the regionally constant shape parameter assumption, for example, leading to estimates of the location and scale parameters of the model that show coherent spatial patterns.
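Bootstrap resampling of the kind used above to quantify uncertainty in parameter estimates can be sketched generically. The percentile-interval helper below is an illustrative stand-in with hypothetical names and synthetic data, not the study's actual procedure or datasets.

```python
import random, statistics

def bootstrap_ci(sample, stat, n_boot=2000, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval for an arbitrary statistic `stat`:
    resample the data with replacement, recompute the statistic, take quantiles."""
    rng = random.Random(seed)
    n = len(sample)
    reps = sorted(stat([sample[rng.randrange(n)] for _ in range(n)])
                  for _ in range(n_boot))
    lo = reps[int(alpha / 2 * n_boot)]
    hi = reps[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

random.seed(3)
# synthetic block maxima (a stand-in for per-pixel annual temperature maxima)
data = [max(random.gauss(25, 5) for _ in range(365)) for _ in range(30)]
lo, hi = bootstrap_ci(data, statistics.mean)
print(round(lo, 2), round(hi, 2))
# a difference between two datasets would be called "significant" in this
# framework if the bootstrap interval of the difference excludes zero
```

The same helper works unchanged for a GEV location or scale estimator in place of `statistics.mean`, which is what makes bootstrap attractive for extreme value work.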
Extreme storm surge and wind wave climate scenario simulations at the Venetian littoral
NASA Astrophysics Data System (ADS)
Lionello, P.; Galati, M. B.; Elvini, E.
Scenario climate projections for extreme marine storms producing storm surges and wind waves are very important for the northern flat coast of the Adriatic Sea, where the area at risk includes a unique cultural and environmental heritage and important economic activities. This study uses a shallow water model and a spectral wave model to compute the storm surge and the wind wave field, respectively, from the sea level pressure and wind fields computed by the RegCM regional climate model. Simulations cover the period 1961-1990 for the present climate (control simulations) and the period 2071-2100 for the A2 and B2 scenarios. Generalized Extreme Value analysis is used to estimate values for the 10- and 100-year return times. These modelling tools are shown to be adequate for a reliable estimation of the climate change signal without further downscaling. However, this study has mainly a methodological value, because issues such as interdecadal variability and intermodel variability cannot be addressed, since the analysis is based on single-model 30-year-long simulations. The control simulation looks reasonably accurate for extreme value analysis, though it overestimates/underestimates the frequency of high/low surge and wind wave events with respect to observations. Scenario simulations suggest a higher frequency of intense storms for the B2 scenario, but not for the A2. Likely, these differences are not the effect of climate change but of multidecadal climate variability. Extreme storms are stronger in the future scenarios, but the differences are not statistically significant. Therefore this study does not provide convincing evidence for more stormy conditions in future scenarios.
Extreme fluctuations in stochastic network coordination with time delays
NASA Astrophysics Data System (ADS)
Hunt, D.; Molnár, F.; Szymanski, B. K.; Korniss, G.
2015-12-01
We study the effects of uniform time delays on the extreme fluctuations in stochastic synchronization and coordination problems with linear couplings in complex networks. We obtain the average size of the fluctuations at the nodes from the behavior of the underlying modes of the network. We then obtain the scaling behavior of the extreme fluctuations with system size, as well as the distribution of the extremes on complex networks, and compare them to those on regular one-dimensional lattices. For large complex networks, when the delay is not too close to the critical one, fluctuations at the nodes effectively decouple, and the limit distributions converge to the Fisher-Tippett-Gumbel density. In contrast, fluctuations in low-dimensional spatial graphs are strongly correlated, and the limit distribution of the extremes is the Airy density. Finally, we also explore the effects of nonlinear couplings on the stability and on the extremes of the synchronization landscapes.
[Quantitative Evaluation of Metal Artifacts on CT Images on the Basis of Statistics of Extremes].
Kitaguchi, Shigetoshi; Imai, Kuniharu; Ueda, Suguru; Hashimoto, Naomi; Hattori, Shouta; Saika, Takahiro; Ono, Yoshifumi
2016-05-01
It is well-known that metal artifacts have a harmful effect on the image quality of computed tomography (CT) images. However, their physical properties remain poorly understood. In this study, we investigated the relationship between metal artifacts and tube current using statistics of extremes. A commercially available CT dose index phantom, 160 mm in diameter, was prepared, and a brass rod 13 mm in diameter was placed at the centerline of the phantom. This phantom was used as a target object for evaluating metal artifacts and was scanned using an area detector CT scanner with various tube currents under a constant tube voltage of 120 kV. Sixty parallel line segments with a length of 100 pixels were placed to cross the metal artifacts on CT images, and the largest difference between two adjacent CT values in each of the 60 CT value profiles was employed as a feature variable for measuring metal artifacts; these feature variables were analyzed on the basis of extreme value theory. The CT value variation induced by metal artifacts was statistically characterized by the Gumbel distribution, one of the extreme value distributions; that is, metal artifacts have the same statistical characteristics as streak artifacts. Therefore, the Gumbel evaluation method makes it possible to analyze not only streak artifacts but also metal artifacts. Furthermore, the location parameter of the Gumbel distribution was shown to be inversely proportional to the square root of the tube current. This result suggests that metal artifacts have the same dose dependence as image noise.
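The feature variable described above, the largest difference between two adjacent CT values along a line profile, is straightforward to compute. The sketch below applies it to synthetic noise-only profiles, since no real CT data are available here; names and numbers are illustrative.

```python
import random

def max_adjacent_diff(profile):
    """Feature variable from the study: the largest absolute difference
    between two adjacent values along a line profile."""
    return max(abs(b - a) for a, b in zip(profile, profile[1:]))

rng = random.Random(7)
# 60 synthetic line profiles of 100 "CT values" each (Gaussian noise stand-in)
features = [max_adjacent_diff([rng.gauss(0, 10) for _ in range(100)])
            for _ in range(60)]
# per-profile maxima like these are the sample the paper then fits with a
# Gumbel (extreme value) distribution
print(min(features), max(features))
```

Because each feature is a maximum over many roughly independent differences, extreme value theory predicts the Gumbel form that the study reports.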
Unified Lambert Tool for Massively Parallel Applications in Space Situational Awareness
NASA Astrophysics Data System (ADS)
Woollands, Robyn M.; Read, Julie; Hernandez, Kevin; Probe, Austin; Junkins, John L.
2018-03-01
This paper introduces a parallel-compiled tool that combines several of our recently developed methods for solving the perturbed Lambert problem using modified Chebyshev-Picard iteration. This tool (unified Lambert tool) consists of four individual algorithms, each of which is unique and best suited for solving a particular type of orbit transfer. The first is a Keplerian Lambert solver, which is used to provide a good initial guess (warm start) for solving the perturbed problem. It is also used to determine the appropriate algorithm to call for solving the perturbed problem. The arc length or true anomaly angle spanned by the transfer trajectory is the parameter that governs the automated selection of the appropriate perturbed algorithm, and is based on the respective algorithm convergence characteristics. The second algorithm solves the perturbed Lambert problem using the modified Chebyshev-Picard iteration two-point boundary value solver. This algorithm does not require a Newton-like shooting method and is the most efficient of the perturbed solvers presented herein; however, the domain of convergence is limited to about a third of an orbit and depends on eccentricity. The third algorithm extends the domain of convergence of the modified Chebyshev-Picard iteration two-point boundary value solver to about 90% of an orbit, through regularization with the Kustaanheimo-Stiefel transformation. This is the second most efficient of the perturbed set of algorithms. The fourth algorithm uses the method of particular solutions and the modified Chebyshev-Picard iteration initial value solver to solve multiple-revolution perturbed transfers. This method does require "shooting" but differs from Newton-like shooting methods in that it does not require propagation of a state transition matrix.
The unified Lambert tool makes use of the General Mission Analysis Tool, and we use it to compute thousands of perturbed Lambert trajectories in parallel on the Space Situational Awareness computer cluster at the LASR Lab, Texas A&M University. We demonstrate the power of our tool by solving a highly parallel example problem: the generation of extremal field maps for optimal spacecraft rendezvous (and eventual orbit debris removal). In addition, we demonstrate the need for including perturbative effects in simulations for satellite tracking or data association. The unified Lambert tool is ideal for, but not limited to, space situational awareness applications.
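The modified Chebyshev-Picard iteration used by these solvers augments classical Picard iteration with a Chebyshev representation of the trajectory. A minimal sketch of the plain (unmodified) idea on a scalar toy initial value problem, using NumPy's Chebyshev utilities rather than the authors' solver, is:

```python
import numpy as np
from numpy.polynomial import chebyshev as C

# Toy problem: x'(t) = -x(t), x(-1) = 1 on [-1, 1], exact solution exp(-(t+1)).
# The paper's solvers handle perturbed two-point boundary value problems;
# this sketch only shows the Picard-plus-Chebyshev mechanism.
N = 32
t = np.cos(np.pi * np.arange(N + 1) / N)   # Chebyshev-Lobatto nodes on [-1, 1]

x = np.ones_like(t)                        # initial guess: the initial condition
for _ in range(40):
    c = C.chebfit(t, -x, N)                # Chebyshev coefficients of f(t, x) = -x
    ci = C.chebint(c, lbnd=-1.0)           # antiderivative vanishing at t = -1
    x = 1.0 + C.chebval(t, ci)             # Picard update: x(-1) + integral of f

err = np.max(np.abs(x - np.exp(-(t + 1.0))))
print(f"max error vs exact solution: {err:.2e}")
```

The fixed-point iteration converges without any Jacobian or state-transition-matrix propagation, which is the property the abstract highlights.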
Governmental and Nongovernmental Youth Welfare in the New German Länder.
ERIC Educational Resources Information Center
Gawlik, Marion; And Others
1994-01-01
Survey of the general conditions of youth welfare departments in eastern Germany revealed severe money shortages. Increasing demands on youth welfare, rising social problems, right-wing extremism, and widespread unemployment among youths cause long-term social problems and prohibit effective youth welfare. (RJM)
NASA Technical Reports Server (NTRS)
Parnell, Gregory S.; Rowell, William F.; Valusek, John R.
1987-01-01
In recent years there has been increasing interest in applying the computer-based problem-solving techniques of Artificial Intelligence (AI), Operations Research (OR), and Decision Support Systems (DSS) to analyze extremely complex problems. A conceptual framework is developed for successfully integrating these three techniques. First, the fields of AI, OR, and DSS are defined and the relationships among the three fields are explored. Next, a comprehensive adaptive design methodology for AI and OR modeling within the context of a DSS is described. These observations are made: (1) the solution of extremely complex knowledge problems with ill-defined, changing requirements can benefit greatly from the use of the adaptive design process; (2) the field of DSS provides the focus on the decision-making process essential for tailoring solutions to these complex problems; (3) the characteristics of AI, OR, and DSS tools appear to be converging rapidly; and (4) there is a growing need for an interdisciplinary AI/OR/DSS education.
Extreme-Scale Bayesian Inference for Uncertainty Quantification of Complex Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Biros, George
Uncertainty quantification (UQ)—that is, quantifying uncertainties in complex mathematical models and their large-scale computational implementations—is widely viewed as one of the outstanding challenges facing the field of CS&E over the coming decade. The EUREKA project set out to address the most difficult class of UQ problems: those for which both the underlying PDE model and the uncertain parameters are of extreme scale. In the project we worked on these extreme-scale challenges in the following four areas: 1. Scalable parallel algorithms for sampling and characterizing the posterior distribution that exploit the structure of the underlying PDEs and parameter-to-observable map. These include structure-exploiting versions of the randomized maximum likelihood method, which aims to overcome the intractability of employing conventional MCMC methods for solving extreme-scale Bayesian inversion problems by appealing to and adapting ideas from large-scale PDE-constrained optimization, which have been very successful at exploring high-dimensional spaces. 2. Scalable parallel algorithms for construction of prior and likelihood functions based on learning methods and non-parametric density estimation. Constructing problem-specific priors remains a critical challenge in Bayesian inference, and more so in high dimensions. Another challenge is construction of likelihood functions that capture unmodeled couplings between observations and parameters. We created parallel algorithms for non-parametric density estimation using high-dimensional N-body methods and combined them with supervised learning techniques for the construction of priors and likelihood functions. 3. Bayesian inadequacy models, which augment physics models with stochastic models that represent their imperfections. The success of the Bayesian inference framework depends on the ability to represent the uncertainty due to imperfections of the mathematical model of the phenomena of interest.
This is a central challenge in UQ, especially for large-scale models. We developed the mathematical tools to address these challenges in the context of extreme-scale problems. 4. Parallel scalable algorithms for Bayesian optimal experimental design (OED). Bayesian inversion yields quantified uncertainties in the model parameters, which can be propagated forward through the model to yield uncertainty in outputs of interest. This opens the way for designing new experiments to reduce the uncertainties in the model parameters and model predictions. Such experimental design problems have been intractable for large-scale problems using conventional methods; we created OED algorithms that exploit the structure of the PDE model and the parameter-to-output map to overcome these challenges. Parallel algorithms for these four problems were created, analyzed, prototyped, implemented, tuned, and scaled up for leading-edge supercomputers, including UT-Austin's own 10-petaflops Stampede system, ANL's Mira system, and ORNL's Titan system. While our focus was on fundamental mathematical/computational methods and algorithms, we assessed our methods on model problems derived from several DOE mission applications, including multiscale mechanics and ice sheet dynamics.
Assessing Regional Scale Variability in Extreme Value Statistics Under Altered Climate Scenarios
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brunsell, Nathaniel; Mechem, David; Ma, Chunsheng
Recent studies have suggested that low-frequency modes of climate variability can significantly influence regional climate. The climatology associated with extreme events has been shown to be particularly sensitive. This has profound implications for droughts, heat waves, and food production. We propose to examine regional climate simulations conducted over the continental United States by applying a recently developed technique which combines wavelet multi-resolution analysis with information theory metrics. This research is motivated by two fundamental questions concerning the spatial and temporal structure of extreme events: 1) what temporal scales of the extreme value distributions are most sensitive to alteration by low-frequency climate forcings, and 2) what is the nature of the spatial structure of variation in these timescales? The primary objective is to assess to what extent information theory metrics can be useful in characterizing the nature of extreme weather phenomena. Specifically, we hypothesize that (1) changes in the nature of extreme events will impact the temporal probability density functions and that information theory metrics will be sensitive to these changes, and (2) via a wavelet multi-resolution analysis, we will be able to characterize the relative contribution of different timescales to the stochastic nature of extreme events. In order to address these hypotheses, we propose a unique combination of an established regional climate modeling approach and advanced statistical techniques to assess the effects of low-frequency modes on climate extremes over North America. The behavior of climate extremes in RCM simulations for the 20th century will be compared with statistics calculated from the United States Historical Climatology Network (USHCN) and simulations from the North American Regional Climate Change Assessment Program (NARCCAP).
This effort will serve to establish the baseline behavior of climate extremes, the validity of an innovative multi-resolution information theory approach, and the ability of the RCM modeling framework to represent the low-frequency modulation of extreme climate events. Once the skill of the modeling and analysis methodology has been established, we will apply the same approach to the AR5 (IPCC Fifth Assessment Report) climate change scenarios in order to assess how climate extremes, and the influence of low-frequency variability on climate extremes, might vary under a changing climate. The research specifically addresses DOE focus area 2: simulation of climate extremes under a changing climate. Specific results will include (1) a better understanding of the spatial and temporal structure of extreme events, (2) a thorough quantification of how extreme values are impacted by low-frequency climate teleconnections, (3) increased knowledge of current regional climate models' ability to ascertain these influences, and (4) a detailed examination of how the distribution of extreme events is likely to change under different climate change scenarios. In addition, this research will assess the ability of the innovative wavelet information theory approach to characterize extreme events. All of these results will greatly enhance society's ability to understand and mitigate the regional ramifications of future global climate change.
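A toy version of the wavelet multi-resolution plus information-theory idea can be sketched with a Haar decomposition and a histogram-based Shannon entropy per scale; the signal, bin count, and decomposition depth are illustrative choices only:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=1024)                   # stand-in daily anomaly series

def shannon_entropy(values, bins=32):
    # Histogram-based Shannon entropy (bits) of a set of coefficients.
    counts, _ = np.histogram(values, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

# Haar multi-resolution analysis: detail coefficients at successive dyadic
# scales, with an entropy computed per scale.
entropies = []
for level in range(1, 6):
    detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)   # Haar detail coefficients
    entropies.append(shannon_entropy(detail))
    x = (x[0::2] + x[1::2]) / np.sqrt(2.0)        # approximation for next level
    print(f"scale {level}: entropy = {entropies[-1]:.2f} bits")
```

Comparing such per-scale entropies between two periods or model runs is one simple way to quantify which timescales of the distribution have changed.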
A delay differential model of ENSO variability: Extreme values and stability analysis
NASA Astrophysics Data System (ADS)
Zaliapin, I.; Ghil, M.
2009-04-01
We consider a delay differential equation (DDE) model for El Niño-Southern Oscillation (ENSO) variability [Ghil et al. (2008), Nonlin. Proc. Geophys., 15, 417-433]. The model combines two key mechanisms that participate in ENSO dynamics: delayed negative feedback and seasonal forcing. Toy models of this type were shown to capture major features of the ENSO phenomenon [Jin et al., Science (1994); Tziperman et al., Science (1994)]; they provide a convenient paradigm for explaining interannual ENSO variability and shed new light on its dynamical properties. So far, though, DDE model studies of ENSO have been limited to linear stability analysis of steady-state solutions, which are not typical in forced systems, case studies of particular trajectories, or one-dimensional scenarios of transition to chaos, varying a single parameter while the others are kept fixed. In this work we take several steps toward a comprehensive analysis of DDE models relevant for ENSO phenomenology and illustrate the complexity of phase-parameter space structure for even such a simple model of climate dynamics. We formulate an initial value problem for our model and prove an existence, uniqueness, and continuous-dependence theorem. We then use this theoretical result to perform detailed numerical stability analyses of the model in the three-dimensional space of its physically relevant parameters: strength of seasonal forcing b, atmosphere-ocean coupling κ, and propagation period τ of oceanic waves across the Tropical Pacific. Two regimes of variability, stable and unstable, are reported; they are separated by a sharp neutral curve in the (b, τ) plane at constant κ. The detailed structure of the neutral curve becomes very irregular and possibly fractal, while individual trajectories within the unstable region become highly complex and possibly chaotic, as the atmosphere-ocean coupling κ increases.
In the unstable regime, spontaneous transitions occur in the mean temperature (i.e., thermocline depth), period, and extreme annual values, for purely periodic, seasonal forcing. The model reproduces the Devil's bleachers characterizing other ENSO models, such as nonlinear, coupled systems of partial differential equations; some of the features of this behavior have been documented in general circulation models, as well as in observations. We analyze the values of annual extremes and their location within the annual cycle and report a phase-locking phenomenon, which is connected to the occurrence of El Niño events during the boreal (Northern Hemisphere) winter. We report the existence of multiple solutions and study their basins of attraction in the space of initial conditions. We also present a model-based justification for the observed quasi-biennial oscillation in Tropical Pacific SSTs. We expect similar behavior in much more detailed and realistic models, where it is harder to describe its causes as completely. The basic mechanisms used in our model (delayed feedback and forcing) may be relevant to other natural systems in which internal instabilities interact with external forcing and give rise to extreme events.
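A DDE of the type discussed above can be integrated with a simple Euler scheme and a history buffer. The sketch below uses the generic form dh/dt = -tanh(κ h(t-τ)) + b cos(2πt), with illustrative parameter values rather than the paper's model or settings:

```python
import numpy as np

# Illustrative parameters: seasonal forcing b, coupling kappa, delay tau (years).
b, kappa, tau = 1.0, 10.0, 0.5
dt, T = 0.001, 50.0
nd = int(round(tau / dt))                  # delay expressed in time steps
n = int(round(T / dt))

h = np.empty(n + nd + 1)
h[: nd + 1] = 0.1                          # constant history on [-tau, 0]
for i in range(nd, nd + n):
    t = (i - nd) * dt
    # Euler step: delayed negative feedback plus periodic seasonal forcing
    h[i + 1] = h[i] + dt * (-np.tanh(kappa * h[i - nd]) + b * np.cos(2 * np.pi * t))

print(f"max |h| over the run: {np.abs(h).max():.2f}")
```

The history buffer is the essential DDE ingredient: the derivative at time t reads the state at time t - τ, so the first nd entries must be supplied as an initial function, not a single initial value.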
Li, Hong; Liu, Mingyong; Zhang, Feihu
2017-01-01
This paper presents a multi-objective evolutionary algorithm for bio-inspired geomagnetic navigation of an Autonomous Underwater Vehicle (AUV). Inspired by biological navigation behavior, the solution is obtained without a priori information, simply by magnetotaxis searching. However, geomagnetic anomalies, which disrupt the distribution of the geomagnetic field, have a significant influence on the geomagnetic navigation system. An extreme value region may easily appear in such abnormal regions, causing the AUV to become lost during the navigation phase. This paper proposes an improved bio-inspired algorithm with behavior constraints that allows the AUV to escape from the abnormal region. First, the navigation problem is cast as an optimization problem. Second, an environmental monitoring operator is introduced to determine whether the algorithm has fallen into a geomagnetic anomaly region. Then, a behavior constraint operator is employed to get out of the abnormal region. Finally, the termination condition is triggered. Compared to the state of the art, the proposed approach effectively overcomes the disturbance of geomagnetic anomalies. Simulation results demonstrate the reliability and feasibility of the proposed approach in complex environments.
Zhang, Zhilin; Pi, Zhouyue; Liu, Benyuan
2015-02-01
Heart rate monitoring using wrist-type photoplethysmographic (PPG) signals during subjects' intensive exercise is a difficult problem, since the signals are contaminated by extremely strong motion artifacts caused by subjects' hand movements. So far few works have studied this problem. In this study, a general framework, termed TROIKA, is proposed, which consists of signal decomposiTion for denoising, sparse signal RecOnstructIon for high-resolution spectrum estimation, and spectral peaK trAcking with verification. The TROIKA framework has high estimation accuracy and is robust to strong motion artifacts. Many variants can be straightforwardly derived from this framework. Experimental results on datasets recorded from 12 subjects during fast running at a peak speed of 15 km/h showed that the average absolute error of heart rate estimation was 2.34 beats per minute, and the Pearson correlation between the estimates and the ground truth of heart rate was 0.992. This framework is of great value to wearable devices such as smartwatches which use PPG signals to monitor heart rate for fitness.
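The spectral peak tracking step of such a framework can be sketched as follows: estimate a spectrum for the current window and pick the peak nearest the previous heart-rate estimate. The PPG signal, sampling rate, and search band below are all synthetic assumptions, and the sketch omits TROIKA's decomposition and sparse-reconstruction stages:

```python
import numpy as np

fs = 125.0                                  # assumed PPG sampling rate (Hz)
t = np.arange(0, 8.0, 1.0 / fs)             # one 8-second analysis window
rng = np.random.default_rng(0)
# Synthetic PPG: 1.8 Hz cardiac component plus a 2.5 Hz motion artifact.
ppg = (np.sin(2 * np.pi * 1.8 * t)
       + 0.8 * np.sin(2 * np.pi * 2.5 * t)
       + 0.1 * rng.normal(size=t.size))

# Windowed periodogram of the current frame.
freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
spec = np.abs(np.fft.rfft(ppg * np.hanning(t.size)))

# Track: search only near the previous estimate, so the motion-artifact
# peak outside the band cannot capture the tracker.
prev_bpm = 105.0
band = (freqs * 60 > prev_bpm - 15) & (freqs * 60 < prev_bpm + 15)
bpm = 60 * freqs[band][np.argmax(spec[band])]
print(f"tracked heart rate: {bpm:.0f} BPM")
```

Restricting the peak search to a band around the previous estimate is the simplest form of tracking; verification stages then guard against the band locking onto an artifact.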
Turbulence as a Problem in Non-equilibrium Statistical Mechanics
NASA Astrophysics Data System (ADS)
Goldenfeld, Nigel; Shih, Hong-Yan
2017-05-01
The transitional and well-developed regimes of turbulent shear flows exhibit a variety of remarkable scaling laws that are only now beginning to be systematically studied and understood. In the first part of this article, we summarize recent progress in understanding the friction factor of turbulent flows in rough pipes and quasi-two-dimensional soap films, showing how the data obey a two-parameter scaling law known as roughness-induced criticality, and exhibit power-law scaling of friction factor with Reynolds number that depends on the precise nature of the turbulent cascade. These results hint at a non-equilibrium fluctuation-dissipation relation that applies to turbulent flows. The second part of this article concerns the lifetime statistics in smooth pipes around the transition, showing how the remarkable super-exponential scaling with Reynolds number reflects deep connections between large deviation theory, extreme value statistics, directed percolation, and the onset of coexistence in predator-prey ecosystems. Both these phenomena reflect the way in which turbulence can be fruitfully approached as a problem in non-equilibrium statistical mechanics.
NASA Astrophysics Data System (ADS)
Mahdavi, Ali; Seyyedian, Hamid
2014-05-01
This study presents a semi-analytical solution for steady groundwater flow in trapezoidal-shaped aquifers in response to areal diffusive recharge. The aquifer is homogeneous, anisotropic and interacts with four surrounding streams of constant head. The flow field in this laterally bounded aquifer system is efficiently constructed by means of variational calculus, accomplished by minimizing a properly defined penalty function for the associated boundary value problem. Simple yet demonstrative scenarios are defined to investigate anisotropy effects on the water table variation. Qualitative examination of the resulting equipotential contour maps and velocity vector field illustrates the validity of the method, especially in the vicinity of boundary lines. Extension to the case of a triangular-shaped aquifer with or without an impervious boundary line is also demonstrated through a hypothetical example problem. The present solution has an extremely simple mathematical expression and agrees closely with numerical results obtained from MODFLOW. Overall, the solution may be used to conduct sensitivity analysis on the various hydrogeological parameters that affect water table variation in aquifers defined on trapezoidal or triangular-shaped domains.
Sabar, Nasser R; Ayob, Masri; Kendall, Graham; Qu, Rong
2015-02-01
Hyper-heuristics are search methodologies that aim to provide high-quality solutions across a wide variety of problem domains, rather than developing tailor-made methodologies for each problem instance/domain. A traditional hyper-heuristic framework has two levels, namely, the high-level strategy (heuristic selection mechanism and acceptance criterion) and the low-level heuristics (a set of problem-specific heuristics). Due to the different landscape structures of different problem instances, the high-level strategy plays an important role in the design of a hyper-heuristic framework. In this paper, we propose a new high-level strategy for a hyper-heuristic framework. The proposed high-level strategy utilizes a dynamic multiarmed bandit with an extreme-value-based reward as an online heuristic selection mechanism to select the appropriate heuristic to apply at each iteration. In addition, we propose a gene expression programming framework to automatically generate the acceptance criterion for each problem instance, instead of using human-designed criteria. Two well-known, and very different, combinatorial optimization problems, one static (exam timetabling) and one dynamic (dynamic vehicle routing), are used to demonstrate the generality of the proposed framework. Empirical results demonstrate that, compared with state-of-the-art hyper-heuristics and other bespoke methods, the proposed framework generalizes well across both domains. We obtain competitive, if not better, results compared to the best known results obtained by other methods presented in the scientific literature. We also compare our approach against the recently released hyper-heuristic competition test suite, again demonstrating the generality of our approach against other methods that have utilized the same six benchmark datasets from this test suite.
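A bare-bones sketch of a bandit-based heuristic selector with an extreme-value (best-improvement-seen) reward is below; the heuristic names, reward model, and UCB-style exploration term are invented for illustration and are much simpler than the dynamic multiarmed bandit used in the paper:

```python
import math
import random

random.seed(1)
heuristics = ["swap", "insert", "two_opt"]        # hypothetical low-level heuristics
counts = {h: 0 for h in heuristics}
best_reward = {h: 0.0 for h in heuristics}        # extreme value: best improvement seen

def improvement(h):
    # Stand-in for applying heuristic h to the incumbent solution and
    # measuring the objective improvement (invented reward model).
    scale = {"swap": 1.0, "insert": 2.0, "two_opt": 3.0}[h]
    return max(0.0, random.gauss(scale, 1.0))

for step in range(1, 301):
    def ucb(h):
        # UCB1-style score: extreme-value reward plus an exploration bonus.
        if counts[h] == 0:
            return float("inf")                   # try every heuristic once
        return best_reward[h] + math.sqrt(2 * math.log(step) / counts[h])

    h = max(heuristics, key=ucb)                  # select heuristic to apply
    counts[h] += 1
    best_reward[h] = max(best_reward[h], improvement(h))

print(max(counts, key=counts.get))                # most frequently chosen heuristic
```

Using the best improvement seen (an extreme value) rather than the mean reward makes the selector favor heuristics capable of occasional large gains, which is the rationale the abstract describes.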
Spatial distribution of precipitation extremes in Norway
NASA Astrophysics Data System (ADS)
Verpe Dyrrdal, Anita; Skaugen, Thomas; Lenkoski, Alex; Thorarinsdottir, Thordis; Stordal, Frode; Førland, Eirik J.
2015-04-01
Estimates of extreme precipitation, in terms of return levels, are crucial in the planning and design of important infrastructure. Through two separate studies, we have examined the levels and spatial distribution of daily extreme precipitation over catchments in Norway, and of hourly extreme precipitation at a point. The analyses were carried out through the development of two new methods for estimating extreme precipitation in Norway. For daily precipitation we fit the Generalized Extreme Value (GEV) distribution to areal time series from a gridded dataset, consisting of daily precipitation from 1957 to the present at a resolution of 1x1 km². This grid-based method is more objective and less manual and time-consuming than the existing method at MET Norway. In addition, estimates in ungauged catchments are easier to obtain, and the GEV approach includes a measure of uncertainty, which is a requirement in climate studies today. Further, we go into depth on the debated GEV shape parameter, which plays an important role for longer return periods. We show that it varies according to the dominating precipitation types, having positive values in the southeast and negative values in the southwest. We also find indications that the degree of orographic enhancement might affect the shape parameter. For hourly precipitation, we estimate return levels on a 1x1 km² grid by linking GEV distributions with latent Gaussian fields in a Bayesian hierarchical model (BHM). Generalized linear models on the GEV parameters, estimated from observations, are able to incorporate location-specific geographic and meteorological information and thereby accommodate these effects on extreme precipitation. Gaussian fields capture additional unexplained spatial heterogeneity and overcome the sparse grid on which observations are collected, while a Bayesian model averaging component directly assesses model uncertainty.
We find that mean summer precipitation, mean summer temperature, latitude, longitude, mean annual precipitation and elevation are good covariate candidates for hourly precipitation in our model. Summer indices succeed because hourly precipitation extremes often occur during the convective season. The spatial distribution of hourly and daily precipitation differs in Norway. Daily precipitation extremes are larger along the southwestern coast, where large-scale frontal systems dominate during fall season and the mountain ridge generates strong orographic enhancement. The largest hourly precipitation extremes are mostly produced by intense convective showers during summer, and are thus found along the entire southern coast, including the Oslo-region.
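The core GEV step, fitting annual maxima and reading off return levels, can be sketched with SciPy; the data here are simulated placeholders, and note that SciPy's shape parameter `c` has the opposite sign to the GEV shape parameter ξ discussed above:

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)
# Hypothetical annual-maximum daily precipitation series (mm); placeholder data.
annual_max = genextreme.rvs(c=-0.1, loc=40.0, scale=10.0, size=60, random_state=rng)

# Fit the GEV by maximum likelihood (SciPy's c = -xi sign convention).
shape, loc, scale = genextreme.fit(annual_max)

levels = {}
for T in (10, 50, 100):
    # T-year return level = the quantile exceeded with probability 1/T per year.
    levels[T] = genextreme.ppf(1.0 - 1.0 / T, shape, loc=loc, scale=scale)
    print(f"{T}-year return level: {levels[T]:.1f} mm")
```

Because longer return periods probe further into the fitted tail, the 100-year level is far more sensitive to the shape parameter than the 10-year level, which is why the abstract dwells on that parameter.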
NASA Astrophysics Data System (ADS)
Pegram, Geoff; Bardossy, Andras; Sinclair, Scott
2017-04-01
The use of radar measurements for the space time estimation of precipitation has for many decades been a central topic in hydro-meteorology. In this presentation we are interested specifically in daily and sub-daily extreme values of precipitation at gauged or ungauged locations which are important for design. The purpose of the presentation is to develop a methodology to combine daily precipitation observations and radar measurements to estimate sub-daily extremes at point locations. Radar data corrected using precipitation-reflectivity relationships lead to biased estimations of extremes. Different possibilities of correcting systematic errors using the daily observations are investigated. Observed gauged daily amounts are interpolated to un-sampled points and subsequently disaggregated using the sub-daily values obtained by the radar. Different corrections based on the spatial variability and the sub-daily entropy of scaled rainfall distributions are used to provide unbiased corrections of short duration extremes. In addition, a statistical procedure not based on a matching day by day correction is tested. In this last procedure, as we are only interested in rare extremes, low to medium values of rainfall depth were neglected leaving 12 days of ranked daily maxima in each set per year, whose sum typically comprises about 50% of each annual rainfall total. The sum of these 12 day maxima is first interpolated using a Kriging procedure. Subsequently this sum is disaggregated to daily values using a nearest neighbour procedure. The daily sums are then disaggregated by using the relative values of the biggest 12 radar based days in each year. Of course, the timings of radar and gauge maxima can be different, so the new method presented here uses radar for disaggregating daily gauge totals down to 15 min intervals in order to extract the maxima of sub-hourly through to daily rainfall. 
The methodologies were tested in South Africa, where an S-band radar operated relatively continuously at Bethlehem from 1998 to 2003, whose scan at 1.5 km above ground [CAPPI] overlapped a dense [10 km spacing] set of 45 pluviometers recording in the same 6-year period. This valuable set of data was obtained from each of 37 selected radar pixels [1 km square in plan] which contained a pluviometer, not masked out by the radar foot-print. The pluviometer data were also aggregated to daily totals, for the same purpose. The extremes obtained using disaggregation methods were compared to the observed extremes in a cross validation procedure. The unusual and novel goal was not to obtain the reproduction of the precipitation matching in space and time, but to obtain frequency distributions of the point extremes, which we found to be stable. Published as: Bárdossy, A., and G. G. S. Pegram (2017) Journal of Hydrology, Volume 544, pp 397-406
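The disaggregation idea, keeping the radar's sub-daily temporal pattern while forcing the total to match the gauge, reduces to a simple rescaling; the radar values below are random placeholders:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical inputs: one day of 15-min radar rainfall estimates (96 values, mm)
# and the co-located daily gauge total (mm).
radar_15min = rng.gamma(shape=0.3, scale=1.0, size=96)
gauge_daily = 24.0

# Disaggregate: keep the radar's sub-daily pattern, rescale to match the gauge,
# which removes the radar's systematic bias in the total.
fractions = radar_15min / radar_15min.sum()
est_15min = gauge_daily * fractions

print(est_15min.sum())   # equals gauge_daily by construction (up to rounding)
print(est_15min.max())   # sub-daily maximum available for extreme-value analysis
```

From such rescaled series, maxima at 15-min through daily durations can be extracted and their frequency distributions compared with the pluviometer record, as done in the cross-validation described above.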
Estimating the extreme low-temperature event using nonparametric methods
NASA Astrophysics Data System (ADS)
D'Silva, Anisha
This thesis presents a new method of estimating the one-in-N low temperature threshold using a non-parametric statistical method called kernel density estimation, applied to daily average wind-adjusted temperatures. We apply our One-in-N Algorithm to local gas distribution companies (LDCs), as they have to forecast the daily natural gas needs of their consumers. In winter, demand for natural gas is high. Extreme low temperature events are not directly related to an LDC's gas demand forecasting, but knowledge of extreme low temperatures is important to ensure that an LDC has enough capacity to meet customer demands when extreme low temperatures are experienced. We present a detailed explanation of our One-in-N Algorithm and compare it to methods using the generalized extreme value distribution, the normal distribution, and the variance-weighted composite distribution. We show that our One-in-N Algorithm estimates the one-in-N low temperature threshold more accurately than these methods according to the root mean square error (RMSE) measure at a 5% level of significance. The One-in-N Algorithm is tested by counting the number of times the daily average wind-adjusted temperature is less than or equal to the one-in-N low temperature threshold.
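A sketch of the kernel-density step, estimating a low-temperature threshold from a smoothed distribution, is below; the temperature sample is synthetic, and the exceedance probability is treated per observation rather than per year, which is a simplification of the thesis's one-in-N definition:

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
# Hypothetical daily average wind-adjusted winter temperatures (deg C),
# roughly 30 winters of 90 days each.
temps = rng.normal(loc=-5.0, scale=8.0, size=30 * 90)

# Kernel density estimate of the temperature distribution.
kde = gaussian_kde(temps)
grid = np.linspace(temps.min() - 10.0, temps.max(), 2000)
cdf = np.cumsum(kde(grid))
cdf /= cdf[-1]                      # numerical CDF from the smoothed density

N = 30                              # a one-in-30 (per-observation) event
threshold = grid[np.searchsorted(cdf, 1.0 / N)]
print(f"one-in-{N} low temperature threshold: {threshold:.1f} deg C")
```

Unlike a fitted GEV or normal model, the KDE makes no parametric assumption about the cold tail; the trade-off is sensitivity to the bandwidth choice (here SciPy's default).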
Screening for Autism in Extremely Preterm Infants: Problems in Interpretation
ERIC Educational Resources Information Center
Moore, Tamanna; Johnson, Samantha; Hennessy, Enid; Marlow, Neil
2012-01-01
Aim: The aim of this article was to report the prevalence of, and risk factors for, positive autism screens using the Modified Checklist for Autism in Toddlers (M-CHAT) in children born extremely preterm in England. Method: All children born at not more than 26 weeks' gestational age in England during 2006 were recruited to the EPICure-2 study. At…
The Extreme Ultraviolet Explorer science instruments development - Lessons learned
NASA Technical Reports Server (NTRS)
Malina, Roger F.; Battel, S.
1991-01-01
The science instruments development project for the Extreme Ultraviolet Explorer (EUVE) satellite is reviewed. Issues discussed include the philosophical basis of the program, the establishment of a tight development team, the approach to planning and phasing activities, the handling of the most difficult technical problems, and the assessment of the work done during the pre-implementation period of the project.
Deformation mechanisms in a coal mine roadway in extremely swelling soft rock.
Li, Qinghai; Shi, Weiping; Yang, Renshu
2016-01-01
Roadway support in swelling soft rock is one of the challenging problems in mining. For most geological conditions, combinations of two or more supporting approaches can meet the requirements of most roadways; in extremely swelling soft rock, however, even combined approaches cannot control large deformations. The purpose of this work was to probe the roadway deformation mechanisms in extremely swelling soft rock. Based on the main return airway in a coal mine, deformation monitoring and geomechanical analysis were conducted, and a mechanical model of the plastic zone was analysed. Results indicated that this soft rock had a very high swelling potential. When the ground stress acted alone, the support strength needed in situ was not too large and combined supporting approaches could meet this requirement; once this potential was released, however, the roadway would undergo permanent deformation. When the loose zone reached 3 m within the surrounding rock, the remote stress p∞ and the supporting stress P presented a linear relationship; namely, the greater the swelling stress, the more difficult roadway support would be. Thus, in this extremely swelling soft rock, a better way to control roadway deformation is to control the release of the surrounding rock's swelling potential.
NASA Astrophysics Data System (ADS)
Reinstorf, Frido; Kramer, Stefanie; Koch, Thomas; Seifert, Sven; Monninkhoff, Bertram; Pfützner, Bernd
2017-04-01
Extreme weather conditions during the years 2009-2011, in combination with changes in regional water management and possible impacts of climate change, led to maximum groundwater levels in large areas of Germany in 2011. This resulted in extensive waterlogging, especially in urban areas near rivers, where it caused severe problems for buildings and infrastructure. The acute situation still exists in many areas and requires the development of solution concepts. Taking the example of the Elbe-Saale region in the Federal State of Saxony-Anhalt, where a pilot research project was carried out, the analytical situation, the development of a management tool, and the implementation of a groundwater management concept are shown. The central tool is a coupled water-budget and groundwater-flow model. In combination with sophisticated multi-scale parameter estimation, a high-resolution groundwater-level simulation was carried out. A decision support process with very intensive stakeholder interaction, combined with the high-resolution simulations, enables the development of a management concept for extreme groundwater situations that favours sustainable and environmentally sound solutions, mainly on the basis of passive measures.
The Logic of Values Clarification
ERIC Educational Resources Information Center
Kazepides, A. C.
1977-01-01
Traces the origin of the Values Clarification movement in education to Carl Rogers's client-centered therapy and exposes its unwarranted extreme ethical stance. Examines a model episode of values clarification and shows how the theoretical confusions of the Values Clarification proponents are reflected in their actual teaching strategies. (Editor/RK)
A Metastatistical Approach to Satellite Estimates of Extreme Rainfall Events
NASA Astrophysics Data System (ADS)
Zorzetto, E.; Marani, M.
2017-12-01
The estimation of the average recurrence interval of intense rainfall events is a central issue for both hydrologic modeling and engineering design. These estimates require inference of the properties of the right tail of the statistical distribution of precipitation, a task often performed using the Generalized Extreme Value (GEV) distribution, estimated either from a sample of annual maxima (AM) or with a peaks-over-threshold (POT) approach. However, these approaches require long and homogeneous rainfall records, which often are not available, especially in the case of remotely sensed rainfall datasets. Here we use, and tailor to remotely sensed rainfall estimates, an alternative approach based on the metastatistical extreme value distribution (MEVD), which produces estimates of rainfall extreme values based on the probability distribution function (pdf) of all measured `ordinary' rainfall events. This methodology also accounts for the interannual variations observed in the pdf of daily rainfall by integrating over the sample space of its random parameters. We illustrate the application of this framework to the TRMM Multi-satellite Precipitation Analysis rainfall dataset, where the MEVD optimally exploits the relatively short records of satellite-sensed rainfall while taking full advantage of their high spatial resolution and quasi-global coverage. Accuracy of TRMM precipitation estimates and scale issues are investigated for a case study located in the Little Washita watershed, Oklahoma, using a dense network of rain gauges for independent ground validation. The methodology contributes to our understanding of the risk of extreme rainfall events, as it allows (i) an optimal use of the TRMM datasets in estimating the tail of the probability distribution of daily rainfall, and (ii) a global mapping of daily rainfall extremes and distributional tail properties, bridging the existing gaps in rain-gauge networks.
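The classical annual-maxima step that the MEVD is set against can be sketched in a few lines. This is a generic illustration on synthetic data, not the authors' MEVD implementation; the Weibull "daily rainfall" and all parameter values are invented for the example.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Synthetic "daily rainfall": 30 years x 365 days of ordinary events
# drawn from a Weibull distribution (a common choice for daily rainfall).
years, days = 30, 365
daily = stats.weibull_min.rvs(c=0.8, scale=10.0, size=(years, days),
                              random_state=rng)

# Classical AM approach: fit a GEV to the 30 annual maxima.
annual_max = daily.max(axis=1)
shape, loc, scale = stats.genextreme.fit(annual_max)

# 100-year return level from the fitted GEV (the 1 - 1/100 quantile).
rl_100 = stats.genextreme.ppf(1 - 1 / 100, shape, loc=loc, scale=scale)
print(f"100-yr return level estimate: {rl_100:.1f} mm")
```

The MEVD replaces this single 30-point fit with a per-year fit to all ordinary events, which is why it copes better with the short records typical of satellite products.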
NASA Astrophysics Data System (ADS)
Brunsell, N. A.; Nippert, J. B.
2011-12-01
As the climate warms, it is generally acknowledged that the number and magnitude of extreme weather events will increase. We examined an ecophysiological model's responses to precipitation and temperature anomalies in relation to the mean and variance of annual precipitation along a pronounced precipitation gradient from eastern to western Kansas. This natural gradient creates a template of potential responses in both the mean and variance of annual precipitation for comparing the timescales of carbon and water fluxes. Using data from two Ameriflux sites (KZU and KFS) and a third eddy covariance tower (K4B) along the gradient, BIOME-BGC was used to characterize water and carbon cycle responses to extreme weather events. Changes in the extreme value distributions were based on SRES A1B and A2 scenarios using an ensemble mean of 21 GCMs for the region, downscaled using a stochastic weather generator. We focused on changing the timing and magnitude of precipitation and altering the diurnal and seasonal temperature ranges. BIOME-BGC was then forced with daily output from the stochastic weather generator, and we examined how potential changes in these extreme value distributions impact carbon and water cycling at the sites across the Kansas precipitation gradient at time scales ranging from daily to interannual. To decompose the time scales of response, we applied a wavelet-based information theory analysis. Results indicate impacts on soil moisture memory and carbon allocation processes, which vary in response to both the mean and variance of precipitation along the gradient. These results suggest that a more pronounced focus on ecosystem responses to extreme events across a range of temporal scales is needed in order to fully characterize the water and carbon cycle responses to global climate change.
Exact extreme-value statistics at mixed-order transitions.
Bar, Amir; Majumdar, Satya N; Schehr, Grégory; Mukamel, David
2016-05-01
We study extreme-value statistics for spatially extended models exhibiting mixed-order phase transitions (MOT). These are phase transitions that exhibit features common to both first-order (discontinuity of the order parameter) and second-order (diverging correlation length) transitions. We consider here the truncated inverse-distance-squared Ising model, which is a prototypical model exhibiting MOT, and study analytically the extreme-value statistics of the domain lengths. The lengths of the domains are identically distributed random variables, except for the global constraint that their sum equals the total system size L. In addition, the number of such domains is also a fluctuating variable, not fixed. In the paramagnetic phase, we show that the distribution of the largest domain length l_{max} converges, in the large-L limit, to a Gumbel distribution. However, at the critical point (for a certain range of parameters) and in the ferromagnetic phase, we show that the fluctuations of l_{max} are governed by novel distributions, which we compute exactly. Our main analytical results are verified by numerical simulations.
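The Gumbel limit invoked above can be checked numerically in the simplest setting: the maximum of n i.i.d. exponential lengths, centered by log n, converges to the standard Gumbel law. This is a generic textbook illustration, not the paper's constrained-domain model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Maximum of n i.i.d. exponential variables: after centering by log(n),
# the distribution converges to the standard Gumbel, F(x) = exp(-exp(-x)).
n, trials = 5000, 20000
samples = rng.exponential(scale=1.0, size=(trials, n))
centered_max = samples.max(axis=1) - np.log(n)

# Compare the empirical CDF at a few points with the Gumbel CDF.
for x in (-1.0, 0.0, 1.0, 2.0):
    empirical = np.mean(centered_max <= x)
    gumbel = np.exp(-np.exp(-x))
    print(f"x={x:+.1f}  empirical={empirical:.3f}  Gumbel={gumbel:.3f}")
```

The interesting content of the paper is precisely where this picture fails: at the critical point and in the ferromagnetic phase, the global sum constraint makes l_{max} non-Gumbel.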
NASA Technical Reports Server (NTRS)
Melick, H. C., Jr.; Ybarra, A. H.; Bencze, D. P.
1975-01-01
An inexpensive method is developed to determine the extreme values of instantaneous inlet distortion. This method also provides insight into the basic mechanics of unsteady inlet flow and the associated engine reaction. The analysis is based on fundamental fluid dynamics and statistical methods to provide an understanding of the turbulent inlet flow and quantitatively relate the rms level and power spectral density (PSD) function of the measured time variant total pressure fluctuations to the strength and size of the low pressure regions. The most probable extreme value of the instantaneous distortion is then synthesized from this information in conjunction with the steady state distortion. Results of the analysis show the extreme values to be dependent upon the steady state distortion, the measured turbulence rms level and PSD function, the time on point, and the engine response characteristics. Analytical projections of instantaneous distortion are presented and compared with data obtained by a conventional, highly time correlated, 40 probe instantaneous pressure measurement system.
Probabilistic Open Set Recognition
NASA Astrophysics Data System (ADS)
Jain, Lalit Prithviraj
Real-world tasks in computer vision, pattern recognition and machine learning often touch upon the open set recognition problem: multi-class recognition with incomplete knowledge of the world and many unknown inputs. An obvious way to approach such problems is to develop a recognition system that thresholds probabilities to reject unknown classes. Traditional rejection techniques are not about the unknown; they are about the uncertain boundary and rejection around that boundary. Thus traditional techniques only represent the "known unknowns". However, a proper open set recognition algorithm is needed to reduce the risk from the "unknown unknowns". This dissertation examines this concept and finds existing probabilistic multi-class recognition approaches are ineffective for true open set recognition. We hypothesize the cause is due to weak ad hoc assumptions combined with closed-world assumptions made by existing calibration techniques. Intuitively, if we could accurately model just the positive data for any known class without overfitting, we could reject the large set of unknown classes even under this assumption of incomplete class knowledge. For this, we formulate the problem as one of modeling positive training data by invoking statistical extreme value theory (EVT) near the decision boundary of positive data with respect to negative data. We provide a new algorithm called the PI-SVM for estimating the unnormalized posterior probability of class inclusion. This dissertation also introduces a new open set recognition model called Compact Abating Probability (CAP), where the probability of class membership decreases in value (abates) as points move from known data toward open space. We show that CAP models improve open set recognition for multiple algorithms.
Leveraging the CAP formulation, we go on to describe the novel Weibull-calibrated SVM (W-SVM) algorithm, which combines the useful properties of statistical EVT for score calibration with one-class and binary support vector machines. Building from the success of statistical EVT based recognition methods such as PI-SVM and W-SVM on the open set problem, we present a new general supervised learning algorithm for multi-class classification and multi-class open set recognition called the Extreme Value Local Basis (EVLB). The design of this algorithm is motivated by the observation that extrema from known negative class distributions are the closest negative points to any positive sample during training, and thus should be used to define the parameters of a probabilistic decision model. In the EVLB, the kernel distribution for each positive training sample is estimated via an EVT distribution fit over the distances to the separating hyperplane between positive training sample and closest negative samples, with a subset of the overall positive training data retained to form a probabilistic decision boundary. Using this subset as a frame of reference, the probability of a sample at test time decreases as it moves away from the positive class. Possessing this property, the EVLB is well-suited to open set recognition problems where samples from unknown or novel classes are encountered at test time. Our experimental evaluation shows that the EVLB provides a substantial improvement in scalability compared to standard radial basis function kernel machines, as well as PI-SVM and W-SVM, with improved accuracy in many cases. We evaluate our algorithm on open set variations of the standard visual learning benchmarks, as well as with an open subset of classes from Caltech 256 and ImageNet. Our experiments show that PI-SVM, W-SVM, and EVLB provide significant advances over the previous state-of-the-art solutions for the same tasks.
A Non-Stationary Approach for Estimating Future Hydroclimatic Extremes Using Monte-Carlo Simulation
NASA Astrophysics Data System (ADS)
Byun, K.; Hamlet, A. F.
2017-12-01
There is substantial evidence that observed hydrologic extremes (e.g. floods, extreme stormwater events, and low flows) are changing and that climate change will continue to alter the probability distributions of hydrologic extremes over time. These non-stationary risks imply that conventional approaches to designing hydrologic infrastructure (or making other climate-sensitive decisions) based on retrospective analysis and stationary statistics will become increasingly problematic through time. To develop a framework for assessing risks in a non-stationary environment, our study develops a new approach using a super-ensemble of simulated hydrologic extremes based on Monte Carlo (MC) methods. Specifically, using statistically downscaled future GCM projections from the CMIP5 archive (via the Hybrid Delta (HD) method), we extract daily precipitation (P) and temperature (T) at 1/16-degree resolution based on a group of moving 30-yr windows within a given design lifespan (e.g. 10, 25, 50 yr). Using these T and P scenarios, we simulate daily streamflow with the Variable Infiltration Capacity (VIC) model for each year of the design lifespan and fit a Generalized Extreme Value (GEV) probability distribution to the simulated annual extremes. MC experiments are then used to construct a random series of 10,000 realizations of the design lifespan, estimating annual extremes using the unique GEV parameters estimated for each individual year of the design lifespan. Our preliminary results for two watersheds in the Midwest show considerable differences in the extreme values for a given percentile between the conventional and non-stationary MC approaches. Design standards based on our non-stationary approach are also directly dependent on the design lifespan of the infrastructure, a sensitivity which is notably absent from conventional approaches based on retrospective analysis. The experimental approach can be applied to a wide range of hydroclimatic variables of interest.
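The core MC step, drawing each year's annual extreme from that year's own GEV and taking the lifetime maximum over many realizations, can be sketched as follows. All parameter values (the linear location drift, scale, shape, and lifespan) are hypothetical stand-ins, not values from the study.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical non-stationary GEV: the location parameter drifts upward
# over a 50-year design lifespan (a stand-in for climate-driven trends).
lifespan = 50
loc = 100.0 + 0.5 * np.arange(lifespan)   # mm, drifting upward
scale, shape = 20.0, -0.1                 # scipy's shape-sign convention

# 10,000 Monte Carlo realizations of the lifespan: one annual extreme
# per year from that year's GEV, then the lifetime maximum.
realizations = 10_000
annual = stats.genextreme.rvs(shape, loc=loc, scale=scale,
                              size=(realizations, lifespan), random_state=rng)
lifetime_max = annual.max(axis=1)

# Stationary comparison: hold the location fixed at its first-year value.
stationary = stats.genextreme.rvs(shape, loc=loc[0], scale=scale,
                                  size=(realizations, lifespan), random_state=rng)
print("non-stationary 99th pct:", np.percentile(lifetime_max, 99))
print("stationary 99th pct:   ", np.percentile(stationary.max(axis=1), 99))
```

The gap between the two 99th percentiles is the kind of design-relevant difference the abstract reports, and it grows with the assumed lifespan.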
Spatial variability of extreme rainfall at radar subpixel scale
NASA Astrophysics Data System (ADS)
Peleg, Nadav; Marra, Francesco; Fatichi, Simone; Paschalis, Athanasios; Molnar, Peter; Burlando, Paolo
2018-01-01
Extreme rainfall is quantified in engineering practice using Intensity-Duration-Frequency (IDF) curves that are traditionally derived from rain gauges and, more recently, also from remote sensing instruments such as weather radars. These instruments measure rainfall at different spatial scales: a rain gauge samples rainfall at the point scale, while a weather radar averages precipitation over a relatively large area, generally around 1 km2. As such, a radar-derived IDF curve is representative of the mean areal rainfall over a given radar pixel and neglects the within-pixel rainfall variability. In this study, we quantify subpixel variability of extreme rainfall by using a novel space-time rainfall generator (the STREAP model) that spatially downscales the rainfall within a given radar pixel. The study was conducted using a unique radar data record (23 years) and a very dense rain-gauge network in the Eastern Mediterranean area (northern Israel). Radar IDF curves, together with an ensemble of point-based IDF curves representing the radar subpixel extreme rainfall variability, were developed by fitting Generalized Extreme Value (GEV) distributions to annual rainfall maxima. It was found that the mean areal extreme rainfall derived from the radar underestimates most of the extreme values computed for point locations within the radar pixel (on average, ∼70%). The subpixel variability of rainfall extremes was found to increase with longer return periods and shorter durations (e.g. from a maximum variability of 10% for a return period of 2 years and a duration of 4 h to 30% for a 50-year return period and 20-min duration). For the longer return periods, a considerable enhancement of extreme rainfall variability was found when stochastic (natural) climate variability was taken into account. Bounding the range of the subpixel extreme rainfall derived from radar IDF curves can be of major importance for different applications that require very local estimates of rainfall extremes.
Capturing spatial and temporal patterns of widespread, extreme flooding across Europe
NASA Astrophysics Data System (ADS)
Busby, Kathryn; Raven, Emma; Liu, Ye
2013-04-01
Statistical characterisation of physical hazards is an integral part of the probabilistic catastrophe models used by the reinsurance industry to estimate losses from large-scale events. Extreme flood events are not restricted by country boundaries, which poses an issue for reinsurance companies as their exposures often extend beyond them. We discuss challenges and solutions that allow us to appropriately capture the spatial and temporal dependence of extreme hydrological events on a continental scale, which in turn enables us to generate an industry-standard stochastic event set for estimating financial losses from widespread flooding. In presenting our event set methodology, we focus on explaining how extreme value theory (EVT) and dependence modelling are used to account for short, inconsistent hydrological data from different countries, and how to make appropriate statistical decisions that best characterise the nature of flooding across Europe. The consistency of input data is of vital importance when identifying historical flood patterns. Collating data from numerous sources inherently causes inconsistencies, and we demonstrate our robust approach to assessing the data and refining it to compile a single consistent dataset. This dataset is then extrapolated using a parameterised EVT distribution to estimate extremes. Our method then captures the dependence of flood events across countries using an advanced multivariate extreme value model. Throughout, important statistical decisions are explored, including: (1) distribution choice; (2) the threshold to apply for extracting extreme data points; (3) a regional analysis; (4) the definition of a flood event, which is often linked with the reinsurance industry's hours clause; and (5) the handling of missing values. Finally, having modelled the historical patterns of flooding across Europe, we sample from this model to generate our stochastic event set comprising thousands of events over thousands of years.
We then briefly illustrate how this is applied within a probabilistic model to estimate catastrophic loss curves used by the reinsurance industry.
LSRN: A PARALLEL ITERATIVE SOLVER FOR STRONGLY OVER- OR UNDERDETERMINED SYSTEMS*
Meng, Xiangrui; Saunders, Michael A.; Mahoney, Michael W.
2014-01-01
We describe a parallel iterative least squares solver named LSRN that is based on random normal projection. LSRN computes the min-length solution to min_{x ∈ ℝ^n} ‖Ax − b‖_2, where A ∈ ℝ^{m×n} with m ≫ n or m ≪ n, and where A may be rank-deficient. Tikhonov regularization may also be included. Since A is involved only in matrix-matrix and matrix-vector multiplications, it can be a dense or sparse matrix or a linear operator, and LSRN automatically speeds up when A is sparse or a fast linear operator. The preconditioning phase consists of a random normal projection, which is embarrassingly parallel, and a singular value decomposition of size ⌈γ min(m, n)⌉ × min(m, n), where γ is moderately larger than 1, e.g., γ = 2. We prove that the preconditioned system is well-conditioned, with a strong concentration result on the extreme singular values, and hence that the number of iterations is fully predictable when we apply LSQR or the Chebyshev semi-iterative method. As we demonstrate, the Chebyshev method is particularly efficient for solving large problems on clusters with high communication cost. Numerical results show that on a shared-memory machine, LSRN is very competitive with LAPACK's DGELSD and a fast randomized least squares solver called Blendenpik on large dense problems, and it outperforms the least squares solver from SuiteSparseQR on sparse problems without sparsity patterns that can be exploited to reduce fill-in. Further experiments show that LSRN scales well on an Amazon Elastic Compute Cloud cluster. PMID:25419094
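The preconditioning construction described above (Gaussian sketch, then an SVD of the small sketched matrix) can be sketched in NumPy. This is a minimal serial illustration of the idea, not the parallel LSRN solver itself; the problem sizes and γ = 2 are example choices.

```python
import numpy as np

rng = np.random.default_rng(7)

# Overdetermined least-squares problem min ||Ax - b||_2 with m >> n.
m, n = 2000, 50
A = rng.normal(size=(m, n)) @ np.diag(np.logspace(0, 6, n))  # ill-conditioned
b = rng.normal(size=m)

# Random normal projection: sketch A down to gamma * n rows (gamma = 2).
gamma = 2
G = rng.normal(size=(int(gamma * n), m))
sketch = G @ A

# SVD of the small sketch gives the right preconditioner N = V Sigma^{-1}.
_, s, Vt = np.linalg.svd(sketch, full_matrices=False)
N = Vt.T / s                     # n x n preconditioner

# The preconditioned matrix A @ N is well-conditioned, so an iterative
# solver (LSQR, Chebyshev) needs a small, predictable number of steps.
print("cond(A)   =", np.linalg.cond(A))
print("cond(A N) =", np.linalg.cond(A @ N))
```

With γ = 2 the theory cited in the abstract bounds cond(A·N) by a small constant with high probability, independent of how badly conditioned A is.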
Actionable Science Lessons Emerging from the Department of Interior Climate Science Center Network
NASA Astrophysics Data System (ADS)
McMahon, G.; Meadow, A. M.; Mikels-Carrasco, J.
2015-12-01
The DOI Advisory Committee on Climate Change and Natural Resource Science (ACCCNRS) has recommended that co-production of actionable science be the core programmatic focus of the Climate Science Center enterprise. Efforts by the Southeast Climate Science Center suggest that the complexity of many climate adaptation decision problems (many stakeholders that can influence implementation of a decision; the problems that can be viewed at many scales in space and time; dynamic objectives with competing values; complex, non-linear systems) complicates development of research-based information that scientists and non-scientists view as comprehensible, trustworthy, legitimate, and accurate. Going forward, organizers of actionable science efforts should consider inclusion of a broad set of stakeholders, beyond formal decisionmakers, and ensure that sufficient resources are available to explore the interests and values of this broader group. Co-produced research endeavors should foster agency and collaboration across a wide range of stakeholders. We recognize that stakeholder agency may be constrained by scientific or political power structures that limit the ability to initiate discussion, make claims, and call things into question. Co-production efforts may need to be preceded by more descriptive assessments that summarize existing climate science in ways that stakeholders can understand and link with their concerns. Such efforts can build rapport and trust among scientists and non-scientists, and may help stakeholders and scientists alike to frame adaptation decision problems amenable to a co-production effort. Finally, university and government researchers operate within an evaluation structure that rewards researcher-driven science that, at the extreme, "throws information over the fence" in the hope that information users will make better decisions. 
Research evaluation processes must reward more consultative, collaborative, and collegial research approaches if researchers are to widely adopt co-production methods.
Calculating p-values and their significances with the Energy Test for large datasets
NASA Astrophysics Data System (ADS)
Barter, W.; Burr, C.; Parkes, C.
2018-04-01
The energy test method is a multi-dimensional test of whether two samples are consistent with arising from the same underlying population, through the calculation of a single test statistic (called the T-value). The method has recently been used in particle physics to search for samples that differ due to CP violation. The generalised extreme value function has previously been used to describe the distribution of T-values under the null hypothesis that the two samples are drawn from the same underlying population. We show that, in a simple test case, the distribution is not sufficiently well described by the generalised extreme value function. We present a new method, where the distribution of T-values under the null hypothesis when comparing two large samples can be found by scaling the distribution found when comparing small samples drawn from the same population. This method can then be used to quickly calculate the p-values associated with the results of the test.
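For context, a two-sample statistic of this family and a brute-force null distribution can be sketched as follows. This simplified variant uses a Gaussian distance weight and a permutation estimate of the p-value; it is illustrative only and is not the paper's T-value definition or its scaling method (sample sizes, σ, and the permutation count are arbitrary).

```python
import numpy as np

rng = np.random.default_rng(3)

def t_value(x, y, sigma=1.0):
    """Energy-test-style statistic with a Gaussian distance weight."""
    def psi(a, b):
        d2 = np.sum((a[:, None, :] - b[None, :, :]) ** 2, axis=-1)
        return np.exp(-d2 / (2 * sigma ** 2))
    n, m = len(x), len(y)
    return (psi(x, x).sum() / n**2
            + psi(y, y).sum() / m**2
            - 2 * psi(x, y).sum() / (n * m))

# Two small samples from the same 1-D population (null hypothesis true).
x = rng.normal(size=(100, 1))
y = rng.normal(size=(100, 1))
t_obs = t_value(x, y)

# Permutation estimate of the p-value under the null hypothesis.
pooled = np.vstack([x, y])
perms = []
for _ in range(200):
    rng.shuffle(pooled)                      # shuffle rows in place
    perms.append(t_value(pooled[:100], pooled[100:]))
p = np.mean(np.array(perms) >= t_obs)
print(f"T = {t_obs:.4f}, permutation p-value ~ {p:.2f}")
```

The permutation approach becomes prohibitively slow for large samples, which is exactly the regime the paper's scaling method is designed to handle.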
Jonker, Michiel T O
2016-06-01
Octanol-water partition coefficients (KOW) are widely used in fate and effects modeling of chemicals. Still, high-quality experimental KOW data are scarce, in particular for very hydrophobic chemicals. This hampers reliable assessments of several fate and effect parameters and the development and validation of new models. One reason for the limited availability of experimental values may relate to the challenging nature of KOW measurements. In the present study, KOW values for 13 polycyclic aromatic hydrocarbons were determined with the gold-standard "slow-stirring" method (log KOW 4.6-7.2). These values were then used as reference data for the development of an alternative method for measuring KOW. This approach combined slow stirring and equilibrium sampling of the extremely low aqueous concentrations with polydimethylsiloxane-coated solid-phase microextraction fibers, applying experimentally determined fiber-water partition coefficients. It resulted in KOW values matching the slow-stirring data very well. Therefore, the method was subsequently applied to a series of 17 moderately to extremely hydrophobic petrochemical compounds. The obtained KOW values spanned almost 6 orders of magnitude, with the highest value measuring 10^10.6. The present study demonstrates that the hydrophobicity domain within which experimental KOW measurements are possible can be extended with the help of solid-phase microextraction, and that experimentally determined KOW values can exceed the proposed upper limit of 10^9. Environ Toxicol Chem 2016;35:1371-1377. © 2015 SETAC.
Aerodynamic Shape Optimization Using A Real-Number-Encoded Genetic Algorithm
NASA Technical Reports Server (NTRS)
Holst, Terry L.; Pulliam, Thomas H.
2001-01-01
A new method for aerodynamic shape optimization using a genetic algorithm with real number encoding is presented. The algorithm is used to optimize three different problems, a simple hill climbing problem, a quasi-one-dimensional nozzle problem using an Euler equation solver and a three-dimensional transonic wing problem using a nonlinear potential solver. Results indicate that the genetic algorithm is easy to implement and extremely reliable, being relatively insensitive to design space noise.
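A real-number-encoded GA of the kind described can be sketched briefly. This toy version (tournament selection, blend crossover, Gaussian mutation, elitism) simply climbs a smooth 2-D hill; the operators, rates, and fitness function are illustrative choices, not those of the paper.

```python
import numpy as np

rng = np.random.default_rng(5)

# Fitness: negative squared distance to a hypothetical optimum (1, -2),
# so the GA "climbs the hill" toward that point.
TARGET = np.array([1.0, -2.0])

def fitness(pop):
    return -np.sum((pop - TARGET) ** 2, axis=1)

pop = rng.uniform(-5, 5, size=(40, 2))          # real-valued chromosomes
for generation in range(100):
    f = fitness(pop)
    # Tournament selection: each parent is the fitter of a random pair.
    idx = rng.integers(0, len(pop), size=(len(pop), 2))
    parents = pop[np.where(f[idx[:, 0]] > f[idx[:, 1]], idx[:, 0], idx[:, 1])]
    # Blend crossover: random convex combination of consecutive parents.
    alpha = rng.uniform(size=(len(pop), 1))
    children = alpha * parents + (1 - alpha) * np.roll(parents, 1, axis=0)
    # Gaussian mutation, then elitism: carry the current best over intact.
    children += rng.normal(scale=0.1, size=children.shape)
    children[0] = pop[np.argmax(f)]
    pop = children

best = pop[np.argmax(fitness(pop))]
print("best individual:", best)
```

Because the chromosomes are real vectors, no binary encoding or decoding step is needed, which is the practical appeal the abstract highlights.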
NASA Astrophysics Data System (ADS)
Pierini, J. O.; Restrepo, J. C.; Aguirre, J.; Bustamante, A. M.; Velásquez, G. J.
2017-04-01
A measure of the variability in seasonal extreme streamflow was estimated for the Colombian Caribbean coast, using monthly time series of freshwater discharge from ten watersheds. The aim was to detect modifications in the monthly streamflow distribution, seasonal trends, variance, and extreme monthly values. A 20-year moving window, shifted in successive 1-year steps, was applied to the monthly series to analyze the seasonal variability of streamflow. The seasonally windowed data were statistically fitted with the Gamma distribution function. Scale and shape parameters were computed using Maximum Likelihood Estimation (MLE) and the bootstrap method with 1000 resamples. A trend analysis was performed for each windowed series, allowing detection of the window with the maximum absolute trend values. Significant temporal shifts in the seasonal streamflow distribution and quantiles (QT) were obtained for different frequencies. Wet and dry extreme periods increased significantly in the last decades, though such increases did not occur simultaneously throughout the region. Some locations exhibited continuous increases only in the minimum QT.
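The moving-window Gamma-MLE step can be sketched as follows, using synthetic monthly flows with an artificial drift. The 20-year window and 1-year shift follow the abstract; everything else (the drift, the Gamma parameters, the quantile tracked) is invented for the example.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)

# Synthetic annual-scale streamflow: 60 years with a slow upward drift
# in the mean, as a stand-in for the non-stationarity described above.
years = 60
flow = rng.gamma(shape=2.0, scale=50.0 + 0.5 * np.arange(years))

# 20-year moving window, shifted one year at a time; fit a Gamma
# distribution by MLE inside each window (location fixed at zero).
window = 20
q90 = []
for start in range(years - window + 1):
    sample = flow[start:start + window]
    a, _, scale = stats.gamma.fit(sample, floc=0)
    q90.append(stats.gamma.ppf(0.90, a, scale=scale))  # 90th-pct flow

print("first-window Q90:", q90[0], " last-window Q90:", q90[-1])
```

Tracking a quantile like Q90 window by window is what exposes the temporal shifts in the distribution that the study reports; the bootstrap step would then resample each window to put uncertainty bands on these estimates.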
Precipitation extremes and their relation to climatic indices in the Pacific Northwest USA
NASA Astrophysics Data System (ADS)
Zarekarizi, Mahkameh; Rana, Arun; Moradkhani, Hamid
2018-06-01
There has been considerable focus in the literature on the influence of climate indices on precipitation extremes. The current study presents an evaluation of precipitation-based extremes in the Columbia River Basin (CRB) in the Pacific Northwest USA. We first analyzed the precipitation-based extremes using statistically (ten GCMs) and dynamically (three GCMs) downscaled past and future climate projections. Seven precipitation-based indices that help inform flood duration and intensity are used. These indices help in attaining first-hand information on spatial and temporal scales for different service sectors, including energy, agriculture, and forestry. Evaluation of these indices is first performed in the historical period (1971-2000), followed by analysis of their relation to large-scale tele-connections. Further, we mapped these indices over the area to evaluate the spatial variation of past and future extremes in the downscaled and observational data. The analysis shows that high values of the extreme indices are clustered in either the western or northern parts of the basin for the historical period, whereas the northern part experiences a higher degree of change in the indices under the future scenarios. The focus is also on evaluating the relation of these extreme indices to climate tele-connections in the historical period to understand their relationship with extremes over the CRB. Various climate indices are evaluated for their relationship using Principal Component Analysis (PCA) and Singular Value Decomposition (SVD). Results indicated that, out of the 13 climate tele-connections used in the study, the CRB is most strongly, and inversely, affected by the East Pacific (EP), Western Pacific (WP), East Atlantic (EA), and North Atlantic Oscillation (NAO) patterns.
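The SVD step for linking two sets of time series can be sketched on synthetic data. Here a hypothetical shared signal drives both a toy "extremes" field and a toy set of teleconnection indices, and the SVD of their temporal cross-covariance recovers the coupled mode; none of the numbers relate to the actual CRB analysis.

```python
import numpy as np

rng = np.random.default_rng(9)

# Toy coupled fields: a shared "climate index" signal drives both a
# precipitation-extremes field and a set of teleconnection indices.
time, npts, nidx = 200, 30, 13
signal = rng.normal(size=time)
extremes = np.outer(signal, rng.normal(size=npts)) \
    + 0.5 * rng.normal(size=(time, npts))
indices = np.outer(signal, rng.normal(size=nidx)) \
    + 0.5 * rng.normal(size=(time, nidx))

# SVD of the temporal cross-covariance matrix isolates the coupled modes.
extremes -= extremes.mean(axis=0)
indices -= indices.mean(axis=0)
C = extremes.T @ indices / (time - 1)
U, s, Vt = np.linalg.svd(C, full_matrices=False)

# Fraction of squared covariance explained by the leading coupled mode.
print("leading-mode squared covariance fraction:", s[0]**2 / np.sum(s**2))
```

In the real analysis, the singular vectors would be inspected to see which teleconnections load most heavily (and with which sign) on the leading precipitation-extremes mode.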
Stride search: A general algorithm for storm detection in high resolution climate data
Bosler, Peter Andrew; Roesler, Erika Louise; Taylor, Mark A.; ...
2015-09-08
This article discusses the problem of identifying extreme climate events such as intense storms within large climate data sets. The basic storm detection algorithm is reviewed, which splits the problem into two parts: a spatial search followed by a temporal correlation problem. Two specific implementations of the spatial search algorithm are compared. The commonly used grid point search algorithm is reviewed, and a new algorithm called Stride Search is introduced. Stride Search is designed to work at all latitudes, while grid point searches may fail in polar regions. Results from the two algorithms are compared for the application of tropical cyclone detection, and shown to produce similar results for the same set of storm identification criteria. The time required for both algorithms to search the same data set is compared. Furthermore, Stride Search's ability to search extreme latitudes is demonstrated for the case of polar low detection.
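A bare-bones grid point spatial search of the kind being compared can be sketched as follows. The pressure field, threshold, and neighborhood radius are all invented for the illustration; this is not the paper's code, and it ignores the polar-latitude issues that motivate Stride Search.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy "climate snapshot": sea-level pressure (hPa) on a lat-lon grid,
# with one synthetic low-pressure system inserted by hand.
nlat, nlon = 90, 180
slp = 1013.0 + rng.normal(scale=2.0, size=(nlat, nlon))
slp[40:46, 100:106] -= 30.0        # the storm: a deep localized low

# Grid point spatial search: flag every point that is below an absolute
# threshold AND is the minimum within its local neighborhood.
def grid_point_search(field, threshold=990.0, radius=5):
    hits = []
    for i in range(radius, field.shape[0] - radius):
        for j in range(radius, field.shape[1] - radius):
            patch = field[i - radius:i + radius + 1,
                          j - radius:j + radius + 1]
            if field[i, j] < threshold and field[i, j] == patch.min():
                hits.append((i, j))
    return hits

storms = grid_point_search(slp)
print("detected storm centers:", storms)
```

The temporal-correlation stage of the detection algorithm would then link such per-snapshot hits across time steps into storm tracks.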
Advanced imaging in acute and chronic deep vein thrombosis
Karande, Gita Yashwantrao; Sanchez, Yadiel; Baliyan, Vinit; Mishra, Vishala; Ganguli, Suvranu; Prabhakar, Anand M.
2016-01-01
Deep venous thrombosis (DVT) affecting the extremities is a common clinical problem. Prompt imaging aids in rapid diagnosis and adequate treatment. While ultrasound (US) remains the workhorse of detection of extremity venous thrombosis, CT and MRI are commonly used as the problem-solving tools either to visualize the thrombosis in central veins like superior or inferior vena cava (IVC) or to test for the presence of complications like pulmonary embolism (PE). The cross-sectional modalities also offer improved visualization of venous collaterals. The purpose of this article is to review the established modalities used for characterization and diagnosis of DVT, and further explore promising innovations and recent advances in this field. PMID:28123971
A Zone for Deliberation? Methodological Challenges in Fields of Political Unrest
ERIC Educational Resources Information Center
Westrheim, Kariane; Lillejord, Solvi
2007-01-01
This article outlines certain problems and challenges facing the qualitative researcher who enters fields that are either extremely difficult to access or potentially hostile towards outsiders. Problems and dilemmas in such contexts are highlighted by reference to fieldwork research among PKK (Kurdistan Worker's Party) guerrillas in North…
Fuelwood Problems and Solutions
D. Evan Mercer; John Soussan
1992-01-01
Concern over the "fuelwood crisis" facing the world's poor has been widespread since the late 1970s (Eckholm et al. 1984; Soussan 1988; Agarwal 1986). At first the problem was frequently overstated. In the extreme, analysts (foresters, economists, and others) in many countries made erroneous projections of the rapid total destruction of the biomass...
The Place and Purpose of Combinatorics
ERIC Educational Resources Information Center
Hurdle, Zach; Warshauer, Max; White, Alex
2016-01-01
The desire to persuade students to avoid strictly memorizing formulas is a recurring theme throughout discussions of curriculum and problem solving. In combinatorics, a branch of discrete mathematics, problems can be easy to write--identify a few categories, add a few restrictions, specify an outcome--yet extremely challenging to solve. A lesson…
NASA Astrophysics Data System (ADS)
Ivashkin, V. V.; Krylov, I. V.
2015-09-01
A method is developed to optimize flight trajectories to the asteroid Apophis that reliably forms a set of Pontryagin extremals for various boundary conditions of the flight and efficiently searches among them for the global optimum of the problem.
Feuerstein, Michael; Huang, Grant D; Ortiz, Jose M; Shaw, William S; Miller, Virginia I; Wood, Patricia M
2003-08-01
An integrated case management (ICM) approach (ergonomic and problem-solving intervention) to work-related upper-extremity disorders was examined in relation to patient satisfaction, future symptom severity, function, and return to work (RTW). Federal workers with work-related upper-extremity disorder workers' compensation claims (n = 205) were randomly assigned to usual care or the ICM intervention. Patient satisfaction was assessed after the 4-month intervention period. Questionnaires on clinical outcomes and ergonomic exposure were administered at baseline and at 6 and 12 months postintervention. Time from intervention to RTW was obtained from an administrative database. ICM group assignment was significantly associated with greater patient satisfaction. Regression analyses found that higher patient satisfaction levels predicted decreased symptom severity and functional limitations at 6 months and a shorter RTW. At 12 months, predictors of positive outcomes included male gender, lower distress, lower levels of reported ergonomic exposure, and receipt of ICM. Findings highlight the utility of targeting workplace ergonomics and problem-solving skills.
Zheng, Wei; Yan, Xiaoyong; Zhao, Wei; Qian, Chengshan
2017-12-20
A novel large-scale multi-hop localization algorithm based on regularized extreme learning is proposed in this paper. The large-scale multi-hop localization problem is formulated as a learning problem. Unlike similar localization algorithms, the proposed algorithm overcomes the shortcoming of traditional algorithms, which are applicable only to isotropic networks, and therefore adapts well to complex deployment environments. The proposed algorithm is composed of three stages: data acquisition, modeling and location estimation. In the data acquisition stage, the training information between nodes of the given network is collected. In the modeling stage, the model relating hop-counts to the physical distances between nodes is constructed using regularized extreme learning. In the location estimation stage, each node finds its specific location in a distributed manner. Theoretical analysis and several experiments show that the proposed algorithm adapts to different topological environments with low computational cost. Furthermore, high accuracy can be achieved by this method without setting complex parameters.
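As a rough illustration of the modeling stage, the following sketch maps hop-count vectors to coordinates with a regularized extreme learning machine (random hidden layer plus ridge-regression output weights). All data, network sizes and the hop-count proxy are invented for illustration; the paper's actual training data come from its data-acquisition stage.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: hop-count vectors to 4 anchors -> true 2-D positions.
n_train, n_anchor, n_hidden = 200, 4, 50
anchors = rng.uniform(0, 10, (n_anchor, 2))
pos = rng.uniform(0, 10, (n_train, 2))
# Crude hop-count proxy: Euclidean distance quantized by a 1-unit radio range.
hops = np.ceil(np.linalg.norm(pos[:, None, :] - anchors[None, :, :], axis=2))

# Regularized ELM: fixed random hidden layer, ridge regression for the output.
W = rng.normal(size=(n_anchor, n_hidden))
b = rng.normal(size=n_hidden)
H = np.tanh(hops @ W + b)                       # hidden-layer activations
lam = 1e-2                                      # regularization strength
beta = np.linalg.solve(H.T @ H + lam * np.eye(n_hidden), H.T @ pos)

# Location estimation for a new node from its hop counts alone.
test_pos = np.array([[5.0, 5.0]])
test_hops = np.ceil(np.linalg.norm(test_pos[:, None, :] - anchors[None, :, :], axis=2))
est = np.tanh(test_hops @ W + b) @ beta
print(est)
```

The closed-form ridge solution is what keeps the training cost low: only the output weights are learned, and only via one linear solve.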
Characteristics and present trends of wave extremes in the Mediterranean Sea
NASA Astrophysics Data System (ADS)
Pino, Cosimo; Lionello, Piero; Galati, Maria Barbara
2010-05-01
Wind-generated surface waves are an important factor characterizing marine storminess and the marine environment. This contribution considers the characteristics and trends of SWH (Significant Wave Height) extremes; both high and low extremes, such as dead-calm duration, are analyzed. The data analysis is based on a 44-year-long simulation (1958-2001) of the wave field in the Mediterranean Sea. The quality of the model simulation is verified using satellite data. The results show the different characteristics of the different parts of the basin, with variability being higher in the western areas (where the highest values are produced) than in the eastern areas (where the absence of waves is a rare condition). In fact, the duration of both storms and dead-calm episodes is longer in the east than in the west of the Mediterranean. The African coast and the southern Ionian Sea are the areas where exceptional values of SWH are expected to occur in correspondence with exceptional meteorological events. Significant trends in storm characteristics are present only in sparse areas and suggest a decrease in both storm intensity and duration (a marginal increase in storm intensity is present in the center of the Mediterranean). The statistics of extremes and high SWH values are substantially steady during the second half of the 20th century. The influence on the intensity and spatial distribution of extreme SWH of the large-scale teleconnection patterns (TlcP) known to be relevant for the Mediterranean climate has been investigated. The analysis focused on the monthly scale, analysing the variability of the links along the annual cycle. The TlcPs considered are the North Atlantic Oscillation, the East Atlantic / West Russian pattern and the Scandinavian pattern, and their effect on the intensity and frequency of high/low SWH conditions.
The results show that it is difficult to establish a dominant TlcP for SWH extremes, because each of the patterns considered is important for at least a few months of the year and none of them is important for the whole year. High extremes in winter and fall are more influenced by the TlcPs than those in other seasons, and more so than low extremes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ashouri, Hamed; Sorooshian, Soroosh; Hsu, Kuo-Lin
This study evaluates the performance of NASA's Modern-Era Retrospective Analysis for Research and Applications (MERRA) precipitation product in reproducing the trend and distribution of extreme precipitation events. Utilizing extreme value theory, time-invariant and time-variant extreme value distributions are developed to model the trends and changes in the patterns of extreme precipitation events over the contiguous United States during 1979-2010. The Climate Prediction Center (CPC) U.S. Unified gridded observation data are used as the observational dataset. The CPC analysis shows that the eastern and western parts of the United States are experiencing positive and negative trends in annual maxima, respectively. The continental-scale patterns of change found in MERRA seem to reasonably mirror the observed patterns of change found in CPC. This was not expected a priori, given the difficulty in constraining precipitation in reanalysis products. MERRA tends to overestimate the frequency at which the 99th percentile of precipitation is exceeded because this threshold tends to be lower in MERRA, making it easier to exceed. This feature is dominant during the summer months. MERRA tends to reproduce the spatial patterns of the scale and location parameters of the generalized extreme value and generalized Pareto distributions. However, MERRA underestimates these parameters, particularly over the Gulf Coast states, leading to lower magnitudes in extreme precipitation events. Two issues in MERRA are identified: 1) MERRA shows a spurious negative trend in Nebraska and Kansas, most likely related to changes in the satellite observing system over time that have apparently affected the water cycle in the central United States, and 2) the patterns of positive trend over the Gulf Coast states and along the East Coast seem to be correlated with tropical cyclones in these regions.
The analysis of the trends in the seasonal precipitation extremes indicates that the hurricane and winter seasons contribute the most to these trend patterns in the southeastern United States. The increasing annual trend simulated by MERRA in the Gulf Coast region is due to an incorrect trend in winter precipitation extremes.
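The GEV machinery referred to above can be sketched briefly. The snippet below fits a generalized extreme value distribution to a synthetic annual-maximum series and computes a 50-yr return level; the numbers are invented, and SciPy's shape convention (c = -xi) is assumed.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(42)
# Synthetic "annual maximum precipitation" series (mm/day), standing in for
# the CPC or MERRA block maxima analyzed in the study.
annual_max = genextreme.rvs(c=-0.1, loc=60.0, scale=10.0, size=32, random_state=rng)

# Fit the GEV and compute the 50-yr return level: the value exceeded with
# probability 1/50 in any given year.
c, loc, scale = genextreme.fit(annual_max)
rl50 = genextreme.ppf(1 - 1 / 50, c, loc, scale)
print(f"50-yr return level = {rl50:.1f} mm/day")
```

Fitting a generalized Pareto to threshold exceedances (the study's other model) follows the same pattern with `scipy.stats.genpareto`.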
Reach a nonlinear consensus for MAS via doubly stochastic quadratic operators
NASA Astrophysics Data System (ADS)
Abdulghafor, Rawad; Turaev, Sherzod; Zeki, Akram; Al-Shaikhli, Imad
2018-06-01
This technical note addresses a new nonlinear protocol class of doubly stochastic quadratic operators (DSQOs) for coordinating the consensus problem in multi-agent systems (MAS). We derive conditions ensuring that every agent reaches consensus on a desired rate of the group decision even when the agents' initial statuses vary. We also investigate a nonlinear protocol sub-class, extreme DSQO (EDSQO), that drives the MAS to consensus on a common value with nonlinear low-complexity rules and fast convergence, provided that the interactions of each agent are not selfish. In addition, to extend the results and avoid the selfish case, we specify a general class of DSQO that reaches consensus under any given initial states: MAS reach consensus under DSQO if each agent in the group has positive interactions of DSQO (PDSQO) with the others. The convergence of both the EDSQO and PDSQO classes is found to be directed towards the centre point. Finally, experimental simulations are given to support the theoretical analysis.
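A minimal numerical illustration of the centre-point convergence mentioned above, using the linear special case of a doubly stochastic protocol rather than the full quadratic DSQO dynamics:

```python
import numpy as np

# With a doubly stochastic weight matrix (rows AND columns sum to 1),
# repeated averaging drives every agent to the centre (mean) of the
# initial states -- the behaviour the note proves for EDSQO/PDSQO in
# the richer quadratic setting.
W = np.array([[0.50, 0.25, 0.25],
              [0.25, 0.50, 0.25],
              [0.25, 0.25, 0.50]])

x = np.array([1.0, 5.0, 9.0])   # initial agent states; their mean is 5.0
for _ in range(100):
    x = W @ x                    # linear consensus update
print(x)                         # every agent converges to 5.0
```

Double stochasticity is what guarantees the limit is the unweighted average: row-stochasticity alone would give consensus, but on a weighted combination of initial states.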
Vibration isolation using extreme geometric nonlinearity
NASA Astrophysics Data System (ADS)
Virgin, L. N.; Santillan, S. T.; Plaut, R. H.
2008-08-01
A highly deformed, slender beam (or strip), attached to a vertically oscillating base, is used in a vibration isolation application to reduce the motion of a supported mass. The isolator is a thin strip that is bent so that the two ends are clamped together, forming a loop. The clamped ends are attached to an excitation source and the supported system is attached at the loop midpoint directly above the base. The strip is modeled as an elastica, and the resulting nonlinear boundary value problem is solved numerically using a shooting method. First the equilibrium shapes of the loop with varying static loads and lengths are studied. The analysis reveals a large degree of stiffness tunability; the stiffness is dependent on the geometric configuration, which itself is determined by the supported mass, loop length, and loop self-weight. Free vibration frequencies and mode shapes are also found. Finally, the case of forced vibration is studied, and the displacement transmissibility over a large range of forcing frequencies is determined for varying parameter values. Experiments using polycarbonate strips are conducted to verify equilibrium and dynamic behavior.
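The shooting method used for the elastica boundary value problem can be illustrated on a much simpler stand-in BVP; the equation below is chosen only because its exact solution is known, not because it resembles the elastica equations:

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import brentq

# Stand-in BVP: y'' = -y with y(0) = 0, y(pi/2) = 1.
# Shooting: guess the unknown initial slope s = y'(0), integrate the IVP,
# and root-find on the mismatch at the far boundary -- the same strategy
# used for the clamped elastica loop.
def residual(s):
    sol = solve_ivp(lambda t, y: [y[1], -y[0]], (0, np.pi / 2), [0.0, s],
                    rtol=1e-9, atol=1e-9)
    return sol.y[0, -1] - 1.0    # mismatch in y at the right boundary

s_star = brentq(residual, 0.1, 5.0)
print(s_star)   # ~1.0, since the exact solution is y = sin(t)
```

For the elastica, the state vector carries more components (angle, curvature, forces), but the structure is identical: unknown initial conditions are tuned until the far-end clamping conditions are met.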
Cumulative hazard: The case of nuisance flooding
NASA Astrophysics Data System (ADS)
Moftakhari, Hamed R.; AghaKouchak, Amir; Sanders, Brett F.; Matthew, Richard A.
2017-02-01
The cumulative cost of frequent events (e.g., nuisance floods) over time may exceed the costs of the extreme but infrequent events for which societies typically prepare. Here we analyze the likelihood of exceedances above mean higher high water and the corresponding property value exposure for minor, major, and extreme coastal floods. Our results suggest that, in response to sea level rise, nuisance flooding (NF) could generate property value exposure comparable to, or larger than, extreme events. Determining whether (and when) low cost, nuisance incidents aggregate into high cost impacts and deciding when to invest in preventive measures are among the most difficult decisions for policymakers. It would be unfortunate if efforts to protect societies from extreme events (e.g., 0.01 annual probability) left them exposed to a cumulative hazard with enormous costs. We propose a Cumulative Hazard Index (CHI) as a tool for framing the future cumulative impact of low cost incidents relative to infrequent extreme events. CHI suggests that in New York, NY, Washington, DC, Miami, FL, San Francisco, CA, and Seattle, WA, a careful consideration of socioeconomic impacts of NF for prioritization is crucial for sustainable coastal flood risk management.
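The arithmetic behind the cumulative-hazard argument can be made concrete with a back-of-envelope sketch; all dollar figures and frequencies below are hypothetical, not values from the study:

```python
# Compare the cumulative cost of frequent nuisance floods with the expected
# cost of a rare extreme event over a planning horizon (all numbers assumed).
years = 30
nuisance_per_year, nuisance_cost = 12, 0.4e6   # 12 minor floods/yr at $0.4M each
extreme_prob, extreme_cost = 0.01, 120e6       # 100-yr event costing $120M

cum_nuisance = years * nuisance_per_year * nuisance_cost
cum_extreme = years * extreme_prob * extreme_cost   # expected (probability-weighted)
print(cum_nuisance, cum_extreme)   # nuisance total ~$144M vs expected extreme ~$36M
```

With these assumed numbers the aggregated nuisance cost dominates, which is exactly the situation a Cumulative Hazard Index is meant to flag for policymakers.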
Extreme events and event size fluctuations in biased random walks on networks.
Kishore, Vimal; Santhanam, M S; Amritkar, R E
2012-05-01
Random walks on discrete lattice models are important for understanding various types of transport processes. Extreme events, defined as exceedances of the flux of walkers above a prescribed threshold, have been studied recently in the context of complex networks, motivated by the occurrence of rare events such as traffic jams, floods, and power blackouts that take place on networks. In this work, we study extreme events in a generalized random walk model in which the walk is preferentially biased by the network topology: the walkers preferentially hop toward hubs or toward small-degree nodes. In this setting, we show that extremely large fluctuations in event sizes are possible on small-degree nodes when the walkers are biased toward the hubs. In particular, we obtain the distribution of event sizes on the network. Further, the probability for the occurrence of extreme events on any node depends on its "generalized strength," a measure of the ability of a node to attract walkers; the generalized strength is a function of the degree of the node and that of its nearest neighbors. We obtain analytical and simulation results for the probability of occurrence of extreme events on the nodes of a network using this generalized random walk model. The results reveal that nodes with a larger value of generalized strength, on average, display a lower probability of extreme events than nodes with lower generalized strength.
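A minimal simulation in the spirit of the biased walk described above; the bias rule, toy network and per-node threshold are assumptions for illustration, and the paper's model and generalized-strength analysis are richer:

```python
import numpy as np

rng = np.random.default_rng(1)

# Degree-biased walk: a walker at node i hops to neighbour j with probability
# proportional to deg(j)**alpha (alpha > 0 biases toward hubs).
# Toy hub-plus-ring network in adjacency-list form; node 0 is the hub.
adj = {0: [1, 2, 3, 4],
       1: [0, 2], 2: [0, 1, 3], 3: [0, 2, 4], 4: [0, 3]}
deg = {n: len(v) for n, v in adj.items()}
alpha = 1.0
n_walkers, n_steps = 500, 200

nodes = rng.integers(0, 5, n_walkers)        # random initial placement
counts = np.zeros((n_steps, 5), dtype=int)   # flux (occupancy) per node per step
for t in range(n_steps):
    new = np.empty_like(nodes)
    for w, i in enumerate(nodes):
        nbrs = adj[i]
        p = np.array([deg[j] ** alpha for j in nbrs], dtype=float)
        new[w] = rng.choice(nbrs, p=p / p.sum())
    nodes = new
    counts[t] = np.bincount(nodes, minlength=5)

# Extreme event on a node: flux above mean + 2 sigma of that node's series.
thresh = counts.mean(0) + 2 * counts.std(0)
n_extreme = (counts > thresh).sum(0)
print(n_extreme)   # exceedance counts per node
```

Counting exceedances of a node-specific threshold is the simulation analogue of the flux-exceedance definition of extreme events used in the paper.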
Caldwell-Harris, Catherine L; Ayçiçegi, Ayse
2006-09-01
Because humans need both autonomy and interdependence, persons with either an extreme collectivist orientation (allocentrics) or extreme individualist values (idiocentrics) may be at risk for possession of some features of psychopathology. Is an extreme personality style a risk factor primarily when it conflicts with the values of the surrounding society? Individualism-collectivism scenarios and a battery of clinical and personality scales were administered to nonclinical samples of college students in Boston and Istanbul. For students residing in a highly individualistic society (Boston), collectivism scores were positively correlated with depression, social anxiety, obsessive-compulsive disorder and dependent personality. Individualism scores, particularly horizontal individualism, were negatively correlated with these same scales. A different pattern was obtained for students residing in a collectivist culture, Istanbul. Here individualism (and especially horizontal individualism) was positively correlated with scales for paranoid, schizoid, narcissistic, borderline and antisocial personality disorder. Collectivism (particularly vertical collectivism) was associated with low report of symptoms on these scales. These results indicate that having a personality style which conflicts with the values of society is associated with psychiatric symptoms. Having an orientation inconsistent with societal values may thus be a risk factor for poor mental health.
NASA Astrophysics Data System (ADS)
Xu, Ying; Gao, Xuejie; Giorgi, Filippo; Zhou, Botao; Shi, Ying; Wu, Jie; Zhang, Yongxiang
2018-04-01
Future changes in the 50-yr return level for temperature and precipitation extremes over mainland China are investigated based on a CMIP5 multi-model ensemble for the RCP2.6, RCP4.5 and RCP8.5 scenarios. The following indices are analyzed: TXx and TNn (the annual maximum and minimum of daily maximum and minimum surface temperature), RX5day (the annual maximum consecutive 5-day precipitation) and CDD (the maximum annual number of consecutive dry days). After first validating model performance, future changes in the 50-yr return values and return periods for these indices are investigated along with the inter-model spread. Multi-model median changes show an increase in the 50-yr return values of TXx and a decrease for TNn. More specifically, by the end of the 21st century under RCP8.5, the present-day 50-yr return period of warm events is reduced to 1.2 yr, while extreme cold events over the country are projected to essentially disappear. A general increase in RX5day 50-yr return values is found in the future. By the end of the 21st century under RCP8.5, the return period of present-day RX5day 50-yr events is projected to fall below 10 yr over most of China. Changes in CDD-50 show a dipole pattern over China, with a decrease in the values and longer return periods in the north, and vice versa in the south. Our study also highlights the need for further improvements in the representation of extreme events in climate models to assess future risks and support engineering design related to large-scale infrastructure in China.
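The return-period arithmetic behind statements like "the 50-yr warm event becomes a 1.2-yr event" can be sketched as follows; the GEV parameters are invented for illustration and do not reproduce the study's numbers:

```python
import numpy as np
from scipy.stats import genextreme

# Present-day and warmed TXx distributions (parameters assumed; SciPy's
# shape convention c = -xi is used).
c = -0.1
present = genextreme(c, loc=36.0, scale=1.5)   # present-day TXx, deg C
future = genextreme(c, loc=40.0, scale=1.5)    # location shifted by warming

x50 = present.ppf(1 - 1 / 50)   # present-day 50-yr return value
T_future = 1 / future.sf(x50)   # return period of that same value in the future
print(round(T_future, 2))       # much shorter than 50 yr
```

The same exceedance-probability inversion, applied in reverse, converts a changed return value back into a changed return period for cold or precipitation extremes.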
Extreme air-sea surface turbulent fluxes in mid latitudes - estimation, origins and mechanisms
NASA Astrophysics Data System (ADS)
Gulev, Sergey; Natalia, Tilinina
2014-05-01
Extreme turbulent heat fluxes in the North Atlantic and North Pacific mid latitudes were estimated from the modern-era and first-generation reanalyses (NCEP-DOE, ERA-Interim, MERRA, NCEP-CFSR, JRA-25) for the period from 1979 onwards. We used direct surface turbulent flux output as well as reanalysis state variables from which fluxes were computed using the COARE-3 bulk algorithm. To estimate extreme flux values we analyzed the surface flux probability density distribution, which was approximated by the Modified Fisher-Tippett (MFT) distribution. In all reanalyses extreme turbulent heat fluxes amount to 1500-2000 W/m2 (for the 99th percentile) and can exceed 2000 W/m2 for higher percentiles in the western boundary current extension (WBCE) regions. Different reanalyses show significantly different shapes of the MFT distribution, implying considerable differences in the estimates of extreme fluxes. The highest extreme turbulent latent heat fluxes are diagnosed in the NCEP-DOE, ERA-Interim and NCEP-CFSR reanalyses, with the smallest in MERRA. These differences may not necessarily reflect differences in mean values. Analysis shows that differences in the statistical properties of the state variables are the major source of differences in the shape of the PDF of fluxes and in the estimates of extreme fluxes, while the contribution of the computational schemes used in different reanalyses is minor. The strongest differences in the characteristics of the probability distributions of surface fluxes and extreme surface flux values between different reanalyses are found in the WBCE regions and at high latitudes. We then analyzed the mechanisms responsible for forming surface turbulent fluxes and their potential role in changes of the midlatitudinal heat balance.
Midlatitudinal cyclones were considered the major mechanism responsible for extreme turbulent fluxes, which typically occur during cold-air outbreaks in the rear parts of cyclones, when atmospheric conditions provide locally high winds and air-sea temperature gradients. For this purpose we linked characteristics of cyclone activity over the midlatitudinal oceans with the extreme surface turbulent heat fluxes. Cyclone tracks and parameters of the cyclone life cycle (deepening rates, propagation velocities, lifetime and clustering) were derived from the same reanalyses using a state-of-the-art numerical tracking algorithm. The main questions addressed in this study are (i) through which mechanisms are extreme surface fluxes associated with cyclone activity, and (ii) which types of cyclones are responsible for forming extreme turbulent fluxes? Our analysis shows that extreme surface fluxes are typically associated not with cyclones themselves but rather with cyclone-anticyclone interaction zones. This implies that North Atlantic and North Pacific series of intense cyclones do not result in anomalous surface fluxes. Instead, extreme fluxes are most frequently associated with blocking situations, particularly with the intensification of the Siberian and North American anticyclones providing cold-air outbreaks over WBC regions.
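For orientation, the order of magnitude of these extreme latent heat fluxes can be checked with the generic bulk aerodynamic formula. The full COARE-3 algorithm additionally computes stability-dependent transfer coefficients iteratively; the values below are illustrative cold-air-outbreak conditions, not data from the study:

```python
# Bulk aerodynamic estimate of the turbulent latent heat flux,
# Q_lat = rho * Lv * Ce * U * (q_s - q_a).
rho = 1.2        # air density, kg/m^3
Lv = 2.5e6       # latent heat of vaporization, J/kg
Ce = 1.2e-3      # moisture exchange coefficient (COARE-3 derives this iteratively)
U = 25.0         # 10-m wind speed, m/s (storm conditions, assumed)
q_s = 0.018      # saturation specific humidity at the SST, kg/kg (assumed)
q_a = 0.004      # near-surface air specific humidity, kg/kg (dry continental air)

Q_lat = rho * Lv * Ce * U * (q_s - q_a)
print(f"{Q_lat:.0f} W/m^2")   # order 10^3 W/m^2, the range quoted for WBCE extremes
```

High wind combined with a large air-sea humidity contrast, exactly the cold-air-outbreak configuration described above, is what pushes the product into the 1500-2000 W/m2 range.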
Advanced Dynamically Adaptive Algorithms for Stochastic Simulations on Extreme Scales
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xiu, Dongbin
2017-03-03
The focus of the project is the development of mathematical methods and high-performance computational tools for stochastic simulations, with a particular emphasis on computations on extreme scales. The core of the project revolves around the design of highly efficient and scalable numerical algorithms that can adaptively and accurately, in high dimensional spaces, resolve stochastic problems with limited smoothness, even containing discontinuities.
NASA Astrophysics Data System (ADS)
Rueda, A.; Alvarez Antolinez, J. A.; Hegermiller, C.; Serafin, K.; Anderson, D. L.; Ruggiero, P.; Barnard, P.; Erikson, L. H.; Vitousek, S.; Camus, P.; Tomas, A.; Gonzalez, M.; Mendez, F. J.
2016-02-01
Long-term coastal evolution and coastal flooding hazards are the result of the non-linear interaction of multiple oceanographic, hydrological, geological and meteorological forcings (e.g., astronomical tide, monthly mean sea level, large-scale storm surge, dynamic wave set-up, shoreline evolution, backshore erosion). Additionally, interannual variability and trends in storminess and sea level rise are climate drivers that must be considered. Moreover, the chronology of the hydraulic boundary conditions plays an important role, since a collection of consecutive minor storm events can have more impact than the 100-yr return level event. Therefore, proper modeling of shoreline erosion, beach recovery and coastal flooding should consider the sequence of storms, the multivariate nature of the hydrodynamic forcings, and the different time scales of interest (seasonality, interannual and decadal variability). To address this `beautiful problem', we propose a hybrid approach that combines: (a) numerical hydrodynamic and morphodynamic models (SWAN for wave transformation, a shoreline change model, X-Beach for modeling infragravity waves and erosion of the backshore during extreme events, and RFSM-EDA (Jamieson et al., 2012) for high-resolution flooding of the coastal hinterland); (b) long-term databases (observational and hindcast) of sea state parameters, astronomical tides and non-tidal residuals; and (c) statistical downscaling techniques, non-linear data mining, and extreme value models. The statistical downscaling approaches for multivariate variables are based on circulation patterns (Espejo et al., 2014), the chronology of the circulation patterns (Guanche et al., 2013) and the event hydrographs of multivariate extremes, resulting in a time-dependent climate emulator of hydraulic boundary conditions for coupled simulations of the coastal change and flooding models.
References: Espejo et al. (2014), Spectral ocean wave climate variability based on circulation patterns, J. Phys. Oceanogr., doi:10.1175/JPO-D-13-0276.1; Guanche et al. (2013), Autoregressive logistic regression applied to atmospheric circulation patterns, Clim. Dyn., doi:10.1007/s00382-013-1690-3; Jamieson et al. (2012), A highly efficient 2D flood model with sub-element topography, Proc. of the Inst. Civil Eng., 165(10), 581-595.
Linearized stability of extreme black holes
NASA Astrophysics Data System (ADS)
Burko, Lior M.; Khanna, Gaurav
2018-03-01
Extreme black holes have been argued to be unstable, in the sense that under linearized gravitational perturbations of the extreme Kerr spacetime the Weyl scalar ψ4 blows up along their event horizons at very late advanced times. We show numerically, by solving the Teukolsky equation in 2+1 dimensions, that all algebraically independent curvature scalar polynomials approach limits that exist when advanced time along the event horizon approaches infinity. Therefore, the horizons of extreme black holes are stable against linearized gravitational perturbations. We argue that the divergence of ψ4 is a consequence of the choice of a fixed tetrad, and that in a suitable dynamical tetrad all Weyl scalars, including ψ4, approach their background extreme Kerr values. We make similar conclusions also for the case of scalar field perturbations of extreme Kerr.
Dry seasons identified in oak tree-ring chronology in the Czech Lands over the last millennium
NASA Astrophysics Data System (ADS)
Dobrovolny, Petr; Brazdil, Rudolf; Büntgen, Ulf; Rybnicek, Michal; Kolar, Tomas; Reznickova, Ladislava; Valasek, Hubert; Kotyza, Oldrich
2015-04-01
There is growing evidence of an amplification of hydrological regimes as a consequence of rising temperatures, increased evaporation and changes in circulation patterns. These processes may be responsible for a higher probability of hydroclimatic extremes at the regional scale. Extreme events such as floods or droughts are rare by definition, and long-term proxy archives can be analysed for a better understanding of possible changes in the frequency and intensity of their occurrence. Recently, several tree-ring width chronologies were compiled from hardwood species growing in lowland positions, and their analysis proved that they are moisture-sensitive and suitable for hydroclimate reconstructions. Here, we introduce a new oak (Quercus sp.) ring width (RW) dataset for the Czech Republic covering the last 1250 years. We explain the process of oak chronology standardization, which was based on several only slightly different de-trending techniques, and the subsequent chronology development steps. We hypothesize that the most severe RW increment reductions (negative extremes) reflect extremely dry spring-summer conditions. Negative extremes were assigned to years in which transformed oak RWs were lower than minus 1.5 standard deviations. To verify our hypothesis, we compare typical climatic conditions in negative extreme years with the climatology of the reference period 1961-1990. The comparison was done for various instrumental measurements (1805-2012), existing proxy reconstructions (1500-1804) and documentary evidence from historical archives (before 1500). We found that years of negative extremes are characterized by distinctly above-average spring (MAM) and summer (JJA) air temperatures and below-average precipitation amounts.
The typical sea level pressure distribution in those years shows a positive pressure anomaly over the British Isles and the North Sea, a pattern that synoptically corresponds to a blocking anticyclone bringing warm air from the southwest to Central Europe and low precipitation totals, with a higher probability of drought occurrence. Our results provide a consistent physical explanation of extremely dry seasons occurring in Central Europe. However, direct comparisons of individual RW extreme seasons with existing documentary evidence show the complexity of the problem, as some extremes identified in the oak RW chronology were not confirmed in documentary archives, and vice versa. We discuss possible causes of such differences, related to the fact that various proxies may fail to record the real intensity or duration of extreme events, e.g. due to the non-linear response of proxy data to climate drivers or due to shifts in seasonality.
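The negative-extreme selection rule described above amounts to thresholding a standardized series; a sketch with a synthetic stand-in chronology (the real input would be the de-trended oak RW index):

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic ring-width index for a century of hypothetical years.
years = np.arange(1000, 1100)
rw = rng.normal(1.0, 0.2, years.size)

# Standardize and flag years below -1.5 standard deviations, the paper's
# criterion for negative (dry spring-summer) extremes.
z = (rw - rw.mean()) / rw.std()
negative_extremes = years[z < -1.5]
print(negative_extremes)   # candidate extremely dry seasons
```

The flagged years are then the ones compared against instrumental records, proxy reconstructions and documentary archives.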
NASA Astrophysics Data System (ADS)
Ries, H.; Moseley, C.; Haensler, A.
2012-04-01
Reanalyses depict the state of the atmosphere as a best fit in space and time of many atmospheric observations in a physically consistent way. By essentially solving the data assimilation problem in a very accurate manner, reanalysis results can be used as a reference for model evaluation and as forcing data sets for different model applications. However, the spatial resolution of the most common and accepted reanalysis data sets (e.g. JRA-25, ERA-Interim) ranges from approximately 124 km to 80 km. This resolution is too coarse to simulate certain small-scale processes often associated with extreme events. In addition, many models need higher-resolution forcing data (e.g. land-surface models, tools for identifying and assessing hydrological extremes). Therefore we downscaled the ERA-Interim reanalysis over the EURO-CORDEX domain for the period 1989 to 2008 to a horizontal resolution of approximately 12 km. The downscaling is performed by nudging REMO simulations to the lower and lateral boundary conditions of the reanalysis, and by re-initializing the model every 24 hours ("REMO in forecast mode"). In this study the following three questions are addressed: 1) Does the REMO poor man's reanalysis meet the needs (accuracy, extreme value distribution) for validation and forcing? 2) What lessons can be learned about the model used for downscaling? As REMO is used as a pure downscaling procedure, any systematic deviations from ERA-Interim result from poor process modelling and not from predictability limitations. 3) How much small-scale information generated by the downscaling model is lost with frequent initializations? A comparison to a simulation performed in climate mode will be presented.
Mehler, Katrin; Oberthuer, André; Lang-Roth, Ruth; Kribs, Angela
2014-01-01
Very immature preterm infants are at risk of developing symptomatic or severe infection if cytomegalovirus is transmitted via breast milk. It is still a matter of debate whether human cytomegalovirus (HCMV) infection may lead to long-term sequelae. We hypothesized that symptomatic and severe HCMV infection transmitted via breast milk affects extremely immature infants at a very high rate. In 2012, untreated breast milk was fed to extremely low birth weight infants after parental informed consent was obtained. We retrospectively analyzed data on HCMV infection of infants born in 2012 between 22 and 24 weeks of gestation. 17 infants were born to HCMV IgG-seropositive mothers. 11 (65%) of these were diagnosed with symptomatic infection. In all cases, thrombocytopenia was the reason to analyze the infant's urine. HCMV infection was diagnosed at a median time of 12 weeks after birth. In 5 (45%) infants, thrombocytopenia was the only symptom and resolved without antiviral therapy or platelet transfusion. 6 (55%) infants developed sepsis-like disease with mildly elevated CRP values and showed signs of respiratory failure. 3 (27%) were able to be stabilized on CPAP, 3 (27%) had to be intubated and mechanically ventilated. 4 children were treated with ganciclovir and/or valganciclovir. 55% failed otoacoustic emissions and/or automated auditory brainstem response testing at discharge. In very immature infants born at the border of viability and suffering from multiple preexisting problems, HCMV infection may trigger a severe deterioration of the clinical course. © 2013 S. Karger AG, Basel.
NASA Astrophysics Data System (ADS)
Guan, Wen; Li, Li; Jin, Weiqi; Qiu, Su; Zou, Yan
2015-10-01
Extreme-Low-Light CMOS has been widely applied in the field of night vision as a new type of solid-state image sensor. However, if the illumination in the scene changes drastically or is too strong, Extreme-Low-Light CMOS cannot clearly present both the high-light and low-light regions of the scene. To address this partial saturation problem in night vision, an HDR image fusion algorithm based on the Laplacian pyramid was investigated. The overall gray value and contrast of the low-light image are very low. For the top layer of the long-exposure and short-exposure images, which carries rich brightness and textural features, we choose a fusion strategy based on regional average gradient. The remaining layers, which represent the edge feature information of the target, are fused using a strategy based on regional energy. In the reconstruction of the source image from the Laplacian pyramid, we compare the fusion results with four kinds of base images. The algorithm is tested using Matlab and compared with the different fusion strategies. We use three objective evaluation parameters (information entropy, average gradient, and standard deviation) for further analysis of the fusion results. Experiments in different low-illumination environments show that the algorithm in this paper can rapidly achieve a wide dynamic range while keeping high entropy. Given these verified properties, the optimized algorithm has promising prospects for further application. Keywords: high dynamic range imaging, image fusion, multi-exposure image, weight coefficient, information fusion, Laplacian pyramid transform.
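The pyramid-based fusion described above can be sketched as follows. This is a minimal NumPy illustration, assuming image dimensions divisible by 2^levels; the window sizes and the exact weighting rules of the paper's regional-gradient and regional-energy strategies are not specified here, so simple per-pixel stand-ins are used.

```python
import numpy as np

def down(img):
    """Halve resolution by 2x2 block averaging (assumes even dimensions)."""
    h, w = img.shape
    return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def up(img, shape):
    """Upsample by pixel repetition to the given shape."""
    return img.repeat(2, axis=0).repeat(2, axis=1)[:shape[0], :shape[1]]

def laplacian_pyramid(img, levels):
    """Detail (Laplacian) levels plus a coarse top level."""
    pyr, cur = [], img.astype(float)
    for _ in range(levels - 1):
        nxt = down(cur)
        pyr.append(cur - up(nxt, cur.shape))  # band-pass detail
        cur = nxt
    pyr.append(cur)  # top level carries overall brightness
    return pyr

def fuse(pyr_a, pyr_b):
    """Fuse two pyramids: max-energy rule on detail levels, an
    average-gradient weight on the top level (illustrative choices)."""
    fused = [np.where(np.abs(a) >= np.abs(b), a, b)
             for a, b in zip(pyr_a[:-1], pyr_b[:-1])]
    ta, tb = pyr_a[-1], pyr_b[-1]
    ga = np.abs(np.gradient(ta)).mean()
    gb = np.abs(np.gradient(tb)).mean()
    fused.append((ga * ta + gb * tb) / (ga + gb + 1e-12))
    return fused

def reconstruct(pyr):
    """Collapse the pyramid back to a full-resolution image."""
    cur = pyr[-1]
    for lvl in reversed(pyr[:-1]):
        cur = lvl + up(cur, lvl.shape)
    return cur
```

Reconstructing an unfused pyramid recovers the source image exactly, which is what makes the per-level fusion rules safe to mix.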
Atmospheric River Frequency and Intensity Changes in CMIP5 Climate Model Projections
NASA Astrophysics Data System (ADS)
Warner, M.; Mass, C.; Salathe, E. P., Jr.
2012-12-01
Most extreme precipitation events that occur along the North American west coast are associated with narrow plumes of above-average water vapor concentration that stretch from the tropics or subtropics to the West Coast. These events generally occur during the wet season (October-March) and are referred to as atmospheric rivers (AR). ARs can cause major river management problems, damage from flooding or landslides, and loss of life. It is currently unclear how these events will change in frequency and intensity as a result of climate change in the coming century. While climate models' global mean precipitation matches observations reasonably well in historical runs, precipitation frequency and intensity are generally poorly represented at local scales; however, synoptic-scale features are more realistically simulated by climate models, and AR events can be identified by extremely high values of integrated water vapor flux at points near the West Coast. There have been many recent studies indicating changes in synoptic-scale features under climate change that could have meaningful impacts on the frequency and intensity of ARs. In this study, a suite of CMIP5 models are used to analyze predicted changes in frequency and intensity of AR events impacting the West Coast from the contemporary period (1970-1999) to the end of this century (2070-2099). Generally, integrated water vapor is predicted to increase in these models (both the mean and extremes) while low-level wind decreases and upper-level wind increases. This study aims to determine the influence of these changes on precipitation intensity in AR events in future climate simulations.
Detection of Extremes with AIRS and CrIS
NASA Technical Reports Server (NTRS)
Aumann, Hartmut H.; Manning, Evan M.; Behrangi, Ali
2013-01-01
Climate change is expected to be detected first as changes in extreme values rather than in mean values. The availability of data from two instruments in the same orbit (AIRS data for the past eleven years, and both AIRS and CrIS data from the past year) provides an opportunity to evaluate this using examples of climate relevance: desertification, seen as changes in hot extremes; severe storms, seen as changes in extremely cold clouds; and the warming of the polar zone. We use AIRS to establish trends for the 1%tile, the mean, and the 99%tile brightness temperatures measured with the 900 cm^-1 channel from AIRS for the past 11 years. This channel is in the clearest part of the 11 micron atmospheric window. Substantial trends are seen for land and ocean, which in the case of the 1%tile (cold) extremes are related to the current shift of deep convection from ocean to land. Changes are also seen in the 99%tile for day tropical land, but their interpretation is at present unclear. We also see dramatic changes for the mean and 99%tile of the North Polar area. The trends are an order of magnitude larger than the instrument trend of about 3 mK/year. We use the statistical distribution from the past year derived from AIRS to evaluate the accuracy of continuing the trends established with AIRS using CrIS data. We minimize the concern about differences in the spectral response functions by limiting the analysis to the channel at 900 cm^-1. While the two instruments agree within 100 mK for the global day/night land/ocean mean values, there are significant differences when evaluating the 1% and 99%tiles. We see a consistent warm bias in the CrIS data relative to AIRS for the 1%tile (extremely cold, cloudy) data in the tropical zone, particularly for tropical land, but the bias is not consistent across day/night and land/ocean. At this point the difference appears to be due to differences in the radiometric response of AIRS and CrIS to different day/night land/ocean cloud types.
Unless the effect can be mitigated by a future reprocessing of the CrIS data, it will significantly complicate the concatenation of the AIRS and CrIS data records for the continuation of trends in extreme values.
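Trend estimation on distribution tails like the 1%tile and 99%tile above can be sketched as follows. This is a generic illustration on synthetic data, not the AIRS/CrIS processing itself; the percentile choices and the ordinary-least-squares fit are assumptions.

```python
import numpy as np

def percentile_trends(years, samples, q=(1, 50, 99)):
    """For each requested percentile, compute that percentile of each
    year's sample and return its OLS slope (units per year)."""
    trends = {}
    for p in q:
        vals = np.array([np.percentile(s, p) for s in samples])
        slope, _intercept = np.polyfit(years, vals, 1)
        trends[p] = slope
    return trends
```

With enough samples per year, the median (50%tile) slope recovers an injected warming rate closely, while tail percentiles are noisier, which is why long records matter for extremes.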
Existence and non-uniqueness of similarity solutions of a boundary-layer problem
NASA Technical Reports Server (NTRS)
Hussaini, M. Y.; Lakin, W. D.
1986-01-01
A Blasius boundary value problem with inhomogeneous lower boundary conditions f(0) = 0 and f'(0) = -lambda with lambda strictly positive was considered. The Crocco variable formulation of this problem has a key term which changes sign in the interval of interest. It is shown that solutions of the boundary value problem do not exist for values of lambda larger than a positive critical value lambda_c. The existence of solutions is proven for 0 < lambda < lambda_c by considering an equivalent initial value problem. It is found, however, that for 0 < lambda < lambda_c, solutions of the boundary value problem are nonunique. Physically, this nonuniqueness is related to multiple values of the skin friction.
Existence and non-uniqueness of similarity solutions of a boundary layer problem
NASA Technical Reports Server (NTRS)
Hussaini, M. Y.; Lakin, W. D.
1984-01-01
A Blasius boundary value problem with inhomogeneous lower boundary conditions f(0) = 0 and f'(0) = -lambda with lambda strictly positive was considered. The Crocco variable formulation of this problem has a key term which changes sign in the interval of interest. It is shown that solutions of the boundary value problem do not exist for values of lambda larger than a positive critical value lambda_c. The existence of solutions is proven for 0 < lambda < lambda_c by considering an equivalent initial value problem. It is found, however, that for 0 < lambda < lambda_c, solutions of the boundary value problem are nonunique. Physically, this nonuniqueness is related to multiple values of the skin friction.
Rare event simulation in radiation transport
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kollman, Craig
1993-10-01
This dissertation studies methods for estimating extremely small probabilities by Monte Carlo simulation. Problems in radiation transport typically involve estimating very rare events or the expected value of a random variable which is with overwhelming probability equal to zero. These problems often have high-dimensional state spaces and irregular geometries, so analytic solutions are not possible. Monte Carlo simulation must be used to estimate the radiation dosage being transported to a particular location. If the area is well shielded, the probability of any one particular particle getting through is very small. Because of the large number of particles involved, even a tiny fraction penetrating the shield may represent an unacceptable level of radiation. It therefore becomes critical to be able to accurately estimate this extremely small probability. Importance sampling is a well-known technique for improving the efficiency of rare event calculations. Here, a new set of probabilities is used in the simulation runs. The results are multiplied by the likelihood ratio between the true and simulated probabilities so as to keep the estimator unbiased. The variance of the resulting estimator is very sensitive to which new set of transition probabilities is chosen. It is shown that a zero-variance estimator does exist, but that its computation requires exact knowledge of the solution. A simple random walk with an associated killing model for the scatter of neutrons is introduced. Large deviation results for optimal importance sampling in random walks are extended to the case where killing is present. An adaptive "learning" algorithm for implementing importance sampling is given for more general Markov chain models of neutron scatter. For finite state spaces this algorithm is shown to give, with probability one, a sequence of estimates converging exponentially fast to the true solution.
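The likelihood-ratio reweighting described above can be illustrated on a toy rare event, a Gaussian tail probability rather than a transport problem. The exponentially tilted proposal N(a, 1) is our choice for illustration, not the dissertation's transition probabilities.

```python
import math
import random

def tail_prob_importance(a, n=200_000, seed=1):
    """Estimate P(Z > a) for Z ~ N(0,1) by sampling from the tilted
    proposal N(a, 1) and reweighting each hit by the likelihood ratio
    phi(x) / phi(x - a) = exp(-a*x + a^2/2), keeping the estimator unbiased."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(a, 1.0)  # proposal centred on the rare region
        if x > a:
            total += math.exp(-a * x + a * a / 2.0)
    return total / n
```

Naive Monte Carlo with the same budget would almost never see an exceedance of a = 6 (probability about 1e-9), while the tilted estimator recovers it with a few percent relative error.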
NASA Astrophysics Data System (ADS)
Kasprzyk, J. R.; Reed, P. M.; Characklis, G. W.; Kirsch, B. R.
2010-12-01
This paper proposes and demonstrates a new interactive framework for sensitivity-informed de novo programming, in which a learning approach to formulating decision problems can confront the deep uncertainty within water management problems. The framework couples global sensitivity analysis using Sobol’ variance decomposition with multiobjective evolutionary algorithms (MOEAs) to generate planning alternatives and test their robustness to new modeling assumptions and scenarios. We explore these issues within the context of a risk-based water supply management problem, where a city seeks the most efficient use of a water market. The case study examines a single city’s water supply in the Lower Rio Grande Valley (LRGV) in Texas, using both a 10-year planning horizon and an extreme single-year drought scenario. The city’s water supply portfolio comprises a volume of permanent rights to reservoir inflows and use of a water market through anticipatory thresholds for acquiring transfers of water through optioning and spot leases. Diagnostic information from the Sobol’ variance decomposition is used to create a sensitivity-informed problem formulation testing different decision variable configurations, with tradeoffs for the formulation solved using a MOEA. Subsequent analysis uses the drought scenario to expose tradeoffs between long-term and short-term planning and illustrate the impact of deeply uncertain assumptions on water availability in droughts. The results demonstrate water supply portfolios’ efficiency, reliability, and utilization of transfers in the water supply market and show how to adaptively improve the value and robustness of our problem formulations by evolving our definition of optimality to discover key tradeoffs.
NASA Astrophysics Data System (ADS)
Otero, L. J.; Ortiz-Royero, J. C.; Ruiz-Merchan, J. K.; Higgins, A. E.; Henriquez, S. A.
2016-02-01
The aim of this study is to determine the contribution and importance of cold fronts and storms to extreme waves in different areas of the Colombian Caribbean, in an attempt to determine the extent of the threat posed by the flood processes to which these coastal populations are exposed. Furthermore, the study seeks to establish the actions to which coastal engineering constructions should be subject. In the design of maritime constructions, the most important parameter is the wave height. For this reason, it is necessary to establish the design wave height that a coastal engineering structure should resist. This wave height varies according to the return period considered. The significant wave height values for the study areas were calculated in accordance with Gumbel's extreme value methodology. The methodology was evaluated using data from the reanalysis of the spectral National Oceanic and Atmospheric Administration (NOAA) WAVEWATCH III® (WW3) model for 15 points along the 1600 km of the Colombian Caribbean coastline (continental and insular) between the years 1979 and 2009. The results demonstrated that the extreme waves caused by tropical cyclones and those caused by cold fronts have different effects along the Colombian Caribbean coast. Storms and hurricanes are of greater importance in the Guajira Peninsula (Alta Guajira). In the central area (consisting of Baja Guajira and the cities of Santa Marta, Barranquilla, and Cartagena), the strong impact of cold fronts on extreme waves is evident. However, in the southern region of the Colombian Caribbean coast (ranging from the Gulf of Morrosquillo to the Gulf of Urabá), the extreme values of wave heights are lower than in the previously mentioned regions, despite being dominated mainly by the passage of cold fronts. Extreme waves in the San Andrés and Providencia insular region present a different dynamic from that in the continental area due to their geographic location.
The wave heights in the extreme regime are similar in magnitude to those found in Alta Guajira, but the extreme waves associated with the passage of cold fronts in this region have lower return periods than those associated with the hurricane season.
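Gumbel's extreme value methodology used above can be sketched as follows. This is a method-of-moments fit to a sample of annual maxima; the study's exact estimator and data handling are not specified here, so this is only a generic first pass.

```python
import math

def gumbel_fit(maxima):
    """Method-of-moments fit of a Gumbel distribution to annual maxima:
    scale from the sample variance, location from the sample mean."""
    n = len(maxima)
    mean = sum(maxima) / n
    var = sum((x - mean) ** 2 for x in maxima) / (n - 1)
    beta = math.sqrt(6.0 * var) / math.pi        # scale parameter
    mu = mean - 0.5772156649 * beta              # location (Euler-Mascheroni)
    return mu, beta

def return_level(mu, beta, T):
    """Design value exceeded on average once every T years:
    the (1 - 1/T) quantile of the fitted Gumbel distribution."""
    return mu - beta * math.log(-math.log(1.0 - 1.0 / T))
```

A 100-year design wave height is then simply return_level(mu, beta, 100), which is always larger than the 10-year level for a fitted positive scale.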
NASA Astrophysics Data System (ADS)
Etemadi, Halimeh; Samadi, S. Zahra; Sharifikia, Mohammad; Smoak, Joseph M.
2016-10-01
Mangrove wetlands exist in the transition zone between terrestrial and marine environments and have remarkable ecological and socio-economic value. This study uses climate change downscaling to address the question of non-stationarity influences on mangrove variations (expansion and contraction) within an arid coastal region. Our two-step approach includes downscaling models and uncertainty assessment, followed by a non-stationary and trend procedure using the Extreme Value Analysis (extRemes code). The Long Ashton Research Station Weather Generator (LARS-WG) model along with two different general circulation models (GCMs) (MIHR and HadCM3) were used to downscale climatic variables during current (1968-2011) and future (2011-2030, 2045-2065, and 2080-2099) periods. Parametric and non-parametric bootstrapping uncertainty tests demonstrated that the LARS-WG model skillfully downscaled climatic variables at the 95 % significance level. Downscaling results using the MIHR model show that minimum and maximum temperatures will increase in the future (2011-2030, 2045-2065, and 2080-2099) during winter and summer in a range of +4.21 and +4.7 °C, and +3.62 and +3.55 °C, respectively. HadCM3 analysis also revealed an increase in minimum (˜+3.03 °C) and maximum (˜+3.3 °C) temperatures during wet and dry seasons. In addition, we examined how much mangrove area has changed during the past decades and, thus, whether climate change non-stationarity impacts mangrove ecosystems. Our results using remote sensing techniques and the non-parametric Mann-Whitney two-sample test indicated a sharp decline in mangrove area during the 1972, 1987, and 1997 periods (p value = 0.002).
Non-stationary assessment using the generalized extreme value (GEV) distribution, including mangrove area as a covariate, further indicated that the null hypothesis of a stationary climate (no trend) should be rejected due to the very low p values for precipitation (p value = 0.0027) and for minimum (p value = 0.000000029) and maximum (p value = 0.00016) temperatures. Based on the non-stationary analysis and an upward trend in downscaled temperature extremes, climate change may control mangrove development in the future.
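The stationarity test described above can be sketched as a likelihood-ratio comparison between a stationary extreme value model and one whose location parameter drifts with a covariate. For brevity this sketch uses a Gumbel model (GEV with shape 0) rather than the full GEV that the extRemes package fits, so it is an illustration of the idea, not the study's exact procedure.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import chi2

def gumbel_nll(params, x, cov):
    """Negative log-likelihood of a Gumbel model whose location is
    linear in the covariate: mu = b0 + b1 * cov, scale = exp(log_scale)."""
    b0, b1, log_scale = params
    s = np.exp(log_scale)
    z = (x - (b0 + b1 * cov)) / s
    return np.sum(np.log(s) + z + np.exp(-z))

def trend_lr_test(x, cov):
    """Likelihood-ratio test of b1 = 0 (stationary) against a linear
    drift in location; the statistic is ~ chi-squared with 1 df."""
    x, cov = np.asarray(x, float), np.asarray(cov, float)
    start = np.array([x.mean(), 0.0, np.log(x.std())])
    full = minimize(gumbel_nll, start, args=(x, cov), method="Nelder-Mead")
    red = minimize(lambda p: gumbel_nll([p[0], 0.0, p[1]], x, cov),
                   start[[0, 2]], method="Nelder-Mead")
    stat = 2.0 * (red.fun - full.fun)
    return stat, chi2.sf(stat, df=1)  # small p-value: reject stationarity
```

A very small p-value, as reported for the downscaled temperature extremes above, indicates the stationary model should be rejected.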
1982-02-08
is printed in any year-month block when the extreme value is based on an incomplete month (at least one day missing for the month). When a month has...means, standard deviations, and total number of valid observations for each month and annual (all months). An asterisk (*) is printed in each data block...becomes the extreme or monthly total in any of these tables it is printed as "TRACE." Continued on Reverse Side Values for means and standard
Optical rogue-wave-like extreme value fluctuations in fiber Raman amplifiers.
Hammani, Kamal; Finot, Christophe; Dudley, John M; Millot, Guy
2008-10-13
We report experimental observation and characterization of rogue wave-like extreme value statistics arising from pump-signal noise transfer in a fiber Raman amplifier. Specifically, by exploiting Raman amplification with an incoherent pump, the amplified signal is shown to develop a series of temporal intensity spikes whose peak power follows a power-law probability distribution. The results are interpreted using a numerical model of the Raman gain process using coupled nonlinear Schrödinger equations, and the numerical model predicts results in good agreement with experiment.
Random walkers with extreme value memory: modelling the peak-end rule
NASA Astrophysics Data System (ADS)
Harris, Rosemary J.
2015-05-01
Motivated by the psychological literature on the ‘peak-end rule’ for remembered experience, we perform an analysis within a random walk framework of a discrete choice model where agents’ future choices depend on the peak memory of their past experiences. In particular, we use this approach to investigate whether increased noise/disruption always leads to more switching between decisions. Here extreme value theory illuminates different classes of dynamics indicating that the long-time behaviour is dependent on the scale used for reflection; this could have implications, for example, in questionnaire design.
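The peak-memory dynamics described above can be illustrated with a toy two-option walker. This is a hypothetical sketch, not the paper's exact model: the payoff means, the Gaussian choice noise, and the default score for an unexplored option are all invented for illustration.

```python
import random

def peak_memory_walk(n_steps=1000, noise=0.5, seed=0):
    """Two-option choice model where each option's remembered value is
    the *peak* payoff experienced so far; the agent switches when the
    other option's remembered peak plus noise beats the current one.
    Returns the number of switches over the run."""
    rng = random.Random(seed)
    true_mean = {0: 0.0, 1: 0.2}            # option 1 slightly better on average
    peak = {0: float("-inf"), 1: float("-inf")}
    choice, switches = 0, 0
    for _ in range(n_steps):
        payoff = rng.gauss(true_mean[choice], 1.0)
        peak[choice] = max(peak[choice], payoff)
        other = 1 - choice
        score_cur = peak[choice] + noise * rng.gauss(0.0, 1.0)
        remembered = peak[other] if peak[other] > float("-inf") else 0.0
        score_oth = remembered + noise * rng.gauss(0.0, 1.0)
        if score_oth > score_cur:
            choice, switches = other, switches + 1
    return switches
```

Running this with small versus large noise lets one probe the question raised above, namely whether more disruption always means more switching.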
Varieties of preschool hyperactivity: multiple pathways from risk to disorder.
Sonuga-Barke, Edmund J S; Auerbach, Judith; Campbell, Susan B; Daley, David; Thompson, Margaret
2005-03-01
In this paper we examine the characteristics of preschool attention deficit hyperactivity disorder (ADHD) from both mental disorder and developmental psychopathology points of view. The equivalence of preschool and school-aged hyperactivity as a behavioral dimension is highlighted together with the potential value of extending the use of the ADHD diagnostic category to the preschool period where these behaviours take an extreme and impairing form (assuming age appropriate diagnostic items and thresholds can be developed). At the same time, the importance of identifying pathways between risk and later ADHD is emphasized. Developmental discontinuity and heterogeneity are identified as major characteristics of these pathways. We argue that models that distinguish among different developmental types of early-emerging problems are needed. An illustrative taxonomy of four developmental pathways implicating preschool hyperactivity is presented to provide a framework for future research.
Understanding student use of mathematics in IPLS with the Math Epistemic Games Survey
NASA Astrophysics Data System (ADS)
Eichenlaub, Mark; Hemingway, Deborah; Redish, Edward F.
2017-01-01
We present the Math Epistemic Games Survey (MEGS), a new concept inventory on the use of mathematics in introductory physics for the life sciences. The survey asks questions that are often best answered via techniques commonly valued in physics instruction, including dimensional analysis, checking special or extreme cases, understanding scaling relationships, interpreting graphical representations, estimation, and mapping symbols onto physical meaning. MEGS questions are often rooted in quantitative biology. We present preliminary data on the validation and administration of the MEGS in a large, introductory physics for the life sciences course at the University of Maryland, as well as preliminary results on the clustering of questions and responses as a guide to student resource activation in problem solving. This material is based upon work supported by the US National Science Foundation under Award No. 15-04366.
Non-localization of eigenfunctions for Sturm-Liouville operators and applications
NASA Astrophysics Data System (ADS)
Liard, Thibault; Lissy, Pierre; Privat, Yannick
2018-02-01
In this article, we investigate a non-localization property of the eigenfunctions of Sturm-Liouville operators A_a = -∂_xx + a(·) Id with Dirichlet boundary conditions, where a(·) runs over the bounded nonnegative potential functions on the interval (0, L) with L > 0. More precisely, we address the extremal spectral problem of minimizing the L2-norm of a function e(·) on a measurable subset ω of (0, L), where e(·) runs over all eigenfunctions of A_a, at the same time with respect to all subsets ω having a prescribed measure and all L∞ potential functions a(·) having a prescribed essential upper bound. We provide some existence and qualitative properties of the minimizers, as well as precise lower and upper estimates on the optimal value. Several consequences in control and stabilization theory are then highlighted.
Extreme climatic events change the dynamics and invasibility of semi-arid annual plant communities.
Jiménez, Milagros A; Jaksic, Fabian M; Armesto, Juan J; Gaxiola, Aurora; Meserve, Peter L; Kelt, Douglas A; Gutiérrez, Julio R
2011-12-01
Extreme climatic events represent disturbances that change the availability of resources. We studied their effects on annual plant assemblages in a semi-arid ecosystem in north-central Chile. We analysed 130 years of precipitation data using generalised extreme-value distribution to determine extreme events, and multivariate techniques to analyse 20 years of plant cover data of 34 native and 11 exotic species. Extreme drought resets the dynamics of the system and renders it susceptible to invasion. On the other hand, by favouring native annuals, moderately wet events change species composition and allow the community to be resilient to extreme drought. The probability of extreme drought has doubled over the last 50 years. Therefore, investigations on the interaction of climate change and biological invasions are relevant to determine the potential for future effects on the dynamics of semi-arid annual plant communities. 2011 Blackwell Publishing Ltd/CNRS.
Use of historical information in extreme storm surges frequency analysis
NASA Astrophysics Data System (ADS)
Hamdi, Yasser; Duluc, Claire-Marie; Deville, Yves; Bardet, Lise; Rebour, Vincent
2013-04-01
The prevention of storm surge flood risks is critical for the protection and design of coastal facilities to very low probabilities of failure. Effective protection requires the use of a statistical analysis approach with a solid theoretical motivation. Relating extreme storm surges to their frequency of occurrence using probability distributions has been a common issue since the 1950s. The engineer needs to determine the storm surge of a given return period, i.e., the storm surge quantile or design storm surge. Traditional methods for determining such a quantile have generally been based on data from the systematic record alone. However, the statistical extrapolation used to estimate storm surges corresponding to high return periods is seriously contaminated by sampling and model uncertainty if data are available for only a relatively limited period. This has motivated the development of approaches to enlarge the sample of extreme values beyond the systematic period. Nonsystematic data that occurred before the systematic period are called historical information. During the last three decades, the value of using historical information as nonsystematic data in frequency analysis has been recognized by several authors. The basic hypothesis in statistical modeling of historical information is that a perception threshold exists and that, during a given historical period preceding the period of tide gauging, all exceedances of this threshold have been recorded. Historical information prior to the systematic records may arise from high-sea water marks left by extreme surges on coastal areas. It can also be retrieved from archives, old books, the earliest newspapers, damage reports, unpublished written records and interviews with local residents. A plotting position formula, to compute empirical probabilities based on systematic and historical data, is used in this paper.
The objective of the present work is to examine the potential gain in estimation accuracy from the use of historical information (applied to the Brest tide gauge on the French Atlantic coast). In addition, the present work contributes to addressing the problem of the presence of outliers in data sets. Historical data are generally imprecise, and their inaccuracy should be properly accounted for in the analysis. However, as several authors believe, even with substantial uncertainty in the data, the use of historical information is a viable means to improve estimates of rare events related to extreme environmental conditions. The preliminary results of this study suggest that the use of historical information increases the representativity of an outlier in the systematic data. It is also shown that the use of historical information, specifically the perception sea water level, can be considered a reliable solution for the optimal planning and design of facilities to withstand the extreme environmental conditions that will occur during their lifetime, with an appropriate optimal risk level. The findings are of practical relevance for applications in storm surge risk analysis and flood management.
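A plotting position scheme mixing systematic and historical data can be sketched as follows. This is a simplified variant in the spirit of threshold-based historical-information formulas (e.g., Hirsch-Stedinger), not the paper's exact formula: it assumes every exceedance of the perception threshold during the combined historical and systematic period was recorded, and the Weibull-type ranking within each class is our choice.

```python
def plotting_positions(systematic, historical, threshold, hist_years):
    """Empirical exceedance probabilities for surge values, combining a
    systematic record (one value per year) with historical exceedances
    of a perception threshold observed over hist_years extra years."""
    n_total = hist_years + len(systematic)
    above = sorted([x for x in systematic + historical if x >= threshold],
                   reverse=True)
    below = sorted([x for x in systematic if x < threshold], reverse=True)
    a = len(above) / n_total  # estimated exceedance rate of the threshold
    # Above-threshold events share the probability mass (0, a);
    # below-threshold systematic events share the remaining mass (a, 1).
    pts = [(x, (i + 1) / (len(above) + 1) * a) for i, x in enumerate(above)]
    pts += [(x, a + (1 - a) * (j + 1) / (len(below) + 1))
            for j, x in enumerate(below)]
    return pts  # (surge value, exceedance probability), largest value first
```

The historical record lengthens the effective sample, so an apparent outlier in the short systematic record receives a more credible (smaller) exceedance probability.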
Interpersonal Problems and Developmental Trajectories of Binge Eating Disorder
Blomquist, Kerstin K.; Ansell, Emily B.; White, Marney A.; Masheb, Robin M.; Grilo, Carlos M.
2012-01-01
Objective To explore associations between specific interpersonal constructs and the developmental progression of behaviors leading to binge eating disorder (BED). Method Eighty-four consecutively evaluated, treatment-seeking obese (BMI ≥ 30) men and women with BED were assessed with structured diagnostic and clinical interviews and completed a battery of established measures to assess current and developmental eating- and weight-related variables as well as interpersonal functioning. Results Using the interpersonal circumplex structural summary method, amplitude, elevation, the affiliation dimension, and the quadratic coefficient for the dominance dimension were associated with eating- and weight-related developmental variables. The amplitude coefficient and more extreme interpersonal problems on the dominance dimension (quadratic)—i.e., problems with being extremely high (domineering) or low in dominance (submissive)—were significantly associated with a younger age at onset of binge eating, BED, and overweight, and accounted for significant variance in age at binge eating, BED, and overweight onset. Greater interpersonal problems with having an overly affiliative interpersonal style were significantly associated with, and accounted for significant variance in, a younger age at diet onset. Discussion Findings provide further support for the importance of interpersonal problems among adults with BED and converge with recent work highlighting the importance of specific types of interpersonal problems for understanding heterogeneity and different developmental trajectories of individuals with BED. PMID:22727087
NASA Astrophysics Data System (ADS)
Garcia-Fernandez, Mariano; Assatourians, Karen; Jimenez, Maria-Jose
2018-01-01
Extreme natural hazard events have the potential to cause significant disruption to critical infrastructure (CI) networks. Among them, earthquakes represent a major threat as sudden-onset events with limited, if any, capability of forecast, and high damage potential. In recent years, the increased exposure of interdependent systems has heightened concern, motivating the need for a framework for the management of these increased hazards. The seismic performance level and resilience of existing non-nuclear CIs can be analyzed by identifying the ground motion input values leading to failure of selected key elements. Main interest focuses on the ground motions exceeding the original design values, which should correspond to low probability occurrence. A seismic hazard methodology has been specifically developed to consider low-probability ground motions affecting elongated CI networks. The approach is based on Monte Carlo simulation, which allows for building long-duration synthetic earthquake catalogs to derive low-probability amplitudes. This approach does not affect the mean hazard values and allows obtaining a representation of maximum amplitudes that follow a general extreme-value distribution. This facilitates the analysis of the occurrence of extremes, i.e., very low probability of exceedance from unlikely combinations, for the development of, e.g., stress tests, among other applications. Following this methodology, extreme ground-motion scenarios have been developed for selected combinations of modeling inputs including seismic activity models (source model and magnitude-recurrence relationship), ground motion prediction equations (GMPE), hazard levels, and fractiles of extreme ground motion. The different results provide an overview of the effects of different hazard modeling inputs on the generated extreme motion hazard scenarios. 
This approach to seismic hazard is at the core of the risk analysis procedure developed and applied to European CI transport networks within the framework of the European-funded INFRARISK project. Such an operational seismic hazard framework can be used to provide timely insight for making informed risk management or regulatory decisions on the required level of detail or on the adoption of measures, the cost of which can be balanced against the benefits of the measures in question.
A Conservative Inverse Normal Test Procedure for Combining P-Values in Integrative Research.
ERIC Educational Resources Information Center
Saner, Hilary
1994-01-01
The use of p-values in combining results of studies often involves studies that are potentially aberrant. This paper proposes a combined test that permits trimming some of the extreme p-values. The trimmed statistic is based on an inverse cumulative normal transformation of the ordered p-values. (SLD)
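The trimmed inverse-normal combination described above can be sketched as follows. Note this is only the basic idea: the paper derives a conservative null variance for the trimmed sum, whereas the plain 1/sqrt(k) scaling used here ignores the effect of trimming on the null distribution and is therefore a naive sketch, not the proposed conservative procedure.

```python
from statistics import NormalDist

def trimmed_stouffer(pvalues, g=1):
    """Combine p-values by the inverse cumulative normal (Stouffer)
    method after trimming the g smallest and g largest ordered p-values,
    so potentially aberrant studies cannot dominate the combined test."""
    nd = NormalDist()
    ps = sorted(pvalues)[g:len(pvalues) - g]  # drop the extremes
    if not ps:
        raise ValueError("trimming removed every p-value")
    z = sum(nd.inv_cdf(1.0 - p) for p in ps) / len(ps) ** 0.5
    return nd.cdf(-z)  # combined one-sided p-value (small = significant)
```

Trimming discards, for example, one implausibly tiny p-value from a flawed study while the remaining consistent evidence still yields a small combined p-value.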
Slice sampling technique in Bayesian extreme of gold price modelling
NASA Astrophysics Data System (ADS)
Rostami, Mohammad; Adam, Mohd Bakri; Ibrahim, Noor Akma; Yahya, Mohamed Hisham
2013-09-01
In this paper, a simulation study of Bayesian extreme values using Markov chain Monte Carlo via the slice sampling algorithm is implemented. We compared the accuracy of slice sampling with other methods for a Gumbel model. The study revealed that the slice sampling algorithm offers more accurate and closer estimates, with lower RMSE, than the other methods. Finally, we successfully employed this procedure to estimate the parameters of Malaysian extreme gold prices from 2000 to 2011.
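The slice sampling step used above can be sketched with a univariate stepping-out sampler. For a self-contained check, the target here is a Gumbel density itself; in the Bayesian setting of the paper, logf would instead be the log-posterior of the Gumbel parameters, and the step width w is a tuning choice.

```python
import math
import random

def slice_sample(logf, x0, n, w=1.0, seed=0):
    """Univariate slice sampler with stepping-out and shrinkage (Neal 2003)."""
    rng = random.Random(seed)
    xs, x = [], x0
    for _ in range(n):
        logy = logf(x) + math.log(rng.random())  # height of the slice
        lo = x - w * rng.random()                # randomly placed bracket
        hi = lo + w
        while logf(lo) > logy:                   # step out to the left
            lo -= w
        while logf(hi) > logy:                   # step out to the right
            hi += w
        while True:                              # sample within, shrinking
            cand = rng.uniform(lo, hi)
            if logf(cand) > logy:
                x = cand
                break
            if cand < x:
                lo = cand
            else:
                hi = cand
        xs.append(x)
    return xs

def log_gumbel(x, mu=2.0, beta=0.5):
    """Gumbel(mu, beta) log-density, used as a stand-in target."""
    z = (x - mu) / beta
    return -z - math.exp(-z) - math.log(beta)
```

Only the log-density is needed (no normalizing constant), which is what makes slice sampling convenient for posterior distributions.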
Extremely low-frequency magnetic fields of transformers and possible biological and health effects.
Sirav, Bahriye; Sezgin, Gaye; Seyhan, Nesrin
2014-12-01
Physiological processes in organisms can be influenced by extremely low-frequency (ELF) electromagnetic energy. Biological effect studies are of great importance, as are measurement studies, since they provide information on real exposure situations. In this study, the leakage magnetic fields around a transformer were measured in an apartment building in Küçükçekmece, Istanbul, and the measurement results were evaluated with respect to the international exposure standards. The transformer station was on the bottom floor of a three-floor building. It was found that people living and working in the building were exposed to ELF magnetic fields higher than the threshold magnetic field value of the International Agency for Research on Cancer (IARC). Many people living in this building reported health complaints, such as immunological problems in their children. There were child workers working in the textile factories located in the building. Safe distances or areas for these people should be recommended. Protective measures could be implemented to minimize these exposures. Further residential exposure studies are needed to demonstrate the exposure levels of ELF magnetic fields. Precautions should, therefore, be taken either to reduce leakage or to minimize the exposed fields. Shielding techniques should be used to minimize the leakage magnetic fields in such cases.
Multiresolution modeling with a JMASS-JWARS HLA Federation
NASA Astrophysics Data System (ADS)
Prince, John D.; Painter, Ron D.; Pendell, Brian; Richert, Walt; Wolcott, Christopher
2002-07-01
CACI, Inc.-Federal has built, tested, and demonstrated the use of a JMASS-JWARS HLA Federation that supports multi-resolution modeling of a weapon system and its subsystems in a JMASS engineering and engagement model environment, while providing a realistic JWARS theater campaign-level synthetic battle space and operational context to assess the weapon system's value added and deployment/employment supportability in a multi-day, combined force-on-force scenario. Traditionally, acquisition analyses require a hierarchical suite of simulation models to address engineering, engagement, mission, and theater/campaign measures of performance, measures of effectiveness, and measures of merit. Configuring and running this suite of simulations and transferring the appropriate data between models is both time consuming and error prone. The ideal solution would be a single simulation with the requisite resolution and fidelity to perform all four levels of acquisition analysis; however, current computer hardware cannot deliver the runtime performance necessary to support the resulting extremely large simulation. One viable alternative is to integrate the current hierarchical suite of simulation models using the DoD's High Level Architecture in order to support multi-resolution modeling. An HLA integration eliminates the extremely large model problem, provides a well-defined and manageable mixed-resolution simulation, and minimizes VV&A issues.
XSUMMER - Transcendental functions and symbolic summation in FORM
NASA Astrophysics Data System (ADS)
Moch, S.; Uwer, P.
2006-05-01
Harmonic sums and their generalizations are extremely useful in the evaluation of higher-order perturbative corrections in quantum field theory. Of particular interest have been the so-called nested sums, where the harmonic sums and their generalizations appear as building blocks, originating, for example, from the expansion of generalized hypergeometric functions around integer values of the parameters. In this paper we discuss the implementation of several algorithms to solve these sums by algebraic means, using the computer algebra system FORM.
Program summary
Title of program: XSUMMER
Catalogue identifier: ADXQ_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXQ_v1_0
Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland
License: GNU Public License and FORM License
Computers: all
Operating system: all
Program language: FORM
Memory required to execute: depending on the complexity of the problem; at least 64 MB RAM recommended
No. of lines in distributed program, including test data, etc.: 9854
No. of bytes in distributed program, including test data, etc.: 126 551
Distribution format: tar.gz
Other programs called: none
External files needed: none
Nature of the physical problem: systematic expansion of higher transcendental functions in a small parameter; the expansions arise in the calculation of loop integrals in perturbative quantum field theory
Method of solution: algebraic manipulations of nested sums
Restrictions on complexity of the problem: usually limited only by the available disk space
Typical running time: dependent on the complexity of the problem
Díaz, Francisca P; Frugone, Matías; Gutiérrez, Rodrigo A; Latorre, Claudio
2016-03-09
Climate controls on the nitrogen cycle are suggested by the negative correlation between precipitation and δ15N values across different ecosystems. For arid ecosystems this is unclear, as water limitation among other factors can confound this relationship. We measured herbivore feces, foliar and soil δ15N and δ13C values and chemically characterized soils (pH and elemental composition) along an elevational/climatic gradient in the Atacama Desert, northern Chile. Although very positive δ15N values span the entire gradient, soil δ15N values show a positive correlation with aridity as expected. In contrast, foliar δ15N values and herbivore feces show a hump-shaped relationship with elevation, suggesting that plants are using a different N source, possibly of biotic origin. Thus at the extreme limits of plant life, biotic interactions may be just as important as abiotic processes, such as climate, in explaining ecosystem δ15N values.
Measurement of HRQL using EQ-5D in patients with type 2 diabetes mellitus in Japan.
Sakamaki, Hiroyuki; Ikeda, Shunya; Ikegami, Naoki; Uchigata, Yasuko; Iwamoto, Yasuhiko; Origasa, Hideki; Otani, Toshiki; Otani, Yoichi
2006-01-01
We measured the health-related quality of life (HRQL) of diabetes mellitus patients using the Japanese version of the EQ-5D and examined the relationship between clinical condition and health status. A study was conducted on 220 patients with type 2 diabetes mellitus at a hospital in Saitama Prefecture on the day of their visit, from November 17 to December 24, 1998. Patients evaluated their health status using five dimensions (5D) and a visual analog scale (VAS). The EQ-5D score was calculated from the 5D responses using the Japanese version of the value set. There were no responses of "extreme problem." The frequency of "some problem" was significantly higher in patients with complications than in those without for mobility (27.4% vs. 14.4%) and anxiety/depression (25.7% vs. 13.5%). The mean EQ-5D score was 0.846 (95% confidence interval [CI] 0.817-0.874) in patients with complications versus 0.884 (95% CI 0.855-0.914) in those without. There was no statistically significant difference in VAS scores according to the presence or absence of diabetic complications, but a significant difference was seen according to the presence or absence of retinopathy. These findings suggest the value of measuring health status in diabetes mellitus patients, because it comprehensively evaluates the patient's health condition and adds another dimension to the subjective symptoms and laboratory data.
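The EQ-5D index calculation the abstract refers to can be sketched as follows. The decrement table below is a hypothetical stand-in with the same structure as a three-level value set (a constant plus per-dimension, per-level decrements subtracted from 1), not the published Japanese tariff:

```python
# Hypothetical decrements per dimension and level (1 = no problem,
# 2 = some problem, 3 = extreme problem). NOT the Japanese value set.
DECREMENT = {
    "mobility":       [0.0, 0.07, 0.31],
    "self_care":      [0.0, 0.05, 0.21],
    "usual_activity": [0.0, 0.04, 0.19],
    "pain":           [0.0, 0.07, 0.28],
    "anxiety":        [0.0, 0.06, 0.24],
}
CONSTANT = 0.081  # subtracted whenever any dimension is above level 1 (hypothetical)

def eq5d_index(levels):
    """levels: dict mapping dimension name -> response level 1..3."""
    if all(lvl == 1 for lvl in levels.values()):
        return 1.0  # full health
    total = CONSTANT + sum(DECREMENT[d][lvl - 1] for d, lvl in levels.items())
    return round(1.0 - total, 3)
```

For instance, "some problem" in mobility alone would score 1 − (0.081 + 0.07) = 0.849 under these made-up coefficients.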
Assessing the Value of Information for Identifying Optimal Floodplain Management Portfolios
NASA Astrophysics Data System (ADS)
Read, L.; Bates, M.; Hui, R.; Lund, J. R.
2014-12-01
Floodplain management is a complex portfolio problem that can be analyzed from an integrated perspective incorporating traditionally structural and nonstructural options. One method to identify effective strategies for preparing for, responding to, and recovering from floods is to optimize for a portfolio of temporary (emergency) and permanent floodplain management options. A risk-based optimization approach to this problem assigns probabilities to specific flood events and calculates the associated expected damages. This approach is currently limited by (1) the assumption of perfect flood forecast information, since temporary management activities implemented for the actual flood event may differ from those optimized on forecasted information, and (2) the inability to assess system resilience across a range of possible future events (a risk-centric approach). Resilience is defined here as the ability of a system to absorb and recover from a severe disturbance or extreme event. In our analysis, resilience is a system property that requires integration of physical, social, and information domains. This work employs a 3-stage linear program to identify the optimal mix of floodplain management options using conditional probabilities to represent perfect and imperfect flood stages (forecast vs. actual events). We assess the value of information in terms of minimizing damage costs for two theoretical cases, urban and rural systems. We use portfolio analysis to explore how the set of optimal management options differs depending on whether the goal is for the system to be risk-averse to a specified event or resilient over a range of events.
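The "value of information" idea can be illustrated with a deliberately tiny two-event, two-action example (all probabilities and damage costs are hypothetical): compare the expected cost when the action can be chosen after the event is known against the best single action chosen in advance.

```python
import numpy as np

p = np.array([0.9, 0.1])            # P(minor flood), P(major flood)
damage = np.array([[1.0, 10.0],     # rows: action (do nothing / temporary levee)
                   [3.0,  4.0]])    # columns: event

best_per_event = damage.min(axis=0)        # act knowing the event (perfect info)
perfect_cost = p @ best_per_event          # expected cost with perfect info
forecast_cost = (p @ damage.T).min()       # best single action for all events
value_of_information = forecast_cost - perfect_cost
```

Here the committed action costs 1.9 in expectation, perfect information reduces this to 1.3, so flood forecast information is worth 0.6 expected damage units in this toy setting.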
Some new methods and results in examination of distribution of rare strongest events
NASA Astrophysics Data System (ADS)
Pisarenko, Vladilen; Rodkin, Mikhail
2016-04-01
In the study of disaster statistics, the examination of the distribution tail, the range of rare strongest events, appears to be the most difficult and most important problem. We discuss this problem here using two different approaches. In the first, we use the limit distributions of extreme value theory to parameterize the behavior of the distribution tail. Our method consists of estimating the maximum size Mmax(T) (e.g. magnitude, earthquake energy, PGA value, victims or economic losses from a catastrophe, etc.) that will occur in a prescribed future time interval T. In this particular case we combine historical earthquake catalogs with instrumental ones, since historical catalogs cover much longer time periods and thus can essentially improve seismic statistics in the higher magnitude domain. We apply this technique to two historical Japanese catalogs (the Usami earthquake catalog, 599-1884, and the Utsu catalog, 1885-1925) and to the instrumental JMA catalog (1926-2014). We compared the parameters of the historical catalogs with those derived from the instrumental JMA catalog and found that the Usami catalog is incompatible with the instrumental one, whereas the Utsu catalog is statistically compatible with the JMA catalog in the higher magnitude domain. In all examined cases, the effect of the "bending down" of the graph of strong earthquake recurrence was found to be typical of the seismic regime. The second method uses the multiplicative cascade model (in some aspects an analogue of the ETAS model). It is known that the ordinary Gutenberg-Richter law of earthquake recurrence can be imitated within the scheme of a multiplicative cascade in which the seismic regime is treated as a sequence of a large number of episodes of avalanche-like relaxation, randomly occurring on a set of metastable subsystems.
This model simulates such well-known regularities of the seismic regime as the decrease in b-value associated with the occurrence of strong earthquakes. If the memory of the system is taken into account, the cascade model reproduces the Omori law of aftershock number decay, the existence of foreshock activity, and the seismic cycle. We use the cascade model here to imitate the effect of "bending down" of the graph of strong earthquake recurrence and the possibility of occurrence of characteristic earthquakes. The results are compared with observed seismicity, and physical conditions for the occurrence of characteristic earthquakes are suggested. Examples of mutual interpretation of results obtained using the theory of extreme values and using the cascade model are presented.
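The first approach, fitting a limit distribution of extreme value theory to annual maxima and reading off a future maximum, can be sketched as follows. The magnitudes are synthetic (not the Usami, Utsu, or JMA catalogs); note that scipy's shape convention is c = −ξ, so a positive c gives the bounded upper tail that corresponds to the "bending down" of the recurrence graph:

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(1)
# Synthetic annual maximum magnitudes, hypothetical parameters for illustration
ann_max = genextreme.rvs(0.25, loc=6.0, scale=0.4, size=100, random_state=rng)

# Maximum-likelihood fit of the GEV distribution to the annual maxima
c, loc, scale = genextreme.fit(ann_max)

# Estimate of the maximum size expected once in T years: the (1 - 1/T) quantile
T = 50
m_T = genextreme.ppf(1.0 - 1.0 / T, c, loc, scale)
```

With a positive fitted shape c, the quantiles flatten toward a finite upper endpoint (loc + scale/c), mirroring the observed saturation of strong-earthquake recurrence.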
How Historical Information Can Improve Extreme Value Analysis of Coastal Water Levels
NASA Astrophysics Data System (ADS)
Le Cozannet, G.; Bulteau, T.; Idier, D.; Lambert, J.; Garcin, M.
2016-12-01
The knowledge of extreme coastal water levels is useful for coastal flooding studies or the design of coastal defences. When deriving such extremes with standard analyses using tide gauge measurements, one often needs to deal with a limited effective duration of observation, which can result in large statistical uncertainties. This is even truer when one faces outliers, those particularly extreme values distant from the others. In a recent work (Bulteau et al., 2015), we investigated how historical information on past events reported in archives can reduce statistical uncertainties and put such outlying observations into perspective. We adapted a Bayesian Markov chain Monte Carlo method, initially developed in the hydrology field (Reis and Stedinger, 2005), to the specific case of coastal water levels. We applied this method to the site of La Rochelle (France), where the storm Xynthia in 2010 generated a water level considered so far as an outlier. Based on 30 years of tide gauge measurements and 8 historical events since 1890, the results showed a significant decrease in statistical uncertainties on return levels when historical information is used. Also, Xynthia's water level no longer appeared as an outlier, and we could have reasonably predicted the annual exceedance probability of that level beforehand (the predictive probability for 2010 based on data until the end of 2009 was of the same order of magnitude as the standard estimated probability using data until the end of 2010). Such results illustrate the usefulness of historical information in extreme value analyses of coastal water levels, as well as the relevance of the proposed method for integrating heterogeneous data in such analyses.
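A minimal sketch of the kind of likelihood such an analysis combines, under stated assumptions: a GEV term for the systematic gauge record plus a binomial term for historical years in which only "the threshold was exceeded or not" is known (in the spirit of Reis and Stedinger, 2005). A flat prior is assumed and all numbers are hypothetical, not the La Rochelle data:

```python
import numpy as np
from scipy.stats import genextreme

def log_posterior(params, gauge, n_hist_years, n_hist_exceed, threshold):
    """GEV log-likelihood of gauge annual maxima plus a binomial term for
    censored historical information; flat prior (so posterior ∝ likelihood)."""
    c, loc, scale = params
    if scale <= 0:
        return -np.inf
    ll = genextreme.logpdf(gauge, c, loc, scale).sum()
    p_exc = genextreme.sf(threshold, c, loc, scale)  # P(annual max > threshold)
    if not 0.0 < p_exc < 1.0:
        return -np.inf
    ll += (n_hist_exceed * np.log(p_exc)
           + (n_hist_years - n_hist_exceed) * np.log1p(-p_exc))
    return ll

# 30 years of synthetic gauge maxima, plus "2 exceedances of 3.0 m in 120 years"
gauge = genextreme.rvs(0.1, loc=2.0, scale=0.3, size=30,
                       random_state=np.random.default_rng(4))
lp_good = log_posterior((0.1, 2.0, 0.3), gauge, 120, 2, 3.0)
lp_bad = log_posterior((0.1, 4.0, 0.3), gauge, 120, 2, 3.0)
```

An MCMC sampler would explore `log_posterior` over (c, loc, scale); the historical binomial term is what tightens the return-level posterior relative to the gauge record alone.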
Lower Extremity Stiffness Changes after Concussion in Collegiate Football Players.
Dubose, Dominique F; Herman, Daniel C; Jones, Deborah L; Tillman, Susan M; Clugston, James R; Pass, Anthony; Hernandez, Jorge A; Vasilopoulos, Terrie; Horodyski, Marybeth; Chmielewski, Terese L
2017-01-01
Recent research indicates that a concussion increases the risk of musculoskeletal injury. Neuromuscular changes after concussion might contribute to this increased risk. Many studies have examined gait postconcussion, but few have examined more demanding tasks. This study compared changes in stiffness across the lower extremity, a measure of neuromuscular function, during a jump-landing task between athletes with a concussion (CONC) and uninjured athletes (UNINJ). Division I football players (13 CONC and 26 UNINJ) were tested pre- and postseason. A motion capture system recorded subjects jumping on one limb from a 25.4-cm step onto a force plate. Hip, knee, and ankle joint stiffness were calculated from initial contact to peak joint flexion using the slopes of regression lines fitted to the joint moment versus joint angle plots. Leg stiffness was peak vertical ground reaction force (PVGRF) divided by lower extremity vertical displacement, from initial contact to PVGRF. All stiffness values were normalized to body weight, and values from both limbs were averaged. General linear models compared group (CONC, UNINJ) differences in the changes between pre- and postseason stiffness values. The average time from concussion to postseason testing was 49.9 d. The CONC group showed an increase in hip stiffness (P = 0.03), a decrease in knee (P = 0.03) and leg stiffness (P = 0.03), but no change in ankle stiffness (P = 0.65) from pre- to postseason. Lower extremity stiffness is altered after concussion, which could contribute to an increased risk of lower extremity injury. These data provide further evidence of altered neuromuscular function after concussion.
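The stiffness computations described above reduce to a regression slope and a ratio; a sketch with hypothetical numbers, not the study's measurements:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical knee data from initial contact to peak flexion
angle = np.linspace(5.0, 45.0, 60)                       # flexion angle (deg)
moment = 0.08 * angle + 0.2 + rng.normal(0.0, 0.05, 60)  # joint moment (N*m/kg)

# Joint stiffness: slope of the regression line on the moment-vs-angle plot
joint_stiffness = np.polyfit(angle, moment, 1)[0]

# Leg stiffness: PVGRF / lower-extremity vertical displacement
pvgrf, displacement = 25.0, 0.12          # N/kg and m, hypothetical values
leg_stiffness = pvgrf / displacement
```

Normalizing the moment and PVGRF to body weight, as the study does, makes the stiffness values comparable across players.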
[Problems of ethics, deontology and esthetics in pathology practice].
Shmurun, R I
1997-01-01
Relationships between pathologists on the one hand and clinicians, patients asking for histological slides, and relatives of the deceased on the other are extremely complicated and need special attention. The pathology service should not be considered second-rate. Ethical problems in the pathology service are not yet properly dealt with in the literature.
How Can "Problem Subjects" Be Made Less of a Problem?
ERIC Educational Resources Information Center
Warwick, Philip; Ottewill, Roger
2004-01-01
Many higher education courses, across the whole range of disciplines, include subjects that are somewhat problematic because they appear to be unrelated to other components of the curriculum. Students frequently perceive them as irrelevant and boring. This makes it extremely difficult for lecturers to stimulate interest and results in an…
The social class gradient in health in Spain and the health status of the Spanish Roma.
La Parra Casado, Daniel; Gil González, Diana; de la Torre Esteve, María
2016-10-01
To determine the social class gradient in health in the general Spanish population and the health status of the Spanish Roma. The National Health Survey of Spanish Roma 2006 (sample size: 993 people; average age: 33.6 years; 53.1% women) and the National Health Surveys for Spain 2003 (sample size: 21,650 people; average age: 45.5 years; 51.2% women) and 2006 (sample size: 29,478 people; average age: 46 years; 50.7% women) are compared. Several indicators were chosen: self-perceived health, activity limitation, chronic diseases, hearing and sight problems, caries, and obesity. Analysis was based on age-standardised rates and logistic regression models. According to most indicators, Roma health is worse than that of social classes IV-V (manual workers). Some indicators show remarkable differences between Roma and social classes IV-V: experiencing three or more health problems, sight problems, and caries in both sexes, and hearing problems and obesity in women. Roma people occupy an extreme position on the social gradient in health, a situation of extreme health inequality.
NASA Technical Reports Server (NTRS)
Whitaker, Mike
1991-01-01
Severe precipitation static problems affecting the communication equipment onboard the P-3B aircraft were recently studied. The study was conducted after precipitation static created potential safety-of-flight problems on Naval Reserve aircraft. A specially designed flight test program was conducted in order to measure, record, analyze, and characterize potential precipitation static problem areas. The test program successfully characterized the precipitation static interference problems while the P-3B was flown in moderate to extreme precipitation conditions. Data up to 400 MHz were collected on the effects of engine charging, precipitation static, and extreme cross fields, using a computer-controlled acquisition system consisting of a signal generator, RF spectrum and audio analyzers, data recorders, and instrumented static dischargers. The test program is outlined, and the computer-controlled data acquisition system used during flight and ground testing is described in detail. The correlation between results recorded during the flight test program and those measured during ground testing is also discussed.
INFORMS Section on Location Analysis Dissertation Award Submission
DOE Office of Scientific and Technical Information (OSTI.GOV)
Waddell, Lucas
This research effort can be summarized by two main thrusts, each of which has a chapter of the dissertation dedicated to it. First, I pose a novel polyhedral approach for identifying polynomially solvable instances of the QAP based on an application of the reformulation-linearization technique (RLT), a general procedure for constructing mixed 0-1 linear reformulations of 0-1 programs. The feasible region of the continuous relaxation of the level-1 RLT form is a polytope having a highly specialized structure. Every binary solution to the QAP is associated with an extreme point of this polytope, and the objective function value is preserved at each such point. However, there exist extreme points that do not correspond to binary solutions. The key insight is a previously unnoticed and unexpected relationship between the polyhedral structure of the continuous relaxation of the level-1 RLT representation and various classes of readily solvable instances. Specifically, I show that a variety of apparently unrelated solvable cases of the QAP can all be categorized in the following sense: each such case has an objective function which ensures that an optimal solution to the continuous relaxation of the level-1 RLT form occurs at a binary extreme point. Interestingly, there exist instances that are solvable by the level-1 RLT form which do not satisfy the conditions of these cases, so that the level-1 form theoretically identifies a richer family of solvable instances. Second, I focus on instances of the QAP known in the literature as linearizable. An instance of the QAP is defined to be linearizable if and only if the problem can be equivalently written as a linear assignment problem that preserves the objective function value at all feasible solutions.
I provide an entirely new polyhedral-based perspective on the concept of linearizability by showing that an instance of the QAP is linearizable if and only if a relaxed version of the continuous relaxation of the level-1 RLT form is bounded. I also show that the level-1 RLT form can identify a richer family of solvable instances than those deemed linearizable, by demonstrating that the continuous relaxation of the level-1 RLT form can have an optimal binary solution for instances that are not linearizable. As a byproduct, I use this theoretical framework to explicitly characterize, in closed form, the dimensions of the level-1 RLT form and various other problem relaxations.
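The target form for a linearizable instance, a linear assignment problem, can be solved directly in polynomial time; a small sketch with a hypothetical cost matrix:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Hypothetical linear-assignment cost matrix C[i, j] = cost of assigning
# facility i to location j. A linearizable QAP's objective can be rewritten
# in exactly this form while preserving the value at every permutation.
C = np.array([[4, 1, 3],
              [2, 0, 5],
              [3, 2, 2]])

rows, cols = linear_sum_assignment(C)  # Hungarian-type optimal assignment
total_cost = int(C[rows, cols].sum())
```

For this matrix the optimal assignment is 0→1, 1→0, 2→2 with cost 1 + 2 + 2 = 5, found without enumerating the 3! permutations.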
Temperature control system for optical elements in astronomical instrumentation
NASA Astrophysics Data System (ADS)
Verducci, Orlando; de Oliveira, Antonio C.; Ribeiro, Flávio F.; Vital de Arruda, Márcio; Gneiding, Clemens D.; Fraga, Luciano
2014-07-01
Extremely low temperatures may damage the optical components assembled inside an astronomical instrument, owing to cracking of the resin or glue used to attach lenses and mirrors. The very cold, dry environment at most astronomical observatories contributes to this problem. This paper describes the solution implemented at SOAR for remotely monitoring and controlling temperatures inside a spectrograph in order to prevent possible damage to the optical parts. The system automatically switches heat-dissipation elements located near the optics on and off as the measured temperature reaches a trigger value. This value is set to a temperature at which the instrument is not operational, so the system serves only to protect the optics rather than to regulate normal operation. The software was developed with LabVIEW and is based on an object-oriented design that offers flexibility and ease of maintenance. As a result, the system is able to keep the internal temperature of the instrument above a chosen limit, except perhaps during the response time, owing to thermal inertia. This inertia can be controlled and even avoided by choosing the correct amount of heat dissipation and the location of the thermal elements. A log file records the temperature values measured by the system for operational analysis.
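The on/off switching around a trigger value can be sketched as a simple bang-bang controller. The hysteresis band below is an assumption (not stated in the abstract), added because it is the usual way to avoid rapid toggling when the temperature hovers near the trigger:

```python
def heater_state(temp_c, heater_on, trigger=2.0, hysteresis=0.5):
    """Bang-bang control with hysteresis: switch the heat-dissipation
    element on below the trigger temperature, off once the temperature
    has recovered past trigger + hysteresis. All values are hypothetical."""
    if temp_c < trigger:
        return True                  # too cold: protect the optics
    if temp_c > trigger + hysteresis:
        return False                 # safely warm: stop heating
    return heater_on                 # inside the dead band: keep current state
```

Polling this function against the sensor reading, and logging each reading, reproduces the monitor-switch-log loop the abstract describes.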
A hierarchical wavefront reconstruction algorithm for gradient sensors
NASA Astrophysics Data System (ADS)
Bharmal, Nazim; Bitenc, Urban; Basden, Alastair; Myers, Richard
2013-12-01
ELT-scale extreme adaptive optics systems will require new approaches to compute the wavefront sufficiently quickly once the computational burden of applying a matrix-vector multiplication (MVM) is no longer practical. An approach is demonstrated here that is hierarchical in transforming wavefront slopes from a WFS into a wavefront, and then into actuator values. First, simple integration in 1D is used to create 1D wavefront estimates with unknown starting points at the edges of independent spatial domains. Second, these starting points are estimated globally. Because the starting points are a subset of the overall grid on which wavefront values are to be estimated, sparse representations are produced, and the numerical complexity can be chosen via the spacing of the starting-point grid relative to the overall grid. Using a combination of algebraic expressions, sparse representation, and a conjugate gradient solver, the number of non-parallelized operations for reconstruction on a 100x100-subaperture problem is ~600,000, or O(N^3/2), which is approximately the same as for each thread of an MVM solution parallelized over 100 threads. To reduce the effects of noise propagation within each domain, a noise-reduction algorithm can be applied which ensures the continuity of the wavefront; this additional step costs ~1,200,000 operations. We conclude by briefly discussing how the final step of converting from wavefront to actuator values can be achieved.
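The first two stages, per-domain 1D integration of slopes followed by a global fix of the unknown starting point, can be sketched in one dimension. This is a toy version with a single domain; in the real reconstructor the starting points of many domains are estimated jointly with a conjugate gradient solver:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100
true_wf = np.cumsum(rng.normal(0.0, 0.1, n))  # hypothetical 1-D wavefront
slopes = np.diff(true_wf)                      # what a gradient sensor measures

# Stage 1: integrate slopes within the domain; the starting point is unknown,
# so it is provisionally set to zero.
est = np.concatenate(([0.0], np.cumsum(slopes)))

# Stage 2: correct the unknown offset using the globally estimated starting
# point (here taken from the true wavefront for illustration).
est += true_wf[0] - est[0]
```

Cumulative summation is O(n) per domain, which is where the savings over a dense MVM come from; the global stage only has to solve for the sparse set of starting points.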
Spatial analysis of extreme precipitation deficit as an index for atmospheric drought in Belgium
NASA Astrophysics Data System (ADS)
Zamani, Sepideh; Van De Vyver, Hans; Gobin, Anne
2014-05-01
A growing concern among climate scientists is that the frequency of weather extremes will increase as a result of climate change. European society, for example, is particularly vulnerable to changes in the frequency and intensity of extreme events such as heat waves, heavy precipitation, droughts, and wind storms, as seen in recent years [1,2]. More than 50% of the land in Belgium is occupied by managed ecosystems (agriculture, forestry). Moreover, among the many extreme weather conditions, drought has a substantial impact on the agriculture and ecosystems of the affected region, because its most immediate consequence is a fall in crop production. Besides technological advances, reliable estimation of weather conditions plays a crucial role in improving agricultural productivity. These reasons provide a strong motivation for research on drought and its economic and agricultural impacts in Belgium. The main purpose of the present work is to map atmospheric drought return levels (RL), as a first insight into agricultural drought, employing spatial modelling approaches. The likelihood of future drought is studied on the basis of precipitation-deficit indices for four vegetation types: water (W), grass (G), deciduous (D), and coniferous (C) forests. Extreme Value Theory (EVT) [3,4,5], a branch of probability and statistics, is dedicated to characterizing the behaviour of extreme observations. The tail behaviour of EVT distributions provides important information about return levels. EVT distributions are applicable in many areas, such as hydrology, environmental research, meteorology, insurance, and finance. Spatial Generalized Extreme Value (GEV) distributions, a branch of EVT, are applied to annual maxima of drought at 13 hydro-meteorological stations across Belgium.
The superiority of the spatial GEV model is that a region can be modelled by merging the individual time series of observations from isolated sites and using a common regression model based on climatological/geographical covariates. The fitted spatial GEV distribution is heavy-tailed, with γ ≈ 0.3 over Belgium. A comparison between the RL maps from the GEV model and those obtained from Universal Kriging (UK) confirms the reliability of the spatial GEV model in explaining atmospheric drought in Belgium. References: [1] Beniston, M., Stephenson, D. B., Christensen, O. B., Ferro, C. A. T., Frei, C., Goyette, S., Halsnaes, K., Holt, T., Jylhü, K., Koffi, B., Palutikoff, J., Schöll, R., Semmler, T., and Woth, K. (2007), Future extreme events in European climate: an exploration of Regional Climate Model projections. Climatic Change, 81, 71-95. [2] Solomon, S., D. Qin, M. Manning, Z. Chen, M. Marquis, K.B. Averyt, M. Tignor and H.L. Miller (Eds.) (2007), Contribution of Working Group I to the Fourth Assessment Report of the Intergovernmental Panel on Climate Change. Cambridge University Press, Cambridge, United Kingdom and New York, NY, USA, 996 pp. [3] Coles, S. (2001), An Introduction to Statistical Modeling of Extreme Values, Springer-Verlag, Heidelberg, Germany. [4] Embrechts, P., C. Klüppelberg, and T. Mikosch (1997), Modelling Extremal Events for Insurance and Finance, Springer-Verlag, Berlin. [5] Smith, R. (2004), Statistics of extremes, with application in environment, insurance and finance, in: Extreme Values in Finance, Telecommunications and the Environment, edited by Finkenstadt, B. and Rootzen, H., 373-388, Chapman and Hall/CRC Press, London.
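For reference, the return levels being mapped follow the standard GEV quantile formula (Coles [3]), written here with shape parameter ξ (denoted γ above), location μ, and scale σ:

```latex
z_T = \mu + \frac{\sigma}{\xi}\left[\bigl(-\ln(1 - 1/T)\bigr)^{-\xi} - 1\right],
\qquad \xi \neq 0,
```

with the Gumbel limit z_T = μ − σ ln(−ln(1 − 1/T)) as ξ → 0; z_T is the level expected to be exceeded on average once every T years.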
Environmental hazards, hot, cold, altitude, and sun.
Dhillon, Sundeep
2012-09-01
There has been an increase in both recreational and adventure travel to extreme environments. Humans can successfully acclimatize to and perform reasonably well in extreme environments, provided that sufficient time is given for acclimatization (where possible) and that they use appropriate behavior. This is aided by a knowledge of the problems likely to be encountered and their prevention, recognition, and treatment. Crown Copyright © 2012. Published by Elsevier Inc. All rights reserved.
Countering Extremism: An Understanding of the Problem, the Process and Some Solutions
2015-06-01
Radicalization into Terrorism.” An extremist movement “could not be developed, evolved or sustained over time and place” without the support of...efforts to research, develop, and implement a variety of methods and models to prevent and counter violent extremism (CVE). Despite these efforts...but how to better understand the radicalization process in order to develop effective strategies to prevent radicalization from occurring. Faced
From lepton protoplasm to the genesis of hadrons
NASA Astrophysics Data System (ADS)
Eliseev, S. M.; Kosmachev, O. S.
2016-01-01
The theory of matter under extreme conditions opens a new stage in particle physics. It is necessary here to combine Dirac's elementary particle physics with Prigogine's dynamics of nonequilibrium systems. In this article we discuss the problem of the hierarchy of complexity: what can be considered the lowest level of organization of extreme matter, on the basis of which the self-organization of complex forms occurs?
Professional musicians with craniomandibular dysfunctions treated with oral splints.
Steinmetz, Anke; Ridder, Paul H; Methfessel, Götz; Muche, Burkhard
2009-10-01
Craniomandibular dysfunction (CMD) symptoms occur frequently in violin/viola and wind players and can be associated with pain in the neck, shoulders and arm. In the current study, the effect of oral splint treatment of CMD on reducing pain and symptoms especially in these areas was investigated. Thirty (30) musicians undergoing CMD treatment with oral splints participated in this study. They completed a questionnaire that addressed CMD symptoms, localization of pain, and subjective changes in symptoms. Pain in the shoulder and/or upper extremity was the most frequent symptom reported by 83% of subjects, followed by neck pain (80%) and pain in the teeth/TMJ regions (63%). Treatment with oral splints contributed to a significant decrease in neck pain in 91%, teeth/TMJ pain in 83%, and shoulder and upper extremity pain in 76% of the musicians. Eighty percent (80%) of the patients reported improvement of their predominant symptoms. CMD can be a potential cause for pain in the neck, shoulders, and upper extremities of musicians. It is paramount that musicians with musculoskeletal problems be examined for CMD symptoms. Treatment with oral splints seems to be valuable. Further prospective, randomized controlled studies are necessary to confirm efficacy of oral splint treatment in CMD-associated pain and problems in the neck, shoulder, and the upper extremities in musicians.
Badtieva, V A; Kniazeva, T A; Apkhanova, T V
2010-01-01
The present review of the literature highlights modern approaches to, and major trends in, the diagnosis and conservative treatment of lymphedema of the lower extremities, based on generalized world experience. Patients with lymphedema of the lower extremities comprise a "difficult to manage" group because the disease is characterized by steady progression and marked refractoriness to various conservative therapeutic modalities, creating problems for both the patient and the attending physician. Modern methods for the diagnosis of lymphedema are discussed with special reference to noninvasive and minimally invasive techniques (such as lymphoscintigraphy, computed tomography, MRI, laser Doppler flowmetry, etc.). During the last 20 years, combined conservative therapy has been considered the method of choice for the management of different stages and forms of lymphedema of the lower extremities in foreign clinics. The basis of conservative therapy is constituted by manual lymph drainage (MLD), compression bandaging using short-stretch materials, physical exercises, and skin care (using the method of M. Foldi). Also reviewed are the main physiobalneotherapeutic methods traditionally widely applied for the treatment of lymphedema of the lower extremities in this country. Original methods developed by the authors for the same purpose are described, including modifications of cryotherapy, pulsed matrix laser therapy, and hydro- and balneotherapy. Mechanisms of their therapeutic action on the main pathogenetic factors responsible for the development of lymphedema (with special reference to lymph transport and formation) are discussed. The principles of combined application of physiotherapeutic methods for the rehabilitative treatment of patients presenting with lymphedema of the lower extremities are briefly substantiated. Special emphasis is laid on their influence on the major components of the pathological process.
Semi-supervised tracking of extreme weather events in global spatio-temporal climate datasets
NASA Astrophysics Data System (ADS)
Kim, S. K.; Prabhat, M.; Williams, D. N.
2017-12-01
Deep neural networks have been successfully applied to detect extreme weather events in large-scale climate datasets, attaining performance superior to all previous hand-crafted methods. Recent work has shown that a multichannel spatiotemporal encoder-decoder CNN architecture is able to localize events with semi-supervised bounding boxes. Motivated by this work, we propose a new learning framework based on Variational Auto-Encoders (VAE) and Long Short-Term Memory (LSTM) networks to track extreme weather events in spatio-temporal datasets. We treat spatio-temporal object tracking as learning the probability distribution of continuous latent features of an auto-encoder using stochastic variational inference. For this, we assume that our datasets are i.i.d. and that the latent features can be modeled by a Gaussian distribution. In the proposed framework, we first train a VAE to generate an approximate posterior given multichannel climate input containing an extreme climate event at a fixed time. Then, we predict the bounding box, location, and class of extreme climate events using convolutional layers whose input concatenates three features: the embedding, the sampled mean, and the standard deviation. Lastly, we train an LSTM on the concatenated input to learn the temporal structure of the dataset by recurrently feeding the output back into the next time step's VAE input. Our contribution is two-fold. First, we present the first semi-supervised end-to-end architecture based on a VAE for tracking extreme weather events, which can be applied to massive unlabeled climate datasets. Second, the temporal movement of events is taken into account in bounding-box prediction using the LSTM, which can improve localization accuracy. To our knowledge, this technique has been explored in neither the climate nor the machine learning community.
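The VAE step above hinges on sampling latent features while keeping the sample differentiable with respect to the encoder outputs. A minimal NumPy sketch of that reparameterization trick and of the concatenated feature vector fed downstream; all shapes and values are illustrative, not the paper's configuration:

```python
import numpy as np

rng = np.random.default_rng(0)

def reparameterize(mu, log_var, rng):
    """Draw z ~ N(mu, diag(exp(log_var))) via the reparameterization
    trick: the randomness enters only through eps, so the sample is
    differentiable w.r.t. mu and log_var."""
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

# Toy "encoder" output for a batch of 4 climate patches, 8 latent dims.
mu = np.zeros((4, 8))
log_var = np.zeros((4, 8))          # unit variance
z = reparameterize(mu, log_var, rng)

# The abstract's LSTM consumes a concatenation of embedding, mean and
# standard deviation per time step; here we mimic that concatenation.
features = np.concatenate([z, mu, np.exp(0.5 * log_var)], axis=1)
```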
NASA Astrophysics Data System (ADS)
Faranda, D.; Yiou, P.; Alvarez-Castro, M. C. M.
2015-12-01
A combination of dynamical systems and statistical techniques allows for a robust assessment of the dynamical properties of the mid-latitude atmospheric circulation. Extremes at different spatial and time scales are associated not only with exceptionally intense weather structures (e.g., extra-tropical cyclones) but also with rapid changes of circulation regimes (thunderstorms, supercells) or with the extreme persistence of a weather structure (heat waves, cold spells). We will show how the dynamical systems theory of recurrences, combined with extreme value theory, can take into account the spatial and temporal dependence structure of the mid-latitude circulation and provide information on the statistics of extreme events.
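One concrete form of this recurrence-plus-EVT machinery estimates a local dimension from exceedances of the observable −log(distance to a reference state): for a high threshold the excesses are approximately exponential with mean 1/d. A sketch on toy i.i.d. points in the plane, where the method should recover d ≈ 2; this is our illustration under that toy assumption, not the atmospheric data used by the authors:

```python
import numpy as np

rng = np.random.default_rng(42)

# Surrogate "circulation" states: 100k i.i.d. points in the unit square.
states = rng.random((100_000, 2))
zeta = np.array([0.5, 0.5])               # reference state (recurrence point)

g = -np.log(np.linalg.norm(states - zeta, axis=1))   # closeness observable
u = np.quantile(g, 0.98)                  # high threshold
excesses = g[g > u] - u

# Excesses of g are ~exponential with mean 1/d, so the local dimension
# is the inverse mean excess; for uniform 2-D data this should be ~2.
local_dim = 1.0 / excesses.mean()
```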
NASA Astrophysics Data System (ADS)
Raschke, Mathias
2016-02-01
In this short note, I comment on the research of Pisarenko et al. (Pure Appl. Geophys 171:1599-1624, 2014) regarding extreme value theory and statistics in the case of earthquake magnitudes. The link between the generalized extreme value distribution (GEVD) as an asymptotic model for the block maxima of a random variable and the generalized Pareto distribution (GPD) as a model for the peaks over threshold (POT) of the same random variable is presented more clearly. Inappropriately, Pisarenko et al. (2014) have neglected to note that the approximations by the GEVD and GPD work only asymptotically in most cases. This is particularly the case for the truncated exponential distribution (TED), a popular distribution model for earthquake magnitudes. I explain why the classical models and methods of extreme value theory and statistics do not work well for truncated exponential distributions. Consequently, these classical methods should not be used for the estimation of the upper bound magnitude and corresponding parameters. Furthermore, I comment on various issues of statistical inference in Pisarenko et al. and propose alternatives. I argue why the GPD and GEVD would work for various types of stochastic earthquake processes in time, and not only for the homogeneous (stationary) Poisson process as assumed by Pisarenko et al. (2014). The crucial point for earthquake magnitudes is the poor convergence of their tail distribution to the GPD, and not the earthquake process over time.
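The slow convergence claimed here can be checked in closed form: for a TED on [0, M] the excess distribution over a threshold u is again truncated exponential on [0, M − u], and its GPD limit (shape ξ = −1, i.e. a uniform excess law) is approached only as u → M. A sketch with illustrative Gutenberg-Richter-like parameters (b = 1, so β = ln 10, and upper bound M = 9), measuring the gap from the uniform limit at the midpoint of the excess range:

```python
import math

def ted_excess_cdf(x, u, beta, M):
    """CDF of the excess X - u over threshold u when X follows a
    truncated exponential distribution (TED) on [0, M]."""
    return (1 - math.exp(-beta * x)) / (1 - math.exp(-beta * (M - u)))

# Gutenberg-Richter-like magnitudes: b = 1 gives beta = b * ln(10).
beta, M = math.log(10.0), 9.0

# The limiting GPD here has shape xi = -1 (uniform excesses), whose CDF
# at the midpoint of [0, M - u] is exactly 0.5; the gap below measures
# how far the TED excess law still is from that limit.
gaps = [abs(ted_excess_cdf((M - u) / 2, u, beta, M) - 0.5)
        for u in (5.0, 7.0, 8.5)]
```

The gap shrinks as the threshold rises, but remains large even at u = 8.5, illustrating the poor tail convergence that is the crux of the comment.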
NASA Astrophysics Data System (ADS)
Watkins, N. W.
2013-01-01
I review the hierarchy of approaches to complex systems, focusing particularly on stochastic equations. I discuss how the main models advocated by the late Benoit Mandelbrot fit into this classification, and how they continue to contribute to cross-disciplinary approaches to the increasingly important problems of correlated extreme events and unresolved scales. The ideas have broad importance, with applications ranging across areas of science as diverse as the heavy-tailed distributions of intense rainfall in hydrology, after which Mandelbrot named the "Noah effect"; the problem of correlated runs of dry summers in climate, after which the "Joseph effect" was named; and the intermittent, bursty volatility seen in finance and fluid turbulence.
Optimal bounds and extremal trajectories for time averages in nonlinear dynamical systems
NASA Astrophysics Data System (ADS)
Tobasco, Ian; Goluskin, David; Doering, Charles R.
2018-02-01
For any quantity of interest in a system governed by ordinary differential equations, it is natural to seek the largest (or smallest) long-time average among solution trajectories, as well as the extremal trajectories themselves. Upper bounds on time averages can be proved a priori using auxiliary functions, the optimal choice of which is a convex optimization problem. We prove that the problems of finding maximal trajectories and minimal auxiliary functions are strongly dual. Thus, auxiliary functions provide arbitrarily sharp upper bounds on time averages. Moreover, any nearly minimal auxiliary function provides phase space volumes in which all nearly maximal trajectories are guaranteed to lie. For polynomial equations, auxiliary functions can be constructed by semidefinite programming, which we illustrate using the Lorenz system.
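The auxiliary-function bound can be seen in a one-line example: for the scalar ODE x′ = x − x³ with Φ = x², the auxiliary function V(x) = x²/2 certifies that every long-time average of x² is at most sup_x [Φ(x) + V′(x)f(x)] = 1, and the bound is sharp because trajectories settle at x = ±1. A minimal numerical sketch; this toy system is our illustration, not an example from the paper:

```python
import numpy as np

# System x' = f(x) with f(x) = x - x**3; quantity of interest phi = x**2.
f = lambda x: x - x**3
phi = lambda x: x**2

# Auxiliary-function bound: for any V, the long-time average of phi is
# at most sup_x [phi(x) + V'(x) f(x)].  Try V(x) = x**2 / 2, so V' = x:
# the expression becomes 2x^2 - x^4, whose supremum is 1 at x = +-1.
grid = np.linspace(-2.0, 2.0, 4001)
bound = np.max(phi(grid) + grid * f(grid))

# Check against a trajectory: forward-Euler integration from x0 = 0.1.
x, dt, xs = 0.1, 0.01, []
for _ in range(5000):
    x += dt * f(x)
    xs.append(x)
time_avg = np.mean([phi(v) for v in xs[2500:]])  # discard transient
```

Here the trajectory average approaches the bound, illustrating the strong duality the paper proves: a nearly optimal V gives a nearly attained bound.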
Common foot problems in diabetic foot clinic.
Tantisiriwat, Natthiya; Janchai, Siriporn
2008-07-01
To study common foot problems presenting in a diabetic foot clinic. A retrospective review of outpatient department records and diabetic foot evaluation forms of patients who visited the diabetic foot clinic at King Chulalongkorn Memorial Hospital between 2004 and 2006. Seventy men and 80 women with an average age of 63.8 years were included in this study. About 32% of all reported cases had a lower extremity amputation, of which the toe was the most common level. Foot problems were evaluated and categorized in four aspects, dermatological, neurological, musculoskeletal, and vascular, found in 67.3%, 79.3%, 74.0%, and 39.3% respectively. More than half of the patients had skin dryness, nail problems, and callus formation. Fifty-six percent had an area of abnormal plantar pressure, which presented as callus. The great toe was the most common site of callus formation, which was correlated with the gait cycle. Current ulcers were found in 18.8%, mostly at the heel and great toe. Three-fourths of the patients (75.3%) had lost protective sensation, as measured by 5.07 monofilament testing. The most common musculoskeletal problem was limited joint motion (44.0%). Claw toe or hammer toe was reported in 32.0%, whereas the other deformities were bunion (12.0%), Charcot joint (6.0%), and flat feet (5.3%). The authors classified patients into four groups based on risk category for further lower extremity amputation. Forty-seven percent had the highest risk of further amputation because they had lost protective sensation on monofilament testing, had a previous or current ulcer, or had a history of amputation. Only half of the patients had received previous foot care education. Multidisciplinary diabetic foot care including patient education (proper foot care and footwear), early detection, effective management of foot problems, and scheduled follow-up must be emphasized to prevent diabetes-related lower extremity amputation.
One-dimensional Gromov minimal filling problem
NASA Astrophysics Data System (ADS)
Ivanov, Alexandr O.; Tuzhilin, Alexey A.
2012-05-01
The paper is devoted to a new branch in the theory of one-dimensional variational problems with branching extremals, the investigation of one-dimensional minimal fillings introduced by the authors. On the one hand, this problem is a one-dimensional version of a generalization of Gromov's minimal fillings problem to the case of stratified manifolds. On the other hand, this problem is interesting in itself and also can be considered as a generalization of another classical problem, the Steiner problem on the construction of a shortest network connecting a given set of terminals. Besides the statement of the problem, we discuss several properties of the minimal fillings and state several conjectures. Bibliography: 38 titles.
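The Steiner problem mentioned above can be illustrated in its simplest nontrivial case: three terminals, where the shortest network is the star through the Fermat point. A sketch using the classical Weiszfeld iteration (our illustration, not the authors' construction) on an equilateral triangle with unit sides, where the minimal network has length √3 versus 2 for the minimum spanning tree:

```python
import numpy as np

# Terminals of a unit equilateral triangle; the shortest network joining
# them (Steiner tree) is the star through the Fermat point, here the centroid.
terminals = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, np.sqrt(3) / 2]])

def fermat_point(pts, iters=200):
    """Weiszfeld iteration for the point minimizing summed distances."""
    p = pts.mean(axis=0)
    for _ in range(iters):
        d = np.linalg.norm(pts - p, axis=1)
        w = 1.0 / np.maximum(d, 1e-12)     # guard against division by zero
        p = (w[:, None] * pts).sum(axis=0) / w.sum()
    return p

p = fermat_point(terminals)
network_length = np.linalg.norm(terminals - p, axis=1).sum()   # sqrt(3)
```

Minimal fillings generalize exactly this kind of length-minimizing network to weighted graphs spanning a metric space.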
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Chao ..; Singh, Vijay P.; Mishra, Ashok K.
2013-02-06
This paper presents an improved bivariate mixed distribution, which is capable of modeling the dependence of daily rainfall from two distinct sources (e.g., rainfall from two stations, two consecutive days, or two instruments such as satellite and rain gauge). The distribution couples an existing framework for building a bivariate mixed distribution, the theory of copulae, and a hybrid marginal distribution. Contributions of the improved distribution are twofold. One is the appropriate selection of the bivariate dependence structure from a wider admissible choice (10 candidate copula families). The other is the introduction of a marginal distribution capable of better representing low to moderate values as well as extremes of daily rainfall. Among several applications of the improved distribution, particularly presented here is its utility for single-site daily rainfall simulation. Rather than simulating rainfall occurrences and amounts separately, the developed generator unifies the two processes by generalizing daily rainfall as a Markov process with autocorrelation described by the improved bivariate mixed distribution. The generator is first tested on a sample station in Texas. Results reveal that the simulated and observed sequences are in good agreement with respect to essential characteristics. Then, extensive simulation experiments are carried out to compare the developed generator with three other alternative models: the conventional two-state Markov chain generator, the transition probability matrix model, and the semi-parametric Markov chain model with kernel density estimation for rainfall amounts. Analyses establish that overall the developed generator is capable of reproducing characteristics of historical extreme rainfall events and is apt at extrapolating rare values beyond the upper range of available observed data.
Moreover, it automatically captures the persistence of rainfall amounts on consecutive wet days in a relatively natural and easy way. Another interesting observation is that the recognized 'overdispersion' problem in daily rainfall simulation owes more to the loss of rainfall extremes than to the under-representation of first-order persistence. The developed generator appears to be a sound option for daily rainfall simulation, especially in hydrologic planning situations where rare rainfall events are of great importance.
NASA Astrophysics Data System (ADS)
Castiglioni, S.; Toth, E.
2009-04-01
In the calibration procedure of continuously-simulating models, the hydrologist has to choose which part of the observed hydrograph is most important to fit, either implicitly, through the visual agreement in manual calibration, or explicitly, through the choice of the objective function(s). By changing the objective function it is in fact possible to emphasise different kinds of errors, giving them more weight in the calibration phase. The objective functions used for calibrating hydrological models are generally of the quadratic type (mean squared error, correlation coefficient, coefficient of determination, etc.) and are therefore oversensitive to high and extreme error values, which typically correspond to high and extreme streamflow values. This is appropriate when, as in the majority of streamflow forecasting applications, the focus is on the ability to reproduce potentially dangerous flood events; on the contrary, if the aim of the modelling is the reproduction of low and average flows, as is the case in water resource management problems, this may result in a deterioration of the forecasting performance. This contribution presents the results of a series of automatic calibration experiments of a continuously-simulating rainfall-runoff model applied over several real-world case-studies, where the objective function is chosen so as to highlight the fit of average and low flows. In this work a simple conceptual model will be used, of the lumped type, with a relatively low number of parameters to be calibrated. The experiments will be carried out for a set of case-study watersheds in Central Italy, covering an extremely wide range of geo-morphologic conditions and for which at least five years of contemporary daily series of streamflow, precipitation and evapotranspiration estimates are available.
Different objective functions will be tested in calibration and the results will be compared, over validation data, against those obtained with traditional squared functions. A companion work presents the results, over the same case-study watersheds and observation periods, of a system-theoretic model, again calibrated for reproducing average and low streamflows.
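A tiny illustration of why the choice of objective function matters (synthetic numbers, not the study's data): plain squared error ranks a peak-fitting simulation best, while the same error computed on log-transformed flows ranks the low-flow-fitting simulation best:

```python
import numpy as np

def mse(sim, obs):
    return np.mean((sim - obs) ** 2)

# Observed flows with one flood peak; model A matches the peak, model B
# matches the recessions/low flows (all values are illustrative).
obs   = np.array([1.0, 0.8, 0.5, 0.4, 50.0, 10.0, 2.0, 0.9])
sim_a = np.array([1.5, 1.4, 1.1, 1.0, 50.0, 10.0, 2.5, 1.4])  # good peak
sim_b = np.array([1.0, 0.8, 0.5, 0.4, 40.0,  8.0, 2.0, 0.9])  # good low flows

# Plain squared error rewards the peak-fitting model ...
plain = mse(sim_a, obs), mse(sim_b, obs)
# ... while squared error on log flows emphasises low/average flows.
logged = mse(np.log(sim_a), np.log(obs)), mse(np.log(sim_b), np.log(obs))
```

The log transform is only one of many low-flow-oriented choices; the study tests several such objective functions against the traditional quadratic ones.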
A lexicon based method to search for extreme opinions
Almatarneh, Sattam; Gamallo, Pablo
2018-01-01
Studies in sentiment analysis and opinion mining have focused on many aspects of opinions, namely polarity classification using positive, negative or neutral values. However, most studies have overlooked the identification of extreme opinions (the most negative and most positive opinions) in spite of their significance in many applications. We use an unsupervised approach to search for extreme opinions, based on the automatic construction of a new lexicon containing the most negative and most positive words. PMID:29799867
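The lexicon-based idea can be sketched in a few lines: score each text by summing per-word lexicon values and pick the extremes. The lexicon entries and scores below are invented for illustration, not the paper's actual lexicon:

```python
# Toy extreme-opinion lexicon: magnitude 2 marks "most negative" /
# "most positive" words, magnitude 1 ordinary polarity words.
extreme_lexicon = {
    "awful": -2, "terrible": -2, "bad": -1,
    "good": 1, "superb": 2, "magnificent": 2,
}

def opinion_score(text):
    """Sum lexicon scores of the words in a review (0 if unknown)."""
    return sum(extreme_lexicon.get(w, 0) for w in text.lower().split())

reviews = [
    "an awful terrible film",
    "a good film",
    "a superb magnificent film",
]
scores = [opinion_score(r) for r in reviews]
most_negative = reviews[scores.index(min(scores))]
most_positive = reviews[scores.index(max(scores))]
```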
Interstellar absorption of the extreme ultraviolet flux from two hot white dwarfs
NASA Technical Reports Server (NTRS)
Cash, W.; Bowyer, S.; Lampton, M.
1979-01-01
Photometric upper limits on the 300 A flux from the hot white dwarfs Feige 24 and G191-B2B are presented. The limits, which were obtained with a rocket-borne extreme ultraviolet imaging telescope, are interpreted as lower limits on the density of the intervening interstellar matter. The limits are used to investigate the state of interstellar gas within 100 pc. A local clumpiness factor, which is of value in planning future extreme ultraviolet observations, is derived.
2017-12-01
anger and violence of those groups improperly judged as “inferior,” especially when a society fails to instill a sense of communal values in the overall...more likely when people experience ‘collective strains’ that are: 1. High in Magnitude with civilians affected; 2. Unjust; 3. Inflicted by... American efforts to counter violent extremism, “if properly implemented, can help sway young people from radicalizing, thereby saving lives and enabling
Rankin, Jeffery W.; Kwarciak, Andrew M.; Richter, W. Mark; Neptune, Richard R.
2010-01-01
Manual wheelchair propulsion has been linked to a high incidence of overuse injury and pain in the upper extremity, which may be caused by the high load requirements and low mechanical efficiency of the task. Previous studies have suggested that poor mechanical efficiency may be due to a low effective handrim force (i.e., much of the applied force is not directed tangentially to the handrim). As a result, studies attempting to reduce upper extremity demand have used various measures of force effectiveness (e.g. fraction effective force, FEF) as a guide for modifying propulsion technique, developing rehabilitation programs and configuring wheelchairs. However, the relationship between FEF and upper extremity demand is not well understood. The purpose of this study was to use forward dynamics simulations of wheelchair propulsion to determine the influence of FEF on upper extremity demand by quantifying individual muscle stress, work and handrim force contributions at different values of FEF. Simulations maximizing and minimizing FEF resulted in higher average muscle stresses (23% and 112%) and total muscle work (28% and 71%) compared to a nominal FEF simulation. The maximal FEF simulation also shifted muscle use from muscles crossing the elbow to those at the shoulder (e.g. rotator cuff muscles), placing greater demand on shoulder muscles during propulsion. The optimal FEF value appears to represent a balance between increasing push force effectiveness to increase mechanical efficiency and minimizing upper extremity demand. Thus, care should be taken in using force effectiveness as a metric to reduce upper extremity demand. PMID:20674921
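The effective (tangential) component of a handrim force is simple geometry: only the part of the applied force perpendicular to the wheel radius at the contact point produces propulsive torque. A hedged sketch of computing that component and a ratio-style effectiveness measure; the handrim radius and force values are invented for illustration:

```python
import numpy as np

# Contact point on a 0.26 m handrim (wheel centre at the origin) and an
# applied force vector in the wheel plane (values are illustrative).
contact = np.array([0.0, -0.26])
force = np.array([30.0, -40.0])               # N

radial_dir = contact / np.linalg.norm(contact)
tangent_dir = np.array([-radial_dir[1], radial_dir[0]])  # 90 deg rotation

f_tan = abs(force @ tangent_dir)              # torque-producing component
fef_percent = 100.0 * f_tan / np.linalg.norm(force)
```

Here 30 N of a 50 N resultant is tangential, i.e. 60% effectiveness; the study's point is that driving this number toward 100% is not automatically beneficial for the shoulder.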
Wang, Liming; Wei, Jingjing; Su, Zhaohui
2011-12-20
High contact angle hysteresis on polyelectrolyte multilayers (PEMs) ion-paired with hydrophobic perfluorooctanoate anions is reported. Both the bilayer number of the PEMs and the ionic strength of the deposition solutions have a significant influence on contact angle hysteresis: higher ionic strength and greater bilayer number cause increased contact angle hysteresis values. Hysteresis values of ~100° were observed on smooth PEMs, and pinning of the receding contact line on hydrophilic defects is implicated as the cause of the hysteresis. Surface roughness can be used to further tune the contact angle hysteresis on the PEMs. A surface with extremely high contact angle hysteresis of 156° was fabricated when a PEM was deposited on a rough substrate coated with submicrometer-scale silica spheres. It was demonstrated that this extremely high value of contact angle hysteresis resulted from the penetration of water into the rough asperities on the substrate. The same substrate hydrophobized by chemical vapor deposition of 1H,1H,2H,2H-perfluorooctyltriethoxysilane exhibits a high advancing contact angle and low hysteresis. © 2011 American Chemical Society
NASA Astrophysics Data System (ADS)
Shih, Hong-Yan; Goldenfeld, Nigel
Experiments on transitional turbulence in pipe flow seem to show that turbulence is a transient metastable state since the measured mean lifetime of turbulence puffs does not diverge asymptotically at a critical Reynolds number. Yet measurements reveal that the lifetime scales with Reynolds number in a super-exponential way reminiscent of extreme value statistics, and simulations and experiments in Couette and channel flow exhibit directed percolation type scaling phenomena near a well-defined transition. This universality class arises from the interplay between small-scale turbulence and a large-scale collective zonal flow, which exhibit predator-prey behavior. Why is asymptotically divergent behavior not observed? Using directed percolation and a stochastic individual level model of predator-prey dynamics related to transitional turbulence, we investigate the relation between extreme value statistics and power law critical behavior, and show that the paradox is resolved by carefully defining what is measured in the experiments. We theoretically derive the super-exponential scaling law, and using finite-size scaling, show how the same data can give both super-exponential behavior and power-law critical scaling.
Preliminary research of a novel center-driven robot for upper extremity rehabilitation.
Cao, Wujing; Zhang, Fei; Yu, Hongliu; Hu, Bingshan; Meng, Qiaoling
2018-01-19
Loss of upper limb function often appears after stroke. Robot-assisted systems are becoming increasingly common in upper extremity rehabilitation. A rehabilitation robot provides intensive motor therapy, which can be performed in a repetitive, accurate, and controllable manner. This study proposes a novel center-driven robot for upper extremity rehabilitation. A new power transmission mechanism is designed to transfer power to the elbow and shoulder joints from three motors located on the base. The forward and inverse kinematics equations of the center-driven robot (CENTROBOT) are deduced separately. The theoretical values of the range of joint movements are obtained with the Denavit-Hartenberg parameter method. A prototype of the CENTROBOT is developed and tested. Elbow flexion/extension, shoulder flexion/extension, and shoulder adduction/abduction can be realized by the center-driven robot. The measured joint angles are in conformity with the theoretical values. The CENTROBOT reduces the overall size of the robot arm and the influence of motor noise, radiation, and other adverse factors by placing all motors on the base. It can satisfy the requirements of power and movement transmission of the robot arm.
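The Denavit-Hartenberg method referred to above chains one standard link transform per joint to obtain the forward kinematics. A minimal sketch for a hypothetical two-link planar arm (shoulder plus elbow, 0.3 m links); the link parameters are our illustration, not the CENTROBOT's actual DH table:

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Standard Denavit-Hartenberg link transform (4x4 homogeneous)."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

# Hypothetical 2-link planar arm, both joints at 0 rad: the end effector
# should sit at x = 0.3 + 0.3 = 0.6 m along the base x-axis.
T = dh_transform(0.0, 0.0, 0.3, 0.0) @ dh_transform(0.0, 0.0, 0.3, 0.0)
end_effector = T[:3, 3]
```

Inverting these chained transforms (analytically or numerically) gives the inverse kinematics the abstract mentions.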
NASA Astrophysics Data System (ADS)
Shen, L.; Mickley, L. J.; Gilleland, E.
2016-04-01
We develop a statistical model using extreme value theory to estimate the 2000-2050 changes in ozone episodes across the United States. We model the relationships between daily maximum temperature (Tmax) and maximum daily 8 h average (MDA8) ozone in May-September over 2003-2012 using a point process (PP) model. At ~20% of the sites, a marked decrease in the ozone-temperature slope occurs at high temperatures, defined as ozone suppression. The PP model sometimes fails to capture ozone-Tmax relationships, so we refit the ozone-Tmax slope using logistic regression and a generalized Pareto distribution model. We then apply the resulting hybrid extreme value theory model to projections of Tmax from an ensemble of downscaled climate models. Assuming constant anthropogenic emissions at the present level, we find an average increase of 2.3 days per year in ozone episodes (>75 ppbv) across the United States by the 2050s, with changes of +3 to +9 days per year at many sites.
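The logistic-regression half of the hybrid model can be sketched simply: a fitted curve gives the probability that MDA8 ozone exceeds 75 ppbv as a function of Tmax, and summing it over a season's daily temperatures yields expected episode days. The coefficients and temperature series below are invented for illustration, not the study's fitted values:

```python
import math

# Hypothetical logistic model for P(MDA8 > 75 ppbv | Tmax in deg C).
b0, b1 = -12.0, 0.35

def p_episode(tmax_c):
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * tmax_c)))

# Toy May-September Tmax series, then the same season warmed by 2 C.
season_now  = [28.0] * 100 + [33.0] * 30 + [38.0] * 10
season_2050 = [t + 2.0 for t in season_now]

days_now  = sum(p_episode(t) for t in season_now)
days_2050 = sum(p_episode(t) for t in season_2050)
extra_days = days_2050 - days_now     # expected extra episode days per season
```

In the study this probability model is combined with a generalized Pareto tail where the PP model fails, and driven by downscaled climate-model Tmax rather than a fixed warming shift.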
An Initial Model of Requirements Traceability an Empirical Study
1992-09-22
procedures have been used extensively in the study of human problem-solving, including such areas as general problem-solving behavior, physics problem...been doing unless you have traceability." "Humans don't go back to the requirements enough." "Traceability should be extremely helpful with...by constraints on its usage: ("Traceability needs to be something that humans can work with, not just a whip held over people." "Traceability should
Heijman, Jordi; Algalarrondo, Vincent; Voigt, Niels; Melka, Jonathan; Wehrens, Xander H T; Dobrev, Dobromir; Nattel, Stanley
2016-04-01
Atrial fibrillation (AF) is an extremely common clinical problem associated with increased morbidity and mortality. Current antiarrhythmic options include pharmacological, ablation, and surgical therapies, and have significantly improved clinical outcomes. However, their efficacy remains suboptimal, and their use is limited by a variety of potentially serious adverse effects. There is a clear need for improved therapeutic options. Several decades of research have substantially expanded our understanding of the basic mechanisms of AF. Ectopic firing and re-entrant activity have been identified as the predominant mechanisms for arrhythmia initiation and maintenance. However, it has become clear that the clinical factors predisposing to AF and the cellular and molecular mechanisms involved are extremely complex. Moreover, all AF-promoting and maintaining mechanisms are dynamically regulated and subject to remodelling caused by both AF and cardiovascular disease. Accordingly, the initial presentation and clinical progression of AF patients are enormously heterogeneous. An understanding of arrhythmia mechanisms is widely assumed to be the basis of therapeutic innovation, but while this assumption seems self-evident, we are not aware of any papers that have critically examined the practical contributions of basic research into AF mechanisms to arrhythmia management. Here, we review recent insights into the basic mechanisms of AF, critically analyse the role of basic research insights in the development of presently used anti-AF therapeutic options and assess the potential value of contemporary experimental discoveries for future therapeutic innovation. Finally, we highlight some of the important challenges to the translation of basic science findings to clinical application. Published on behalf of the European Society of Cardiology. All rights reserved. © The Author 2015. For permissions please email: journals.permissions@oup.com.
Child, neglect and oral health
2013-01-01
Background Despite advancements in oral health policies, dental caries is still a problem. The lack of parental/caregiver care regarding a child's oral health, which characterizes neglect, may lead to a high prevalence of caries. Therefore, the objective of this study was to analyze the relation between dental caries and neglect in five-year-old children. Methods Quantitative study performed in two different moments. First, the children underwent oral examinations and physical inspection. Then, a semi-structured interview was performed with parents of children with high and low caries rates. Results In all, 149 physical inspections and oral exams were performed. The mean number of decayed, missing and filled teeth (dmf-t) was 2.75 (SD 2.83); 16 children had extremely high values (dmf-t ≥ 7), 85 intermediate values (1 ≤ dmf-t ≤ 6) and 48 extremely low values (dmf-t = 0). Nearly all caregivers were female (96.7%; n = 29), mostly mothers (93.3%; n = 28). Associations were found between caries experience and the reason for the last consultation (p = 0.011), and between decayed teeth and the child's oral health perception (p = 0.001). There was a trend towards a significant association between general health and decayed teeth (p = 0.079), general hygiene and caries experience (p = 0.083), and caries experience and the number of times the child brushes the teeth (p = 0.086). Conclusion There is a relation between caries experience and children's oral health perception by caregivers, as well as between caries experience and children's access to dental care. There is a trend towards association between caries experience and risk factors suggestive of neglect. PMID:24238222
NASA Astrophysics Data System (ADS)
Liu, Wei; Li, Ying-jun; Jia, Zhen-yuan; Zhang, Jun; Qian, Min
2011-01-01
In the working process of large heavy-load manipulators, such as free forging machines, hydraulic die-forging presses, forging manipulators, heavy grasping manipulators and large-displacement manipulators, measurement of the six-dimensional heavy force/torque and real-time force feedback at the operation interface are the basis for coordinated operation control and force-compliance control. They are also an effective way to raise control accuracy and achieve highly efficient manufacturing. To solve the problem of dynamically measuring six-dimensional, time-varying heavy loads in extreme manufacturing processes, a novel principle of parallel load sharing of the six-dimensional heavy force/torque is put forward. The measuring principle of the six-dimensional force sensor is analyzed, and its spatial model is built and decoupled. The load-sharing ratios in the vertical and horizontal directions are analyzed and calculated. The mapping relationship between the six-dimensional heavy force/torque to be measured and the output force value is established. A finite element model of the parallel piezoelectric six-dimensional heavy force/torque sensor is set up, and its static characteristics are analyzed with the ANSYS software. The main parameters affecting the load-sharing ratio are analyzed. Load-sharing experiments with parallel axes of different diameters are designed. The results show that the six-dimensional heavy force/torque sensor has good linearity, with non-linearity errors of less than 1%. The parallel axis provides a good load-sharing effect: the larger its diameter, the better the load sharing. The experimental results are in accordance with the FEM analysis. The sensor has the advantages of a large measuring range, good linearity, high natural frequency and high rigidity, and can be widely used in extreme environments for real-time, accurate measurement of six-dimensional, time-varying heavy loads on manipulators.
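The "mapping relationship" between the wrench to be measured and the channel outputs is, for a linear six-axis sensor, a 6×6 coupling matrix that is estimated by calibration and inverted for decoupling. The sketch below is a generic illustration of that idea, not the authors' actual model: the coupling matrix, load magnitudes, and noise level are all invented.

```python
import numpy as np

# Minimal sketch of static decoupling for a six-axis force/torque sensor.
# Raw channel outputs v are assumed linearly related to the applied wrench
# F = [Fx, Fy, Fz, Mx, My, Mz]:  v = C @ F.  Calibration estimates C from
# known loads; measurement inverts it.
rng = np.random.default_rng(0)
C_true = rng.normal(size=(6, 6)) + 6 * np.eye(6)   # hypothetical coupling matrix

# Calibration: apply 50 known wrenches, record channel outputs (with noise).
F_cal = rng.uniform(-1e3, 1e3, size=(6, 50))        # N (forces), N*m (torques)
V_cal = C_true @ F_cal + rng.normal(scale=1.0, size=(6, 50))

# Least-squares estimate of the coupling matrix, then invert it to decouple.
X, *_ = np.linalg.lstsq(F_cal.T, V_cal.T, rcond=None)
C_est = X.T
D = np.linalg.inv(C_est)                            # decoupling matrix

# Measurement: recover an unknown wrench from the raw channel outputs.
F_unknown = np.array([500.0, -200.0, 800.0, 50.0, -30.0, 10.0])
v = C_true @ F_unknown
F_recovered = D @ v
print(np.round(F_recovered, 1))
```

The recovered wrench agrees with the applied one to within the calibration noise; a real sensor would additionally need temperature compensation and a conditioning check on C before inversion.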
Modeling Future Fire danger over North America in a Changing Climate
NASA Astrophysics Data System (ADS)
Jain, P.; Paimazumder, D.; Done, J.; Flannigan, M.
2016-12-01
Fire danger ratings are used to determine wildfire potential due to weather and climate factors. The Fire Weather Index (FWI), part of the Canadian Forest Fire Danger Rating System (CFFDRS), incorporates temperature, relative humidity, wind speed and precipitation to give a daily fire danger rating that is used by wildfire management agencies in an operational context. Studies using GCM output have shown that future wildfire danger will increase in a warming climate. However, these studies are somewhat limited by the coarse spatial resolution (typically 100-400 km) and temporal resolution (typically 6-hourly to monthly) of the model output. Future wildfire potential over North America based on the FWI is calculated using output from the Weather Research and Forecasting (WRF) model, which is used to downscale future climate scenarios from the bias-corrected Community Climate System Model (CCSM) under the RCP8.5 scenario at a spatial resolution of 36 km. We consider five eleven-year time slices: 1990-2000, 2020-2030, 2030-2040, 2050-2060 and 2080-2090. The dynamically downscaled simulation improves the representation of future extreme weather by improving both spatial and temporal resolution relative to most GCMs. To characterize extreme fire weather we calculate the annual number of spread days (days for which FWI > 19) and the annual 99th percentile of the FWI. Additionally, an extreme value analysis based on the peaks-over-threshold method allows us to calculate return values for extreme FWI values.
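The three metrics named above (spread days with FWI > 19, the annual 99th percentile, and peaks-over-threshold return values) can be sketched on synthetic data. Only the 19-day spread threshold comes from the text; the FWI distribution, the 95th-percentile POT threshold, and the 20-year return period below are illustrative assumptions.

```python
import numpy as np
from scipy.stats import genpareto

# Eleven synthetic years of daily FWI values (distribution is illustrative).
rng = np.random.default_rng(42)
fwi = rng.gamma(shape=2.0, scale=5.0, size=365 * 11)

# Metric 1: annual number of spread days (FWI > 19, per the text).
spread_days_per_year = (fwi > 19).reshape(11, 365).sum(axis=1)

# Metric 2: 99th percentile of FWI.
p99 = np.percentile(fwi, 99)

# Metric 3: peaks-over-threshold return level. Fit a generalized Pareto
# distribution to exceedances over a high threshold u, then invert for the
# level exceeded on average once per T years.
u = np.percentile(fwi, 95)
exc = fwi[fwi > u] - u
shape, _, scale = genpareto.fit(exc, floc=0)
lam = exc.size / 11.0                     # mean exceedances of u per year
T = 20.0                                  # return period in years
return_level = u + genpareto.ppf(1 - 1 / (lam * T), shape, loc=0, scale=scale)
print(f"p99 = {p99:.1f}, {T:.0f}-yr return level = {return_level:.1f}")
```

In practice the exceedances would first be declustered so that multi-day fire-weather episodes are not counted as independent peaks.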
NASA Astrophysics Data System (ADS)
Martucci, G.; Carniel, S.; Chiggiato, J.; Sclavo, M.; Lionello, P.; Galati, M. B.
2010-06-01
The study is a statistical analysis of sea-state time series derived using the wave model WAM forced by the ERA-40 dataset in selected areas near the Italian coasts. For the period 1 January 1958 to 31 December 1999 the analysis yields: (i) the existence of a negative trend in the annual- and winter-averaged sea-state heights; (ii) the existence of a turning point in the late 1980s in the annual-averaged trend of sea-state heights at a site in the Northern Adriatic Sea; (iii) the overall absence of a significant trend in the annual-averaged mean durations of sea states over thresholds; (iv) the assessment of the extreme values on a timescale of a thousand years. The analysis uses two methods to obtain samples of extremes from the independent sea states: the r-largest annual maxima and the peaks-over-threshold. The two methods show statistical differences in retrieving the return values and, more generally, in describing the significant wave field. The r-largest annual maxima method provides more reliable predictions of the extreme values, especially for small return periods (<100 years). Finally, the study statistically proves the existence of decadal negative trends in the significant wave heights and thereby conveys useful information on the wave climatology of the Italian seas during the second half of the 20th century.
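The two sampling schemes contrasted above differ only in how the extreme sample is drawn from the same series. A minimal sketch on synthetic significant-wave-height data (the Weibull daily-Hs model, r = 5, and the 98th-percentile threshold are all illustrative assumptions, not the study's settings):

```python
import numpy as np

# 42 synthetic years (1958-1999) of daily significant wave height, in metres.
rng = np.random.default_rng(7)
years = 42
hs = rng.weibull(1.5, size=(years, 365)) * 1.2

# Scheme 1 -- r-largest annual maxima: keep the r biggest values of each year.
r = 5
r_largest = np.sort(hs, axis=1)[:, -r:]          # shape (years, r)

# Scheme 2 -- peaks-over-threshold: keep every value above a high threshold.
u = np.quantile(hs, 0.98)
pot_sample = hs[hs > u]

print(r_largest.shape, pot_sample.size)
```

The r-largest sample has a fixed size (r per year) and feeds a GEV-type fit, while the POT sample size varies with the threshold and feeds a generalized Pareto fit; both would require declustering of storm events before the fits are trusted.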