Sample records for severe windstorm testing

  1. Quantifying the Extremity of Windstorms for Regions Featuring Infrequent Events

    NASA Astrophysics Data System (ADS)

    Walz, M. A.; Leckebusch, G. C.; Kruschke, T.; Rust, H.; Ulbrich, U.

    2017-12-01

    This paper introduces the Distribution-Independent Storm Severity Index (DI-SSI). The DI-SSI represents an approach to quantifying the severity of exceptional surface wind speeds in large-scale windstorms that is complementary to the Storm Severity Index (SSI) introduced by Leckebusch et al. (2008). While the SSI approaches the extremeness of a storm from a meteorological and potential-loss (impact) perspective, the DI-SSI defines severity from a more climatological perspective. The idea is to assign equal index values to wind speeds of the same rarity (e.g. the 99th percentile), taking the shape of the tail of the local wind-speed climatology into account. Especially in regions at the edge of the classical storm track, the DI-SSI yields more equitable severity estimates, e.g. for the extra-tropical cyclone Klaus. Here we compare the integral severity indices for several prominent windstorms in the European domain and discuss the advantages and disadvantages of each index. In order to compare the indices, their relation to the North Atlantic Oscillation (NAO), one of the main large-scale drivers of the intensity of European windstorms, is studied. Additionally, we identify a significant relationship between the frequency and intensity of windstorms for large parts of the European domain.
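
    The SSI referred to above is commonly formulated as a cubic exceedance of the local 98th-percentile wind speed summed over a storm's grid cells and time steps. The following is a minimal sketch of that idea, not the authors' implementation; array shapes and variable names are illustrative assumptions.

    ```python
    import numpy as np

    def storm_severity_index(wind, v98):
        """Cubic exceedance of the local 98th-percentile wind speed, summed
        over all grid cells and time steps of one storm (illustrative sketch).

        wind : array (time, lat, lon) of wind speeds [m/s]
        v98  : array (lat, lon) of the local 98th-percentile climatology [m/s]
        """
        exceedance = np.maximum(wind / v98 - 1.0, 0.0)  # zero below threshold
        return float(np.sum(exceedance ** 3))
    ```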

  2. Two outstanding windstorms on 7 December 1868 and 26/27 October 1870 in the Czech Lands: course, extent, impacts

    NASA Astrophysics Data System (ADS)

    Brázdil, Rudolf; Stucki, Peter; Szabó, Péter; Dobrovolný, Petr; Řezníčková, Ladislava; Kotyza, Oldřich; Valášek, Hubert; Dolák, Lukáš; Zahradníček, Pavel; Suchánková, Silvie

    2017-04-01

    Because of the relatively short series of wind-speed measurements (starting in the Czech Lands during the first half of the 20th century), documentary evidence (chronicles and memoirs, economic and financial reports, newspapers, forestry journals etc.) represents an important source of information for the study of past outstanding windstorms. Two such windstorms, on 7 December 1868 and 26/27 October 1870, the most damaging windstorms of the 19th century, are presented with respect to their course, spatial extent and damaging impacts. Combining documentary data and systematic meteorological observations (wind force and direction) with information derived from an atmospheric reanalysis dataset allows the hurricane-force severity of both windstorms to be attributed to the passage of a cold front, during the day on 7 December 1868 and during the night of 26/27 October 1870. The time of occurrence influenced human losses: at least 27 fatalities and 38 people injured, many of them seriously, in the first case, compared to five documented fatalities and five injured in the second. Severe damage to buildings and other structures as well as forest damage was documented for 237 and 174 places (plus 28 city quarters in Prague), respectively. The 1868 windstorm damaged at least 8 million cubic metres of timber, which is arguably more than has been lost to any single similar event since in the Czech Lands. The 1870 windstorm totally devastated many forested areas of the Šumava Mts. in south-west Bohemia. Because the 1870 windstorm followed only shortly upon the previous event in 1868, the enormous quantity of windthrown wood in forests, which could not be processed quickly, contributed significantly to a subsequent bark-beetle infestation calamity in the 1870s. In certain forest stands, imprints of these aggregate effects appear to this day. The (Central) European scale of both windstorms is also well documented by meteorological and documentary data from other countries. (This work was supported by the Czech Science Foundation, project no. 15-11805S "Windstorms in the Czech Lands during the past 500 years".)

  3. Floods and windstorms in the Czech Republic during the past millennium: synthesis of documentary and instrumental data

    NASA Astrophysics Data System (ADS)

    Brázdil, R.; Dobrovolný, P.; Valášek, H.; Kotyza, O.

    2009-09-01

    Floods and windstorms are the most disastrous natural events occurring on the territory of the Czech Republic. Study of their frequency, severity, seasonality, causes and impacts on a long-term scale is important for saving human lives and reducing material losses. Information related to these phenomena from the period of instrumental hydrological and meteorological measurements can be significantly extended by using documentary evidence going back to the 12th century. The basic types of documentary evidence containing information about floods and windstorms are presented and methodological problems in the elaboration of such evidence are discussed. Synoptic causes of floods and windstorms in the Czech Republic are demonstrated. Series of these phenomena compiled for the instrumental and pre-instrumental periods are then used to create synthesis series, namely for floods of the main rivers of the Czech Republic (the Vltava, the Ohře, the Elbe, the Odra and the Morava) and for windstorms divided according to the type of event and the extent and character of damage. Moreover, the most disastrous events ("floods and windstorms of the century") are analysed in particular detail. Finally, floods and windstorms are discussed in the context of past long-term climate variability.

  4. The second most disastrous windstorm of the nineteenth century in the Czech Lands, 26-27 October 1870

    NASA Astrophysics Data System (ADS)

    Brázdil, Rudolf; Stucki, Peter; Szabó, Péter; Dobrovolný, Petr; Řezníčková, Ladislava; Kotyza, Oldřich; Valášek, Hubert; Dolák, Lukáš; Zahradníček, Pavel; Suchánková, Silvie

    2018-05-01

    One of the most disastrous windstorms to take place over the Czech Lands occurred on the night of 26/27 October 1870. It is here analysed through the use of documentary data (narrative sources, newspapers, forestry journals, printed documents) and systematic meteorological observations (wind force and direction). Combining this evidence with information derived from an atmospheric reanalysis dataset allows the severity of the windstorm to be attributed to the passage of a cold front, a frontal system associated with a secondary low in a typically storm-prone synoptic environment. Its social impacts were characterised by great material damage, particularly to buildings and other structures, trees and forests. These are recorded not only for 174 places around the countryside and lesser settlements of the Czech Lands, but also for 28 city quarters in Prague, the capital city. The windstorm occurred in the night hours, so only a few people were killed or injured. However, the 1870 windstorm totally devastated many forested areas of the Šumava Mts. in south-west Bohemia. Damage to forests in other parts of the Czech Lands was also severe, but difficult to quantify exactly for lack of high-resolution spatial data. Because this windstorm followed only shortly upon a previous similarly disastrous wind event on 7 December 1868, the enormous quantity of windthrown wood in forests, which could not be processed quickly, contributed significantly to a subsequent bark-beetle infestation calamity in the 1870s. In certain forest stands, imprints of these aggregate effects appear to this day. The central-European scale of the 1870 windstorm is also well documented by meteorological and documentary data from Germany, Austria and Slovakia.

  5. European Wintertime Windstorms and its Links to Large-Scale Variability Modes

    NASA Astrophysics Data System (ADS)

    Befort, D. J.; Wild, S.; Walz, M. A.; Knight, J. R.; Lockwood, J. F.; Thornton, H. E.; Hermanson, L.; Bett, P.; Weisheimer, A.; Leckebusch, G. C.

    2017-12-01

    Winter storms associated with extreme wind speeds and heavy precipitation are the most costly natural hazard in several European countries. Improved understanding and seasonal forecast skill for winter storms will thus help society, policy-makers and the (re)insurance industry to be better prepared for such events. We first assess the ability of three seasonal forecast ensemble suites, ECMWF System3, ECMWF System4 and GloSea5, to represent extra-tropical windstorms over the Northern Hemisphere. Our results show significant skill for the inter-annual variability of windstorm frequency over parts of Europe in two of these forecast suites (ECMWF-S4 and GloSea5), indicating the potential usefulness of current seasonal forecast systems. In a regression model we further derive windstorm variability from the forecasted NAO of the seasonal model suites, thus estimating the suitability of the NAO as the only predictor. We find that the NAO, as the main large-scale mode over Europe, can explain some of the achieved skill and is therefore an important source of variability in the seasonal models. However, our results show that the regression model fails to reproduce the skill level of the directly forecast windstorm frequency over large areas of central Europe. This suggests that the seasonal models also capture sources of windstorm variability/predictability other than the NAO. In order to investigate which other large-scale variability modes steer the interannual variability of windstorms, we develop a statistical model using a Poisson GLM. We find that the Scandinavian Pattern (SCA) in fact explains a larger amount of variability for Central Europe during the 20th century than the NAO. This statistical model is able to skilfully reproduce the interannual variability of windstorm frequency, especially for the British Isles and Central Europe, with correlations of up to 0.8.
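
    As a hedged illustration of the kind of Poisson GLM mentioned above (relating seasonal windstorm counts to large-scale indices such as the NAO and SCA), a minimal sketch with synthetic placeholder data might look as follows; the predictor set and data are not the authors' actual configuration.

    ```python
    import numpy as np
    import statsmodels.api as sm

    # Illustrative data: seasonal (DJF) windstorm counts and standardised indices.
    rng = np.random.default_rng(0)
    nao = rng.standard_normal(100)
    sca = rng.standard_normal(100)
    counts = rng.poisson(np.exp(0.5 + 0.3 * sca + 0.1 * nao))

    # Poisson regression of counts on the two large-scale indices.
    X = sm.add_constant(np.column_stack([nao, sca]))
    glm = sm.GLM(counts, X, family=sm.families.Poisson())
    result = glm.fit()
    print(result.params)   # intercept and coefficients for NAO, SCA
    ```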

  6. Extra-tropical Cyclones and Windstorms in Seasonal Forecasts

    NASA Astrophysics Data System (ADS)

    Leckebusch, Gregor C.; Befort, Daniel J.; Weisheimer, Antje; Knight, Jeff; Thornton, Hazel; Roberts, Julia; Hermanson, Leon

    2015-04-01

    Severe damage and large insured losses over Europe related to natural phenomena are mostly caused by extra-tropical cyclones and their related windstorm fields. Thus, an adequate representation of these events in seasonal prediction systems and reliable forecasts up to a season in advance would be of high value for society and the economy. In this study, state-of-the-art seasonal forecast prediction systems (ECMWF, UK Met Office) are analysed with regard to the general climatological representation and the seasonal prediction of extra-tropical cyclones and windstorms during the core winter season (DJF) with a lead time of up to four months. Two different algorithms are used to identify cyclones and windstorm events in these datasets. Firstly, we apply a cyclone identification and tracking algorithm based on the Laplacian of MSLP; secondly, we use an objective wind-field tracking algorithm to identify and track contiguous areas of extremely high wind speeds (cf. Leckebusch et al., 2008), which can be related to extra-tropical winter cyclones. Thus, for the first time, we can analyse the forecast of severe wind events near the surface caused by extra-tropical cyclones. First results suggest that the spatial climatological distributions of windstorm and cyclone occurrence in the seasonal forecast systems generally validate well against reanalysis data (ECMWF ERA-40 and ERA-Interim). However, large biases are found for some areas. The skill of the seasonal forecast systems in simulating the year-to-year variability of the frequency of severe windstorm events and cyclones is investigated using the ranked probability skill score. Positive skill is found over large parts of the Northern Hemisphere as well as for the most intense extra-tropical cyclones and their related wind fields.
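
    For reference, the ranked probability skill score used for this kind of categorical verification can be sketched as below; the category definitions and the climatological reference forecast are illustrative assumptions, not the study's setup.

    ```python
    import numpy as np

    def rps(forecast_probs, obs_category):
        """Ranked probability score for one forecast.
        forecast_probs : probabilities per ordered category (sum to 1)
        obs_category   : index of the observed category
        """
        cdf_forecast = np.cumsum(forecast_probs)
        cdf_observed = np.cumsum(np.eye(len(forecast_probs))[obs_category])
        return float(np.sum((cdf_forecast - cdf_observed) ** 2))

    def rpss(forecasts, observations, climatology):
        """Skill relative to a climatological reference forecast (1 = perfect)."""
        rps_fcst = np.mean([rps(f, o) for f, o in zip(forecasts, observations)])
        rps_clim = np.mean([rps(climatology, o) for o in observations])
        return 1.0 - rps_fcst / rps_clim
    ```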

  7. Windstorm damage in Boundary Waters Canoe Area Wilderness (Minnesota, USA): Evaluating landscape-level risk factors

    Treesearch

    W. Keith Moser; Mark D. Nelson

    2009-01-01

    Ecosystem management requires an understanding of disturbance processes and their influence on forests. One of these disturbances is damage due to severe wind events. In an ideal model, assessing risk of windstorm damage to a forested ecosystem entails defining tree-, stand-, and landscape-level factors that influence response and recovery. Data are not always...

  8. Generation of a Catalogue of European Windstorms

    NASA Astrophysics Data System (ADS)

    Varino, Filipa; Granier, Jean-Baptiste; Bordoy, Roger; Arbogast, Philippe; Joly, Bruno; Riviere, Gwendal; Fandeur, Marie-Laure; Bovy, Henry; Mitchell-Wallace, Kirsten; Souch, Claire

    2016-04-01

    The probability of multiple windstorm events within a year is crucial to any (re)insurance company writing European wind business. Indeed, the volatility of losses is enhanced by the clustering of storms (cyclone families), as occurred in early 1990 (Daria, Vivian, Wiebke), December 1999 (Lothar, Martin) or December 2015 (Desmond, Eva, Frank), among others. In order to track winter extratropical cyclones, we use the maximum relative vorticity at 850 hPa from the newly released long-term ERA-20C reanalysis of the ECMWF, covering the beginning of the 20th century until 2010. We develop an automatic procedure to define events. We then quantify the severity of each storm using loss and meteorological indices at country and Europe-wide level. Validation against market losses for the period 1970-2010 is undertaken before considering the severity and frequency of European windstorms over the 110-year period.

  9. Assessing forest windthrow damage using single-date, post-event airborne laser scanning data

    Treesearch

    Gherardo Chirici; Francesca Bottalico; Francesca Giannetti; Barbara Del Perugia; Davide Travaglini; Susanna Nocentini; Erico Kutchartt; Enrico Marchi; Cristiano Foderi; Marco Fioravanti; Lorenzo Fattorini; Lorenzo Bottai; Ronald McRoberts; Erik Næsset; Piermaria Corona; Bernardo Gozzini

    2017-01-01

    One of many possible climate change effects in temperate areas is an increase in the frequency and severity of windstorms; thus, fast and cost-efficient new methods are needed to evaluate wind-induced damage in forests. We present a method for assessing windstorm damages in forest landscapes based on a two-stage sampling strategy using single-date, post-event airborne...

  10. No upward trend in normalised windstorm losses in Europe: 1970-2008

    NASA Astrophysics Data System (ADS)

    Barredo, J. I.

    2010-01-01

    On 18 January 2007, windstorm Kyrill battered Europe with hurricane-force winds, killing 47 people and causing US$ 10 billion in damage. Kyrill poses several questions: is Kyrill an isolated or exceptional case? Have there been events costing as much in the past? This paper attempts to put Kyrill into an historical context by examining large historical windstorm event losses in Europe for the period 1970-2008 across 29 European countries. It asks what economic losses these historical events would cause if they were to recur under 2008 societal conditions. Loss data were sourced from reinsurance firms and augmented with historical reports, peer-reviewed articles and other ancillary sources. Following the same conceptual approach outlined in previous studies, the data were then adjusted for changes in population, wealth, and inflation at the country level and for inter-country price differences using purchasing power parity. The analyses reveal no trend in the normalised windstorm losses and confirm that increasing disaster losses are driven by societal factors and increasing exposure.
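
    A simplified sketch of the normalisation concept described above (scaling a historical loss to reference-year population, wealth and price levels); the exact factors, data sources and purchasing-power-parity handling used by the author may differ.

    ```python
    def normalise_loss(nominal_loss, year_factors, ref_factors):
        """Scale a historical loss to reference-year (e.g. 2008) societal conditions.

        year_factors / ref_factors: dicts with 'population', 'wealth_per_capita'
        and 'price_level' for the event year and the reference year, respectively.
        All factor names here are illustrative placeholders.
        """
        return (nominal_loss
                * ref_factors["population"] / year_factors["population"]
                * ref_factors["wealth_per_capita"] / year_factors["wealth_per_capita"]
                * ref_factors["price_level"] / year_factors["price_level"])
    ```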

  11. Main drive selection for the Windstorm Simulation Center

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lacy, J.M.; Earl, J.S.

    1998-02-01

    Operated by the Partnership for Natural Disaster Reduction, the Windstorm Simulation Center (WSC) will be a structural test center dedicated to studying the performance of civil structural systems subjected to hurricanes, tornadoes, and other storm winds. Within the WSC, a bank of high-power fans, the main drive, will produce the high-velocity wind necessary to reproduce these storms. Several options are available for the main drive, each with advantages and liabilities. This report documents a study to identify and evaluate all available candidates, and to select the most promising system such that the best possible combination of real-world performance attributes is achieved at the best value. Four broad classes of candidate were identified: electric motors, turbofan aircraft engines, turboshaft aircraft engines, and turboshaft industrial engines. Candidate systems were evaluated on the basis of technical feasibility, availability, power, installed cost, and operating cost.

  12. Windstorm of the eighteenth century in the Czech Lands: course, extent, impacts

    NASA Astrophysics Data System (ADS)

    Brázdil, Rudolf; Szabó, Péter; Dobrovolný, Petr; Řezníčková, Ladislava; Kotyza, Oldřich; Suchánková, Silvie; Valášek, Hubert

    2017-07-01

    This paper addresses the course, extent, and impacts of a windstorm that occurred on 20-21 December 1740, in the Czech Lands. The analysis is based on documentary data included in chronicles, "books of memory", memoirs, damage reports, urbaria, and cadastral records, as well as secondary sources. The windstorm started with a thunderstorm in the afternoon of 20 December, continued during the night, and was followed by a flood. It also appeared in documentary data from Bavaria, Thuringia, Saxony, Silesia, Slovakia, and Hungary. The event may be related to a cyclone north-west of the Czech territory moving to the east with an intense western flow over central Europe. The storm did great material damage to houses, farm buildings, churches, and forests and is recorded in various documentary sources for 85 places in the Czech Lands. The windstorm had a significant influence on the development of local plantation forestry (discussed in greater detail). Judging by territorial extent and damage done, this windstorm, compared to other similar events, has been classified as "the windstorm of the eighteenth century" in the Czech Lands. This contribution demonstrates the potential of documentary evidence for the elucidation of heavy windstorms in the pre-instrumental period in Europe.

  13. The contribution of sting-jet windstorms to extreme wind risk in the North Atlantic

    NASA Astrophysics Data System (ADS)

    Hart, Neil C.; Gray, Suzanne L.; Clark, Peter A.

    2016-04-01

    Windstorms are a major winter weather risk for many countries in Europe. These storms are predominantly associated with explosively-developing extratropical cyclones that track across the region. A substantial body of literature exists on the synoptic-scale dynamics, predictability and climatology of such storms. More recently, interest in the mesoscale variability of the most damaging winds has led to a focus on the role of sting jets in enhancing windstorm severity. We present a present-era climatology of North Atlantic cyclones that had potential to produce sting jets. Considering only explosively-developing cyclones, those with sting-jet potential are more likely to have higher relative vorticity and associated low-level wind maxima. Furthermore, the strongest winds for sting-jet cyclones are more often in the cool sector, behind the cold front, when compared with other explosively-developing cyclones which commonly have strong warm-sector winds too. The tracks of sting-jet cyclones, and explosively-developing cyclones in general, show little offset from the climatological storm track. While rare over Europe, sting-jet cyclones are relatively frequent within the main storm track with up to one third of extratropical cyclones exhibiting sting-jet potential. Thus, the rarity and, until recently, lack of description of sting-jet windstorms is more due to the climatological storm track location away from highly-populated land masses, than due to an actual rarity of such storms in nature.

  14. Case study of a severe windstorm over Slovakia and Hungary on 25 June 2008

    NASA Astrophysics Data System (ADS)

    Simon, André; Kaňák, Ján; Sokol, Alois; Putsay, Mária; Uhrínová, Lucia; Csirmaz, Kálmán; Okon, Ľuboslav; Habrovský, Richard

    2011-06-01

    A system of thunderstorms approached Slovakia and Hungary in the late evening hours of 25 June 2008, causing extensive damage and peak wind gusts of up to 40 m/s. This study examines the macro- and mesosynoptic conditions for the windstorm using soundings, analyses, and forecasts of numerical models (ALADIN, ECMWF). The derecho-like character of the event is discussed. Meteosat Second Generation imagery and convective indices inferred from satellite and model data are used to assess the humidity distribution and the conditional instability of the thunderstorm environment. An intrusion of environmental dry air into the convective system and the intensification of downdrafts is considered to be one of the reasons for the damaging winds observed in some areas. This is supported by the radar imagery, which shows a sudden drop in radar reflectivity and the formation of line echo wave patterns and bow echoes. A numerical simulation with the non-hydrostatic MM5 model indicated the development of meso-γ scale vortices embedded in the convective system. The genesis and possible role of such vortices in creating rear-inflow jets and intensifying the low-level winds are investigated with the help of the vorticity equation and several other diagnostic parameters. In addition, the effect of various physical parameterisations on the forecast of the windstorm is evaluated.

  15. Extreme Windstorms and Related Impacts on Iberia

    NASA Astrophysics Data System (ADS)

    Liberato, Margarida L. R.; Ordóñez, Paulina; Pinto, Joaquim G.; Ramos, Alexandre M.; Karremann, Melanie K.; Trigo, Isabel F.

    2014-05-01

    Extreme windstorms are among the major natural catastrophes in the mid-latitudes and one of the most costly natural hazards in Europe, responsible for substantial economic damage and even fatalities. During recent winters, the Iberian Peninsula was hit by severe (wind)storms such as Klaus (January 2009), Xynthia (February 2010) and Gong (January 2013), which exhibited uncommon characteristics. They were all explosive extratropical cyclones formed over the mid-Atlantic that then travelled eastwards at lower latitudes than usual, along the edge of the dominant North Atlantic storm track. In this work we present a windstorm catalogue for the Iberian Peninsula in which the characteristics of the potentially most destructive windstorms for the 1979-2012 period are identified. For this purpose, the potential impact of high winds over the Iberian Peninsula is assessed using a daily damage index based on maximum wind speeds that exceed the local 98th percentile threshold. Then, the characteristics of the extratropical cyclones associated with these events are analyzed. Results indicate that these are fast-moving, intense cyclones, typically located near the north-western tip of the Iberian Peninsula. This work was partially supported by FEDER (Fundo Europeu de Desenvolvimento Regional) funds through COMPETE (Programa Operacional Factores de Competitividade) and by national funds through FCT (Fundação para a Ciência e a Tecnologia, Portugal) under project STORMEx FCOMP-01-0124-FEDER-019524 (PTDC/AAC-CLI/121339/2010). A. M. Ramos was also supported by an FCT postdoctoral grant (FCT/DFRH/SFRH/BPD/84328/2012).
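
    A minimal sketch of a cubic-exceedance daily damage index of the kind described above (wind speeds above the local 98th percentile); thresholds, array shapes and any weighting are illustrative assumptions rather than the authors' exact definition.

    ```python
    import numpy as np

    def daily_damage_index(vmax_daily, v98=None):
        """Cubic exceedance above the local 98th percentile of daily maximum wind.

        vmax_daily : array (days, lat, lon) of daily maximum 10 m wind speed
        v98        : optional precomputed (lat, lon) threshold; derived from
                     the series itself if not supplied
        Returns one index value per day, summed over grid points.
        """
        if v98 is None:
            v98 = np.percentile(vmax_daily, 98, axis=0)
        exceed = np.maximum(vmax_daily / v98 - 1.0, 0.0)
        return np.sum(exceed ** 3, axis=(1, 2))
    ```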

  16. A statistical model for Windstorm Variability over the British Isles based on Large-scale Atmospheric and Oceanic Mechanisms

    NASA Astrophysics Data System (ADS)

    Kirchner-Bossi, Nicolas; Befort, Daniel J.; Wild, Simon B.; Ulbrich, Uwe; Leckebusch, Gregor C.

    2016-04-01

    Time-clustered winter storms are responsible for the majority of wind-induced losses in Europe. Over recent years, different atmospheric and oceanic large-scale mechanisms such as the North Atlantic Oscillation (NAO) or the Meridional Overturning Circulation (MOC) have been shown to drive a significant portion of the windstorm variability over Europe. In this work we systematically investigate the influence of different large-scale natural variability modes: more than 20 indices related to mechanisms with proven or potential influence on windstorm frequency variability over Europe, mostly SST- or pressure-based, are derived from the ECMWF ERA-20C reanalysis for the last century (1902-2009) and compared to windstorm variability for the European winter (DJF). Windstorms are defined and tracked as in Leckebusch et al. (2008). The derived indices are then employed in a statistical procedure comprising a stepwise Multiple Linear Regression (MLR) and an Artificial Neural Network (ANN), aiming to hindcast the inter-annual (DJF) regional windstorm frequency variability in a case study for the British Isles. This case study reveals 13 indices with a statistically significant coupling to seasonal windstorm counts. The Scandinavian Pattern (SCA) shows the strongest correlation (0.61), followed by the NAO (0.48) and the Polar/Eurasia Pattern (0.46). The obtained indices (standard-normalised) are selected as predictors for a windstorm variability hindcast model applied to the British Isles. First, a stepwise linear regression is performed to identify which mechanisms best explain windstorm variability. Then, the indices retained by the stepwise regression are used to develop a multilayer perceptron-based ANN that hindcasts seasonal windstorm frequency and clustering. Eight indices (SCA, NAO, EA, PDO, W.NAtl.SST, AMO (unsmoothed), EA/WR and Trop.N.Atl SST) are retained by the stepwise regression. Among them, SCA shows the highest linear coefficient, followed by SST in the western Atlantic, the AMO and the NAO. The explanatory regression model (considering all time steps) yields a coefficient of determination (R^2) of 0.75. A predictive version of the linear model applying leave-one-out cross-validation (LOOCV) shows an R^2 of 0.56 and a relative RMSE of 4.67 counts/season. An ANN-based nonlinear hindcast model for seasonal windstorm frequency is developed with the aim of improving on the stepwise hindcast ability and thus better predicting a time-clustered season for the case study. A perceptron with a 7-node hidden layer is used, and the LOOCV procedure yields an R^2 of 0.71. In comparison to the stepwise MLR, the RMSE is reduced by 20%. This work shows that for the British Isles case study most of the interannual variability can be explained by certain large-scale mechanisms, also considering nonlinear effects (ANN). This makes it possible to discern a time-clustered season from a non-clustered one - a key issue for applications, e.g. in the (re)insurance industry.
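
    A hedged sketch of the leave-one-out hindcast setup described above, comparing a linear model with a small multilayer perceptron; the synthetic predictors, scaling and network settings are placeholders rather than the authors' configuration.

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.neural_network import MLPRegressor
    from sklearn.model_selection import LeaveOneOut, cross_val_predict
    from sklearn.metrics import r2_score

    # X: standardised large-scale indices (seasons x predictors), y: storm counts.
    rng = np.random.default_rng(1)
    X = rng.standard_normal((108, 8))          # e.g. 1902-2009 winters, 8 indices
    y = 10 + 3 * X[:, 0] + 2 * X[:, 1] + rng.normal(0, 2, 108)

    loo = LeaveOneOut()
    y_lin = cross_val_predict(LinearRegression(), X, y, cv=loo)
    y_ann = cross_val_predict(MLPRegressor(hidden_layer_sizes=(7,), max_iter=5000,
                                           random_state=0), X, y, cv=loo)
    print("linear LOOCV R^2:", r2_score(y, y_lin))
    print("ANN    LOOCV R^2:", r2_score(y, y_ann))
    ```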

  17. Serendipitous data following a severe windstorm in an old-growth pine stand

    Treesearch

    D.C. Bragg; J.D. Riddle

    2014-01-01

    Reliable dimensional data for old-growth pine-dominated forests in the Gulf Coastal Plain of Arkansas are hard to find, but sometimes unfortunate circumstances provide good opportunities to acquire this information. On July 11, 2013, a severe thunderstorm with high winds struck the Levi Wilcoxon Demonstration Forest (LWDF) near Hamburg, Arkansas. This storm uprooted or...

  18. Impacts of Climate Change On The Occurrence of Extreme Events: The Mice Project

    NASA Astrophysics Data System (ADS)

    Palutikof, J. P.; Mice Team

    It is widely accepted that climate change due to global warming will have substantial impacts on the natural environment, and on human activities. Furthermore, it is increasingly recognized that changes in the severity and frequency of extreme events, such as windstorm and flood, are likely to be more important than changes in the average climate. The EU-funded project MICE (Modelling the Impacts of Climate Extremes) commenced in January 2002. It seeks to identify the likely changes in the occurrence of extremes of rainfall, temperature and windstorm due to global warming, using information from climate models as a basis, and to study the impacts of these changes in selected European environments. The objectives are: a) to evaluate, by comparison with gridded and station observations, the ability of climate models to successfully reproduce the occurrence of extremes at the required spatial and temporal scales. b) to analyse model output with respect to future changes in the occurrence of extremes. Statistical analyses will determine changes in (i) the return periods of extremes, (ii) the joint probability of extremes (combinations of damaging events such as windstorm followed by heavy rain), (iii) the sequential behaviour of extremes (whether events are well-separated or clustered) and (iv) the spatial patterns of extreme event occurrence across Europe. The range of uncertainty in model predictions will be explored by analysing changes in model experiments with different spatial resolutions and forcing scenarios. c) to determine the impacts of the predicted changes in extremes occurrence on selected activity sectors: agriculture (Mediterranean drought), commercial forestry and natural forest ecosystems (windstorm and flood in northern Europe, fire in the Mediterranean), energy use (temperature extremes), tourism (heat stress and Mediterranean beach holidays, changes in the snow pack and winter sports) and civil protection/insurance (windstorm and flood). Impacts will be evaluated through a combination of techniques ranging from quantitative analyses through to expert judgement. Throughout the project, a continuing dialogue with stakeholders and end-users will be maintained.
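
    As a concrete illustration of the return-period analyses listed under objective (b), and not the project's actual code, a GEV fit to annual wind-speed maxima yields return levels as sketched below; the data here are synthetic.

    ```python
    import numpy as np
    from scipy.stats import genextreme

    # Synthetic annual maxima of daily peak gusts [m/s]; real data would come
    # from station or climate-model output.
    rng = np.random.default_rng(2)
    annual_max_gust = 25 + 5 * rng.gumbel(size=50)

    params = genextreme.fit(annual_max_gust)          # shape, location, scale
    for T in (10, 50, 100):                           # return periods in years
        level = genextreme.ppf(1 - 1 / T, *params)
        print(f"{T:4d}-year return level: {level:.1f} m/s")
    ```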

  19. Cyclones and extreme windstorm events over Europe under climate change: Global and regional climate model diagnostics

    NASA Astrophysics Data System (ADS)

    Leckebusch, G. C.; Ulbrich, U.

    2003-04-01

    More than changes in the mean state of the climate system, the development of extreme events may influence social, economic and legal aspects of our society. This linkage results from the impact of extreme climate events (natural hazards) on environmental systems, which in turn are directly linked to human activities. Prominent examples from the recent past are the record-breaking rainfall amounts of August 2002 in central Europe, which produced widespread flooding, and the windstorm Lothar of December 1999. Within the MICE (Modelling the Impact of Climate Extremes) project framework, an assessment of the impact of changes in extremes will be carried out. The investigation covers several different impact categories such as agriculture, energy use and property damage. The focus is on the diagnostics of GCM and RCM simulations under different climate change scenarios. In this study we concentrate on extreme windstorms and their relationship to cyclone activity in the global HADCM3 as well as in the regional HADRM3 model under two climate change scenarios (SRES A2a, B2a). In order to identify cyclones we use an objective algorithm from Murray and Simmonds, which has been widely tested under several different conditions. A slight increase in the occurrence of systems is identified over northern parts of central Europe for both scenarios. For more severe systems (core pressure < 990 hPa) we find an increase for western Europe. Strong wind events can be defined via different percentile values of the wind speed (e.g. above the 95th percentile). By this means, the relationship between strong wind events and cyclones is also investigated. For several regions (e.g. Germany, France, Spain) a shift towards deeper cyclones connected with an increasing number of strong wind events is found.

  20. Variability of floods, droughts and windstorms over the past 500 years in Central Europe based on documentary and instrumental data

    NASA Astrophysics Data System (ADS)

    Brazdil, Rudolf

    2016-04-01

    Hydrological and meteorological extremes (HMEs) in Central Europe during the past 500 years can be reconstructed from instrumental and documentary data. Documentary data about weather and related phenomena represent the basic source of information for historical climatology and hydrology, which deal with the reconstruction of past climate and HMEs, their perception, and their impacts on human society. The paper presents the basic division of documentary data into (i) direct descriptions of HMEs and their proxies on the one hand and (ii) individual and institutional data sources on the other. Several groups of documentary evidence, such as narrative written records (annals, chronicles, memoirs), visual daily weather records, official and personal correspondence, special prints, financial and economic records (with particular attention to taxation data), newspapers, pictorial documentation, chronograms, epigraphic data, early instrumental observations, and early scientific papers and communications, are demonstrated with respect to the extraction of information about HMEs, which usually concerns their occurrence, severity, seasonality, meteorological causes, perception and human impacts. The paper further presents an analysis of the 500-year variability of floods, droughts and windstorms based on series created by combining documentary and instrumental data. Results, advantages and drawbacks of such an approach are documented with examples from the Czech Lands. The analysis of floods concentrates on the River Vltava (Prague) and the River Elbe (Děčín), which show the highest frequency of floods in the 19th century (mainly of the winter synoptic type) and in the second half of the 16th century (summer synoptic type). The most disastrous floods (August 1501, March and August 1598, February 1655, June 1675, February 1784, March 1845, February 1862, September 1890, August 2002) and the European context of floods in the severe winter of 1783/84 are also reported. Drought fluctuations in the Czech Lands are represented by the chronology of drought frequency on the one hand and by reconstructed series of drought indices (SPI, SPEI, Z-Index and PDSI) on the other. Wind extremes are documented using the example of the Czech windstorm chronology derived from documentary data (including tornadoes), with the "windstorm of the 18th century" (20-21 December 1740) as an example. Finally, the scientific potential and perspectives of historical-climatological (and historical-hydrological) research on HMEs are presented.

  1. Serial Clustering of North Atlantic Cyclones and Wind Storms: A New Identification Base and Sensitivity to Intensity and Intra-Seasonal Variability

    NASA Astrophysics Data System (ADS)

    Leckebusch, G. C.; Kirchner-Bossi, N. O.; Befort, D. J.; Ulbrich, U.

    2015-12-01

    Time-clustered mid-latitude winter storms are responsible for a large portion of the overall windstorm-related damage in Europe. Their study is thus of high meteorological interest, and its outcomes are of crucial utility for the (re)insurance industry. In addition to existing cyclone-based studies, here we use an event identification approach based on near-surface wind speeds only, to investigate windstorm clustering and compare it to cyclone clustering. Specifically, cyclone and windstorm tracks are identified for the winters 1979-2013 (Oct-Mar) to perform two sensitivity analyses of event clustering in the North Atlantic using the ERA-Interim reanalysis. First, the link between clustering and cyclone intensity is analysed and compared to windstorms. Secondly, the sensitivity of clustering on intra-seasonal time scales is investigated for both cyclones and windstorms. The wind-based approach reveals additional regions of clustering over Western Europe, which could be related to extreme damage, showing the added value of investigating wind-field-derived tracks in addition to cyclone tracks. Previous studies indicate a higher degree of clustering for stronger cyclones. However, our results show that this assumption is not always met. Although a positive relationship is confirmed for the clustering centre located over Iceland, clustering off the coast of the Iberian Peninsula behaves in the opposite way. Even though this region shows the highest clustering, most of its signal is due to cyclones with intensities below the 70th percentile of the Laplacian of MSLP. Results on the sensitivity of clustering to the time within the winter season (Oct-Mar) show a temporal evolution of the clustering patterns for both windstorms and cyclones. Clustering of windstorms and of the strongest cyclones culminates around February, while clustering of all cyclones peaks in December to January.

  2. Decadal predictability of winter windstorm frequency in Eastern Europe

    NASA Astrophysics Data System (ADS)

    Höschel, Ines; Grieger, Jens; Ulbrich, Uwe

    2017-04-01

    Winter windstorms are one of the most impact-relevant extreme-weather events in Europe. This study focuses on windstorm frequency in Eastern Europe on multi-year time scales. Individual storms are identified using 6-hourly 10 m wind fields. The impact-oriented tracking algorithm is based on the exceedance of the local 98th percentile of wind speed and a minimum duration of 18 hours. Here, storm frequency is the number of 1000 km footprints of identified windstorms touching a location during the extended boreal winter from October to March. The temporal development of annual storm frequencies in Eastern Europe shows variations with a period of six to fifteen years. Higher-than-normal windstorm frequencies occurred at the end of the 1950s and at the beginning of the 1970s, for example, while lower-than-normal frequencies occurred around 1960 and in the 1940s. The correlation between band-pass-filtered storm frequency and North Atlantic sea surface temperature shows a significant pattern with positive correlations in the subtropical East Atlantic and significant negative correlations in the Gulf Stream region. The relationship between these multi-year variations and predictability on decadal time scales is discussed. The resulting skill for winter windstorms in the German decadal prediction system MiKlip, based on the earth system model MPI-ESM, will be presented.
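
    A minimal sketch of the impact-oriented event criterion described above (exceedance of the local 98th percentile for at least 18 hours), applied to a single grid cell's 6-hourly wind series; the contiguous-area and 1000 km footprint aspects are omitted and all names are illustrative.

    ```python
    import numpy as np

    def storm_periods(wind_6h, v98, min_hours=18, step_hours=6):
        """Return (start, end) index pairs of periods where the 6-hourly wind
        exceeds the local 98th percentile for at least `min_hours`."""
        above = np.asarray(wind_6h) > v98
        events, start = [], None
        for i, flag in enumerate(above):
            if flag and start is None:
                start = i                          # exceedance period begins
            elif not flag and start is not None:
                if (i - start) * step_hours >= min_hours:
                    events.append((start, i - 1))  # long enough to count
                start = None
        if start is not None and (len(above) - start) * step_hours >= min_hours:
            events.append((start, len(above) - 1))
        return events
    ```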

  3. The impact of clustering of extreme European windstorm events on (re)insurance market portfolios

    NASA Astrophysics Data System (ADS)

    Mitchell-Wallace, Kirsten; Alvarez-Diaz, Teresa

    2010-05-01

    Traditionally, the occurrence of windstorm loss events in Europe has been considered independent. However, a number of significant losses close in space and time indicates that this assumption may need to be revised. Under particular atmospheric conditions, multiple loss-causing cyclones can occur in succession, affecting similar geographic regions and, therefore, insurance markets. A notable example is Lothar and Martin in France in December 1999. Although the existence of cyclone families is well known to meteorologists, there has been limited research into the occurrence of serial windstorms. However, climate modelling research is now providing the ability to explore the physical drivers of clustering and to improve understanding of the hazard aspect of catastrophe modelling. While analytics tools, including catastrophe models, may incorporate assumptions regarding the influence of dependency through statistical means, the most recent research outputs provide a new strand of information with the potential to re-assess the probabilistic loss potential in light of clustering and to provide an additional view on probable maximum losses for windstorm-exposed portfolios across regions such as Northwest Europe. There is, however, a need to test these new techniques within operational (re)insurance applications. This paper provides an overview of the most current clustering research, including the 2009 paper by Vitolo et al., in relation to reinsurance risk modelling, and assesses the potential impact of such additional information on the overall risk assessment process. We examine the consequences of the serial clustering of extra-tropical cyclones demonstrated by Vitolo et al. (2009) from the perspective of a large European reinsurer, examining potential implications for pricing, accumulation, and capital adequacy.

  4. A history of wind erosion prediction models in the United States Department of Agriculture Prior to the Wind Erosion Prediction System

    USDA-ARS?s Scientific Manuscript database

    The Great Plains experienced an influx of settlers in the late 1850s to 1900. Periodic drought was hard on both settlers and the soil and caused severe wind erosion. The period known as the Dirty Thirties, 1931 to 1939, produced many severe windstorms, and the resulting dusty sky over Washington, D....

  5. Synoptic Scale North American Weather Tracks and the Formation of North Atlantic Windstorms

    NASA Astrophysics Data System (ADS)

    Baum, A. J.; Godek, M. L.

    2014-12-01

    Each winter, dozens of fatalities occur when intense North Atlantic windstorms impact Western Europe. Forecasting the tracks of these storms in the short term is often problematic, but long-term forecasts pose an even greater challenge. Improved prediction requires the ability to identify these low-pressure areas at formation and to understand commonalities, such as where they develop, that distinguish these storms from other systems crossing the Atlantic. There is some evidence that the majority of intense windstorms that reach Europe have origins far to the west, as low-pressure systems that develop over the North American continent. This project aims to identify the specific cyclogenesis regions in North America that produce a significantly greater number of dangerous storms. NOAA Ocean Prediction Center surface pressure reanalysis maps are used to examine the tracks of storms. Strong windstorms are defined as those with a central pressure of less than 965 hPa at any point in their life cycle. Tracks are recorded using a coding system based on source region, storm track and dissipation region. The codes are analyzed to determine which region is most significantly associated with strong Atlantic windstorm generation. The resulting set of codes also serves as a climatology of North Atlantic extratropical cyclones. Results indicate that a number of windstorms favor cyclogenesis regions off the east coast of the United States. A large number of strong storms that encounter east-coast cyclogenesis zones originate in the central mountain region, around Colorado. These storms follow a path that exits North America around New England and subsequently travel along the Canadian coast. Some of these are then primed to become "bombs" over the open Atlantic Ocean.

  6. Dynamical structure and risk assessment of 20th Century Windstorms

    NASA Astrophysics Data System (ADS)

    Varino, Filipa; Arbogast, Philippe; Joly, Bruno; Rivière, Gwendal; Fandeur, Marie-Laure; Bovy, Henry; Granier, Jean-Baptiste; Mitchell-Wallace, Kirsten

    2017-04-01

    Windstorms play an important role in weather variability over western Europe. Strong winds associated with fronts and sting jets can lead to severe social and economic damage. However, in addition to wind intensity, the displacement speed of the storm, its area and its position are also important factors in determining loss. In this study we focus on windstorms associated with the highest damages of the 20th century, and we analyse whether the dynamical structure of a storm is related to its impact. First, we apply an extra-tropical storm tracking algorithm to the ECMWF ERA-20C reanalysis, which covers the whole twentieth century, for the whole Northern Hemisphere. Secondly, using the same data, we compute the 3-hourly Loss and Meteorological index for 18 different European countries, as in Pinto et al. (2012), at 25 km grid resolution. Thirdly, we develop a High-Loss Tracking Method that matches information from the Loss Index results with the tracked trajectories to systematically associate damages over a particular country with a particular storm. Such a combination provides information on the typical life cycle of storms that cause severe damage over a particular country. Finally, only storms hitting France are considered. More than 1500 storms are detected over the whole period and their evolution is analyzed by performing various composites depending on their position relative to the jet stream and their region of impact.

  7. Ensemble Sensitivity Analysis of a Severe Downslope Windstorm in Complex Terrain: Implications for Forecast Predictability Scales and Targeted Observing Networks

    DTIC Science & Technology

    2013-09-01

    wave breaking (NWB) and eight wave breaking (WB) storms are shown...studies, and it follows that the wind storm characteristics are likely more three dimensional as well. For the purposes of this study, a severe DSWS is...regularly using the HWAS network at USAFA since its installation in 2004. A careful examination of these events reveals downslope storms that are

  8. Windstorm Impact Reduction Implementation Plan

    DTIC Science & Technology

    2007-01-01

    wind events, including hurricanes, tornadoes and straight-line winds from thunderstorms. This information is repeated in brief during severe weather...event documentation and damage analyses. Better understanding of atmospheric dynamics of straight-line winds Wind observing systems and...Developed techniques for improved extreme wind speed maps Investigation of straight-line winds Wind speed and direction analysis for input to

  9. Dynamic landscape management

    Treesearch

    Valerie Rapp

    2002-01-01

    Pacific Northwest forests and all their species evolved with fires, floods, windstorms, landslides, and other disturbances. The dynamics of disturbance were basic to how forests changed and renewed. Disturbance regimes, as scientists call the long-term patterns of these events—what kind of event, how often, how large, and how severe—created the landscape patterns seen...

  10. Natural development and regeneration of a Central European montane spruce forest

    Treesearch

    Miroslav Svoboda; Shawn Fraver; Pavel Janda; Radek Bače; Jitka Zenáhlíková

    2010-01-01

    Montane Norway spruce forests of Central Europe have a very long tradition of use for timber production; however, recently there has been increasing concern for their role in maintaining biological diversity. This concern, coupled with recent severe windstorms that led to wide-spread bark beetle outbreaks, has brought the management of montane spruce forests to the...

  11. Dynamic landscape management.

    Treesearch

    Valerie Rapp

    2003-01-01

    Pacific Northwest forests and all their species evolved with fires, floods, windstorms, landslides, and other disturbances. The dynamics of disturbance were basic to how forests changed and renewed. Disturbance regimes, as scientists call the long-term patterns of these events—what kind of event, how often, how large, and how severe—created the landscape patterns seen...

  12. Impact of windstorm on a community of centipedes (Chilopoda) in a beech forest in Western Poland.

    PubMed

    Leśniewska, Małgorzata; Skwierczyński, Filip

    2018-01-01

    The study was carried out in the years 2016-2017, five years after a windstorm which destroyed one third of a protected beech forest area in the west of Poland. The community of centipedes in the area affected by the windstorm was depleted in terms of species richness, diversity, and population density. The dominance structures were shortened and the species composition was rebuilt. Among the sites affected by the windstorm, the areas that proved to be the richest in terms of species richness and diversity were the one where windfallen trees were left and the one where beech trees had been planted by humans. In total, the quantitative and qualitative samples, collected four times throughout a year, yielded 608 specimens from 11 species of two centipede orders, Lithobiomorpha and Geophilomorpha. Lithobius curtipes and L. forficatus were found in all of the investigated areas. L. pelidnus and L. piceus were captured at control sites exclusively. Only one species, L. erythrocephalus, was found solely at the damaged site. The most numerous and most frequently found species in the community were L. curtipes, L. mutabilis, and Strigamia acuminata, respectively. Although windstorms are natural phenomena, their consequences may lead to significant changes in the community of the investigated soil animals. The importance of coarse woody debris, which contributes significantly to the improvement and maintenance of the species richness and diversity of Chilopoda, has once again been confirmed.

  13. The probability of occurrence of high-loss windstorms

    NASA Astrophysics Data System (ADS)

    Massey, Neil

    2016-04-01

    Windstorms are one of the largest meteorological risks to life and property in Europe. High-loss windstorms, in terms of insured losses, are a result not only of the windspeed of the storm but also of its position and track. The two highest-loss storms on record, Daria (1990) and Lothar (1999), caused so much damage because they tracked across highly populated areas of Europe. Although the frequency and intensity of high-loss windstorms in the observed record is known, there are not enough samples, due to the short observed record, to truly know the distribution of the frequency and intensity of windstorms over Europe and, by extension, the distribution of losses which could occur if the atmosphere had been in a different state due to its internal variability. Risk and loss modelling exercises carried out by and for the reinsurance industry have typically stochastically perturbed the historical record of high-loss windstorms to produce distributions of potential windstorms with greater sample sizes than the observations. This poster presents a new method of generating many samples of potential windstorms and analyses the frequency of occurrence, intensity and potential losses of these windstorms. The large-ensemble regional climate modelling project weather@home is used to generate many regional climate model representations (800 per year) of the weather over Europe between 1985 and 2010. The regional climate model is driven at the boundaries by a free-running global climate model, so each ensemble member represents a potential state of the atmosphere rather than an observed state. The winter storm season of October to March is analysed by applying an objective cyclone identification and tracking algorithm to each ensemble member. From the resulting tracks, the windspeed within a 1000 km radius of the cyclone centre is extracted and the maximum windspeed over a 72-hour period is derived as the storm windspeed footprint. This footprint is fed into a population-based loss model to estimate the losses for the storm. Additionally, the same analysis is performed on data from the same regional climate model driven at the boundaries by ERA-Interim. This allows the tracks and losses of the storms in the observed record to be recovered using the same tracking method and loss model. A storm-track matching function is applied to the storm tracks in the large ensemble so that analogues of the observed storms can be recovered. The frequency of occurrence of the high-loss storms in the large ensemble can then be determined and used as a proxy for the frequency of occurrence in the observations.
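
    A hedged sketch of the storm wind-speed footprint construction described above (maximum wind while the cyclone centre is within a 1000 km radius, over the storm's 72-hour window); grid handling is simplified and the population-based loss model is omitted.

    ```python
    import numpy as np

    EARTH_RADIUS_KM = 6371.0

    def haversine_km(lat1, lon1, lat2, lon2):
        """Great-circle distance between points given in degrees."""
        p1, p2 = np.radians(lat1), np.radians(lat2)
        dlat, dlon = p2 - p1, np.radians(lon2 - lon1)
        a = np.sin(dlat / 2) ** 2 + np.cos(p1) * np.cos(p2) * np.sin(dlon / 2) ** 2
        return 2 * EARTH_RADIUS_KM * np.arcsin(np.sqrt(a))

    def storm_footprint(wind, grid_lat, grid_lon, track, radius_km=1000.0):
        """Maximum wind speed at each grid point while the cyclone centre is
        within `radius_km`, over the time window covered by `track`.

        wind     : array (time, lat, lon) of 10 m wind speed
        grid_lat : array (lat,); grid_lon : array (lon,)
        track    : list of (lat, lon) centre positions, one per time step
        """
        lat2d, lon2d = np.meshgrid(grid_lat, grid_lon, indexing="ij")
        footprint = np.zeros(wind.shape[1:])
        for t, (clat, clon) in enumerate(track):
            mask = haversine_km(clat, clon, lat2d, lon2d) <= radius_km
            footprint = np.where(mask, np.maximum(footprint, wind[t]), footprint)
        return footprint
    ```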

  14. Climate changes and technological disasters in the Russian Federation

    NASA Astrophysics Data System (ADS)

    Petrova, E. G.

    2009-04-01

    Global warming and climate change are responsible for many ecological, economic and other significant influences on the natural environment and human society. An increase in the number and severity of natural and technological disasters (TD) around the world is among such influences. Great changes in the geographical distribution of disasters are also expected. The study examines this problem using the example of the Russian Federation. Using a database of TD and na-techs (natural-technological disasters) that occurred in the Russian Federation in 1992-2008, the most important types of disasters caused by various natural hazards were identified and classified for Russian federal regions. In the concept of this study, na-techs are considered TD produced by natural factors. 88 percent of all na-techs occurring in the Russian Federation during the observation period were caused by natural processes related to various meteorological and hydrological phenomena. The majority of them were produced by windstorms and hurricanes (37%), snowfalls and snowstorms (27%), rainfalls (16%), and hard frost and icy road conditions (12%). 11 types of na-techs caused by meteorological and hydrological hazards were found. These types are: (1) accidents at power and heat supply systems caused by windstorms, cyclones and hurricanes, snowfalls and sleet, hard frost, rainfalls, hailstones, icing, avalanches, or thunderstorms (more than 50% of all na-techs registered in the database); (2) accidents at water supply systems caused by hard frost, rainfalls, or subsidence of rock (3%); (3) sudden collapses of constructions caused by windstorms, snowfalls, rainfalls, hard frost, subsidence of rock, or floods (12%); (4) automobile accidents caused by snowfalls and snowstorms, icy road conditions, rainfalls, fogs, mist, or avalanches (10%); (5) water transport accidents caused by storms, cyclones, typhoons, or fogs (9%); (6) air crashes caused by windstorms, snowfalls, icing, or fogs; (7) railway accidents caused by snowfalls and snowstorms, rainfalls, landslides, or avalanches; (8) fires and explosions caused by lightning or heat; (9) pipeline ruptures caused by windstorms, subsidence of rock, or landslides; (10) agricultural accidents caused by frost, snowfalls, rainfalls, or storms; (11) accidents with toxic emissions caused by floods and landslides. A map of their distribution within the Russian Federation was created. Climate changes expected by the end of the 21st century will have important consequences for the increasing frequency and changing spatial distribution of na-techs in the Russian Federation. The occurrence of na-techs caused by hydro- and meteorological hazards, as well as by other natural hazards related to climate change, will become more frequent towards the end of this century. The area subjected to technological risk will be substantially enlarged.

  15. The role of windstorm exposure and yellow cedar decline on landslide susceptibility in southeast Alaskan temperate rainforests

    Treesearch

    Brian Buma; Adelaide C. Johnson

    2015-01-01

    Interactions between ecological disturbances have the potential to alter other disturbances and their associated regimes, such as the likelihood, severity, and extent of events. The influence of exposure to wind and yellow cedar decline on the landslide regime of Alaskan temperate rainforests was explored using presence-only modeling techniques. The wind regime was...

  16. Southern pine engraver (Ips) Beetles in Your backyard

    Treesearch

    Kamal Gandhi; Daniel R. Miller

    2009-01-01

    Imagine that you decide to cut down a few pine trees in your yard, possibly because they are leaning a little too close to your house and you are concerned about damage to your house during a severe wind-storm. You cut the trees down in early spring prior to the start of hurricane season. Because of cost and other priorities, you leave...

  17. Progress report on the rate of deterioration of beetle-killed Engelmann spruce in Colorado

    Treesearch

    Frank G. Hawksworth; Thomas E. Hinds

    1959-01-01

    A severe windstorm in 1939 blew down extensive patches of Engelmann spruce (Picea engelmannii Parry) in western Colorado. A major epidemic of the Engelmann spruce beetle (Dendroctonus engelmanni Hopk.) developed from the windthrown trees and by 1952, when the epidemic was controlled by a combination of chemical and natural-control factors, an estimated four billion...

  18. Phenological Unmixing of Sequential Wildfire and Windstorm Effects in the Southern Appalachians

    NASA Astrophysics Data System (ADS)

    Norman, S. P.; Hargrove, W. W.; Christie, W. M.

    2017-12-01

    High frequency observations of land surface phenology suggest that forest disturbances are remarkably common. Frequent high resolution imagery provides increasingly rich insights into local impacts. In this paper, we examine the impacts from two back-to-back disturbances in Great Smoky Mountains National Park, located in the eastern deciduous hardwood forests of the Southeastern USA. In November 2016, the drought- and windstorm-associated Chimney Tops 2 fire created large high-severity patches on fire-adapted shrub and pine lands on wind-exposed slopes during the dormant season. Five months later, a high wind event from the same southern direction occurred on May 4th during mid-greenup and caused ephemeral leaf stripping and seasonally persistent damage over this same area. High resolution Sentinel 2 satellite imagery captures change caused by both events, but effects must be interpreted in light of normal phenological transitions. In this paper, we map relative severity from both disturbances, then compare and contrast how these two disturbances were manifest topographically and vegetationally. This case study provides rare insight into the importance and characteristics of topographic exposure as it relates to different disturbance types, and how that may contribute to the resilience of ecological communities within this mountainous landscape.

  19. Reliability of windstorm predictions in the ECMWF ensemble prediction system

    NASA Astrophysics Data System (ADS)

    Becker, Nico; Ulbrich, Uwe

    2016-04-01

    Windstorms caused by extratropical cyclones are one of the most dangerous natural hazards in the European region. Therefore, reliable predictions of such storm events are needed. Case studies have shown that ensemble prediction systems (EPS) are able to provide useful information about windstorms between two and five days prior to the event. In this work, ensemble predictions with the European Centre for Medium-Range Weather Forecasts (ECMWF) EPS are evaluated over a four-year period. Within the 50 ensemble members, which are initialized every 12 hours and run for 10 days, windstorms are identified and tracked in time and space. By using a clustering approach, different predictions of the same storm are identified in the different ensemble members and compared to reanalysis data. The occurrence probability of the predicted storms is estimated by fitting a bivariate normal distribution to the storm track positions. Our results show, for example, that predicted storm clusters with occurrence probabilities of more than 50% have a matching observed storm in 80% of all cases at a lead time of two days. The predicted occurrence probabilities are reliable up to 3 days lead time. At longer lead times the occurrence probabilities are overestimated by the EPS.
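
    The probability step described above can be sketched under stated assumptions: synthetic ensemble storm-centre positions stand in for the ECMWF track data, a bivariate normal is fitted to them, and the occurrence probability over a region of interest is estimated by Monte Carlo integration of the fitted density.

```python
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)

# Hypothetical storm-centre positions (lon, lat) from ensemble members
# that predicted the same storm; synthetic values for illustration only.
positions = rng.normal(loc=[5.0, 50.0], scale=[2.0, 1.0], size=(40, 2))

# Fit a bivariate normal distribution to the member positions.
mean = positions.mean(axis=0)
cov = np.cov(positions, rowvar=False)
dist = multivariate_normal(mean=mean, cov=cov)

# Occurrence probability over a lon/lat box, estimated by Monte Carlo
# integration of the fitted density (a simple stand-in for the paper's method).
lon_min, lon_max, lat_min, lat_max = 2.0, 8.0, 48.0, 52.0
samples = dist.rvs(size=100_000, random_state=1)
inside = ((samples[:, 0] >= lon_min) & (samples[:, 0] <= lon_max) &
          (samples[:, 1] >= lat_min) & (samples[:, 1] <= lat_max))
print(f"Estimated occurrence probability in box: {inside.mean():.2f}")
```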

  20. Catastrophic windstorm and fuel-reduction treatments alter ground beetle (Coleoptera: Carabidae) assemblages in a North American sub-boreal forest

    Treesearch

    Kamal J.K. Gandhi; Daniel W. Gilmore; Steven A. Katovich; William J. Mattson; John C. Zasada; Steven J. Seybold

    2008-01-01

    We studied the short-term effects of a catastrophic windstorm and subsequent salvage-logging and prescribed-burning fuel-reduction treatments on ground beetle (Coleoptera: Carabidae) assemblages in a sub-boreal forest in northeastern Minnesota, USA. During 2000-2003, 29,873 ground beetles represented by 71 species were caught in unbaited and baited pitfall traps in...

  1. Coupled effects of wind-storms and drought on tree mortality across 115 forest stands from the Western Alps and the Jura mountains.

    PubMed

    Csilléry, Katalin; Kunstler, Georges; Courbaud, Benoît; Allard, Denis; Lassègues, Pierre; Haslinger, Klaus; Gardiner, Barry

    2017-12-01

    Damage due to wind-storms and droughts is increasing in many temperate forests, yet little is known about the long-term roles of these key climatic factors in forest dynamics and in the carbon budget. The objective of this study was to estimate individual and coupled effects of droughts and wind-storms on adult tree mortality across a 31-year period in 115 managed, mixed coniferous forest stands from the Western Alps and the Jura mountains. For each stand, yearly mortality was inferred from management records, yearly drought from interpolated fields of monthly temperature, precipitation and soil water holding capacity, and wind-storms from interpolated fields of daily maximum wind speed. We performed a thorough model selection based on a leave-one-out cross-validation of the time series. We compared different critical wind speeds (CWSs) for damage, wind-storm, and stand variables and statistical models. We found that a model including stand characteristics, drought, and storm strength using a CWS of 25 m s-1 performed the best across most stands. Using this best model, we found that drought increased damage risk only in the most southerly forests, and its effect is generally maintained for up to 2 years. Storm strength increased damage risk in all forests in a relatively uniform way. In some stands, we found a positive interaction between drought and storm strength, most likely because drought weakens trees and they become more prone to stem breakage under wind-loading. In other stands, we found a negative interaction between drought and storm strength, where excessive rain likely leads to soil water saturation, making trees more susceptible to overturning in a wind-storm. Our results stress that temporal data are essential to make valid inferences about the ecological impacts of disturbance events, and that making inferences about disturbance agents separately can be of limited validity. Under projected future climatic conditions, the direction and strength of these ecological interactions could also change. © 2017 John Wiley & Sons Ltd.
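
    A hedged sketch of the kind of model selection described above, with entirely synthetic data and a simplified model: a logistic regression of yearly damage on drought, storm exceedance above a candidate critical wind speed, and their interaction, scored by leave-one-out cross-validation. The variable names, data and thresholds are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(42)
n_years = 31

# Hypothetical yearly predictors for one stand: drought index and daily
# maximum wind speed (m/s); synthetic values for illustration only.
drought = rng.normal(0.0, 1.0, n_years)
wind_max = rng.gamma(shape=6.0, scale=4.0, size=n_years)
p_damage = 1.0 / (1.0 + np.exp(-(0.8 * drought + 0.1 * (wind_max - 25.0))))
damage = (rng.random(n_years) < p_damage).astype(int)

def score_cws(cws):
    """LOO cross-validated accuracy of a damage model for one candidate CWS."""
    storm = np.maximum(wind_max - cws, 0.0)                 # exceedance above the CWS
    X = np.column_stack([drought, storm, drought * storm])  # includes interaction term
    model = LogisticRegression(max_iter=1000)
    return cross_val_score(model, X, damage, cv=LeaveOneOut()).mean()

for cws in (20.0, 25.0, 30.0):
    print(f"CWS = {cws:4.1f} m/s  LOO accuracy = {score_cws(cws):.2f}")
```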

  2. Modelling economic losses of historic and present-day high-impact winter storms in Switzerland

    NASA Astrophysics Data System (ADS)

    Welker, Christoph; Martius, Olivia; Stucki, Peter; Bresch, David; Dierer, Silke; Brönnimann, Stefan

    2015-04-01

    Windstorms can cause significant financial damage and rank among the most dangerous natural hazards in Switzerland. Risk associated with windstorms involves the combination of hazardous weather conditions, such as high wind gust speeds, and socio-economic factors, such as the distribution of assets as well as their susceptibility to damage. A sophisticated risk assessment is important in a wide range of areas and has benefits for e.g. the insurance industry. However, a sophisticated risk assessment needs a large sample of storm events for which high-resolution, quantitative meteorological and/or loss data are available. The latter is typically the limiting factor. For present-day windstorms in Switzerland, the data basis is generally sufficient to describe the meteorological development and wind forces as well as the associated impacts. In contrast, historic windstorms are usually described by graphical depictions of the event and/or by weather and loss reports. The information on historic weather events is overall sparse, and the available historic weather and loss reports mostly do not provide quantitative information. It has primarily been the field of activity of environmental historians to study historic weather extremes and their impacts. Furthermore, the scarce availability of atmospheric datasets reaching back sufficiently far in time has so far limited the analysis of historic weather events. The Twentieth Century Reanalysis (20CR) ensemble dataset, a global atmospheric reanalysis currently spanning 1871 to 2012, potentially offers a very valuable resource for the analysis of historic weather events. However, the 2°×2° latitude-longitude grid of the 20CR is too coarse to realistically represent the complex orography of Switzerland, which has considerable ramifications for the representation of smaller-scale features of the surface wind field influenced by the local orography. Using the 20CR as a starting point, this study illustrates a method to simulate the wind field and related economic impact of both historic and present-day high-impact winter storms in Switzerland since the end of the 19th century. Our technique involves the dynamical downscaling of the 20CR to 3 km horizontal resolution using the numerical Weather Research and Forecasting model and the subsequent loss simulation using an open-source impact model. This impact model estimates, for modern economic and social conditions, storm-related economic losses at municipality level, and thus allows a numerical simulation of the impact from both historic and present-day severe winter storms in Switzerland on a relatively fine spatial scale. In this study, we apply the modelling chain to a sample of almost 90 high-impact winter storms in Switzerland since 1871, and are thus able to characterise the typical wind and loss patterns of hazardous windstorms in Switzerland. To evaluate our modelling chain, we compare simulated storm losses with insurance loss data for the present-day windstorms "Lothar" and "Joachim" in December 1999 and December 2011, respectively. Our study further includes a range of sensitivity experiments and a discussion of the main sources of uncertainty.

  3. Progress and challenges with Warn-on-Forecast

    NASA Astrophysics Data System (ADS)

    Stensrud, David J.; Wicker, Louis J.; Xue, Ming; Dawson, Daniel T.; Yussouf, Nusrat; Wheatley, Dustan M.; Thompson, Therese E.; Snook, Nathan A.; Smith, Travis M.; Schenkman, Alexander D.; Potvin, Corey K.; Mansell, Edward R.; Lei, Ting; Kuhlman, Kristin M.; Jung, Youngsun; Jones, Thomas A.; Gao, Jidong; Coniglio, Michael C.; Brooks, Harold E.; Brewster, Keith A.

    2013-04-01

    The current status and challenges associated with two aspects of Warn-on-Forecast, a National Oceanic and Atmospheric Administration research project exploring the use of a convective-scale ensemble analysis and forecast system to support hazardous weather warning operations, are outlined. These two project aspects are the production of a rapidly updating assimilation system to incorporate data from multiple radars into a single analysis, and the ability of short-range ensemble forecasts of hazardous convective weather events to provide guidance that could be used to extend warning lead times for tornadoes, hailstorms, damaging windstorms and flash floods. Results indicate that a three-dimensional variational assimilation system, which blends observations from multiple radars into a single analysis, shows utility when evaluated by forecasters in the Hazardous Weather Testbed and may help increase confidence in a warning decision. The ability of short-range convective-scale ensemble forecasts to provide guidance that could be used in warning operations is explored for five events: two tornadic supercell thunderstorms, a macroburst, a damaging windstorm and a flash flood. Results show that the ensemble forecasts of the three individual severe thunderstorm events are very good, while the forecasts from the damaging windstorm and flash flood events, associated with mesoscale convective systems, are mixed. Important interactions between mesoscale and convective-scale features occur for the mesoscale convective system events that strongly influence the quality of the convective-scale forecasts. The development of a successful Warn-on-Forecast system will take many years and require the collaborative efforts of researchers and operational forecasters to succeed.

  4. In Brief: Earthquake, windstorm bills approved; Atmospheric map of nitrogen dioxide

    NASA Astrophysics Data System (ADS)

    Showstack, Randy

    2004-10-01

    The U.S. House of Representatives on 8 October unanimously approved legislation (H.R. 2608) to mitigate damage from earthquakes and windstorms. President Bush is expected to sign into law this bill, which was negotiated between the House and Senate. The European Space Agency's (ESA) Envisat satellite for environmental monitoring has produced a high-resolution global atmospheric map of nitrogen dioxide, the agency announced on 11 October.

  5. Innovations in fuels management: Demonstrating success in treating a serious threat of wildfire in Northern Minnesota

    Treesearch

    Dennis Neitzke

    2007-01-01

    This case study illustrates the positive effects of strategic fuels treatments in continuous heavy fuels. In 1999, a severe windstorm blew down close to 1,000 square miles of forest land in northern Minnesota and Canada. As much as 400,000 acres of the blowdown occurred in the Boundary Waters Canoe Area Wilderness. Fire experts were invited to assess the hazardous...

  6. The Morphology of Cyclonic Windstorms

    NASA Astrophysics Data System (ADS)

    Hewson, Tim

    2015-04-01

    The aim of this study is to help facilitate the correct interpretation and use of model analyses and predictions of windstorms in the extra-tropics, and to show that 'storm detection' does not just depend on the efficacy of the identification/tracking algorithm. Under the auspices of the IMILAST (Intercomparison of MId-LAtitude STorm diagnostics) project, 29 damaging European cyclonic windstorms have been studied in detail, using observational evidence as the main tool. Accordingly, a conceptual model of windstorm evolution has been constructed. This usefully has its roots in the evolution one sees on standard synoptic charts, and highlights that three types of damage footprint can be associated with it. Building on previous work, these are referred to as the warm jet, the sting jet and the cold jet footprints. The jet phenomena themselves each relate to the proximity of fronts on the synoptic charts, and accordingly occur in airmasses with different stability characteristics. These characteristics seem to play a large role in determining the magnitude of surface gusts, and how those gusts vary between coastal and inland sites. These aspects will be discussed with examples, showing that one cannot simply characterise or rank cyclones using wind strength on a lower tropospheric level such as 850 hPa. A key finding that sets the sting jet apart, and that makes it a particularly dangerous phenomenon, is that gust magnitude is relatively unaffected by passage inland; this seems to relate to the atmosphere in its environment being destabilised from above. For sting jets wind strength may be greatest below 850 hPa. Unfortunately, neither current generation global re-analyses nor global climate models seem able to simulate sting jets. This is for various reasons, though their low resolution is key. This limitation has been recognised previously, and the standard way to address it has been to use a re-calibration technique. The potential pitfalls of this approach will be highlighted using the aforementioned windstorm set as illustration. Based again on case studies, it will be shown that spatial resolution in a numerical model needs to be of order 10-20 km to capture most major windstorms. However, even then some of the smaller systems, which can be equally damaging, will be missed.

  7. Influence of prolonged Anomalies in North Atlantic Sea Surface Temperature on Winter Windstorms

    NASA Astrophysics Data System (ADS)

    Höschel, Ines; Schuster, Mareike; Grieger, Jens; Ulbrich, Uwe

    2016-04-01

    The focus of this presentation is on decadal scale variations in the frequency and intensity of mid-latitude winter windstorms. Projections for the end of the next century are often beyond the time horizon of business; thus there is increasing interest in decadal prediction, especially for infrastructural planning and in the insurance industry. One source of decadal predictability is the Atlantic multidecadal variability (AMV), a change in the sea surface temperature of the North Atlantic strongly linked to the meridional overturning circulation. Correlation patterns between annual AMV indices and the annual mean of geopotential height at 500 hPa in reanalysis data show an anti-correlation in the North Atlantic. That is, during AMV warm phases the North Atlantic Oscillation (NAO) is more negative. Consequently, the AMV should influence the characteristics of winter windstorms at multi-year scales. For the presented investigations, a 10-member ensemble of 38-year-long idealized simulations with the atmosphere model ECHAM6, with lower boundary conditions representing warm and cool phases of the AMV, is used. In the idealized simulations, the anti-correlation between AMV and NAO is well represented. For the identification of winter windstorms, an objective wind tracking algorithm based on the exceedance of the local 98th percentile of 10 m wind speed is applied. Storms under AMV-warm and AMV-cool conditions will be compared in terms of storm track density and the probability distribution of storm characteristics.
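
    The identification step can be illustrated with a minimal sketch on a synthetic daily 10 m wind-speed field (not the ECHAM6 output): grid points exceeding their local 98th percentile are flagged and a simple cubed-exceedance severity measure is accumulated, in the spirit of percentile-based storm indices.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical daily maximum 10 m wind speed, shape (time, lat, lon);
# synthetic Weibull-distributed values for illustration only.
wind = rng.weibull(2.0, size=(3650, 30, 40)) * 8.0

# Local 98th percentile climatology at every grid point.
p98 = np.percentile(wind, 98, axis=0)

# Flag exceedances and accumulate a severity contribution per day
# (cubed relative exceedance summed over the flagged grid points).
exceed = np.maximum(wind / p98 - 1.0, 0.0)
daily_severity = (exceed ** 3).sum(axis=(1, 2))

storm_days = np.flatnonzero(daily_severity > np.percentile(daily_severity, 99))
print(f"{storm_days.size} candidate windstorm days identified")
```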

  8. Derecho Hazards in the United States.

    NASA Astrophysics Data System (ADS)

    Ashley, Walker S.; Mote, Thomas L.

    2005-11-01

    Convectively generated windstorms occur over broad temporal and spatial scales; however, the more widespread and longer lived of these windstorms have been given the name "derecho." Utilizing an integrated derecho database, including 377 events from 1986 to 2003, this investigation reveals the amount of insured property losses, fatalities, and injuries associated with these windstorms in the United States. Individual derechos have been responsible for up to 8 fatalities, 204 injuries, forest blow-downs affecting over 3,000 km² of timber, and estimated insured losses of nearly $500 million. Findings illustrate that derecho fatalities occur more frequently in vehicles or while boating, while injuries are more likely to happen in vehicles or mobile homes. Both fatalities and injuries are most common outside the region with the highest derecho frequency. An underlying synthesis of both physical and social vulnerabilities is suggested as the cause of the unexpected casualty distribution. In addition, casualty statistics and damage estimates from hurricanes and tornadoes are contrasted with those from derechos to emphasize that derechos can be as hazardous as many tornadoes and hurricanes.


  9. Spring phytoplankton community response to an episodic windstorm event in oligotrophic waters offshore from the Ulleungdo and Dokdo islands, Korea

    NASA Astrophysics Data System (ADS)

    Baek, Seung Ho; Lee, Minji; Kim, Yun-Bae

    2018-02-01

    We investigated the phytoplankton distribution and its relationship to environmental factors at 40 stations in oligotrophic waters offshore from the Ulleungdo and Dokdo islands (hereafter Ulleungdo or Dokdo) in the Japan/East Sea, prior to and following an episodic windstorm event. Nutrient addition bioassay experiments (control, + N, + P, and + NP, in both the presence and absence of added Fe) were also conducted to investigate the growth response of the phytoplankton assemblage and its nutrient consumption, using surface seawater collected from stations 36 and 40, which are in the vicinity of the Dokdo. Field measurements showed that the surface water temperature ranged from 13.33 °C to 16.18 °C and the salinity ranged from 34.03 to 34.55. The nitrate + nitrite, phosphate, and silicate concentrations varied from 0.07 to 2.22 μM, 0.01 to 0.19 μM, and 0.76 to 6.93 μM, respectively. The Chl-a concentration varied from 0.36 to 15.97 μg L-1 (average 2.66 ± 3.26 μg L-1), but was significantly higher in Zone III-a (Dokdo) than in Zone I-b (between Ulljin and Ulleungdo, prior to the windstorm), Zone I-a (between Ulljin and Ulleungdo, following the windstorm), and Zone II-a (Ulleungdo) (F = 17.438, p < 0.001; ANOVA). Diatoms and Raphidophyta were the dominant phytoplankton types. Following episodic windstorm events the abundance of the raphidophyte Heterosigma akashiwo was maintained at high levels in the offshore oligotrophic area around the Ulleungdo and Dokdo, particularly in Zone III-a (F = 16.889, p < 0.001; ANOVA). In the algal bioassays conducted with and without added Fe, the in vivo fluorescence values in the + N and + NP treatments were higher than in the control and the + P treatments, which suggests that plankton biomass production was stimulated by N availability. In the + N and + NP treatments, H. akashiwo typically dominated in the initial, logarithmic, and stationary growth phases. The growth rate of the phytoplankton community in the presence of added Fe was not statistically different (p > 0.05) from that in the treatments without added Fe. The results suggest that in this area natural phytoplankton communities, including those dominated by H. akashiwo, respond rapidly to pulsed nitrogen loading events. The episodic windstorm event probably resulted in vertical mixing that brought nutrients into the euphotic upper layer. The results suggest that such events are important in triggering spring phytoplankton blooms in potentially oligotrophic waters, such as those in the vicinity of the Dokdo in the Japan/East Sea.
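
    The zone comparison quoted above (F and p values) is a one-way ANOVA; a minimal sketch with synthetic Chl-a samples per zone (illustrative values, not the cruise data) shows the computation:

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(2)

# Hypothetical Chl-a concentrations (ug/L) grouped by zone; synthetic values only.
zone_I_a = rng.lognormal(mean=0.2, sigma=0.5, size=10)
zone_I_b = rng.lognormal(mean=0.1, sigma=0.5, size=10)
zone_II_a = rng.lognormal(mean=0.4, sigma=0.5, size=10)
zone_III_a = rng.lognormal(mean=1.2, sigma=0.5, size=10)

# One-way ANOVA across the four zones.
f_stat, p_value = f_oneway(zone_I_a, zone_I_b, zone_II_a, zone_III_a)
print(f"one-way ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")
```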

  10. Return periods of losses associated with European windstorm series in a changing climate

    NASA Astrophysics Data System (ADS)

    Karremann, Melanie K.; Pinto, Joaquim G.; Reyers, Mark; Klawa, Matthias

    2015-04-01

    During the last decades, several windstorm series hit Europe, leading to large aggregated losses. Such storm series are examples of serial clustering of extreme cyclones, presenting a considerable risk for the insurance industry. Clustering of events and return periods of storm series affecting Europe are quantified based on potential losses using empirical models. Moreover, possible future changes of clustering and return periods of European storm series with high potential losses are quantified. Historical storm series are identified using 40 winters of NCEP reanalysis data (1973/1974 - 2012/2013). Time series of top events (1, 2 or 5 year return levels) are used to assess return periods of storm series both empirically and theoretically. Return periods of historical storm series are estimated based on the Poisson and the negative binomial distributions. Additionally, 800 winters of ECHAM5/MPI-OM1 general circulation model simulations for present (SRES scenario 20C: years 1960-2000) and future (SRES scenario A1B: years 2060-2100) climate conditions are investigated. Clustering is identified for most countries in Europe, and estimated return periods are similar for reanalysis and present-day simulations. Future changes of return periods are estimated for fixed return levels and fixed loss index thresholds. For the former, shorter return periods are found for Western Europe, but changes are small and spatially heterogeneous. For the latter, which combines the effects of clustering and event ranking shifts, shorter return periods are found everywhere except for Mediterranean countries. These changes are generally not statistically significant between recent and future climate. However, the return periods for the fixed loss index approach are mostly beyond the range of preindustrial natural climate variability. This is not true for fixed return levels. The quantification of losses associated with storm series permits a more adequate windstorm risk assessment in a changing climate.
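
    The counting step described above can be sketched with synthetic per-winter storm counts (not the NCEP or ECHAM5/MPI-OM1 data): Poisson and negative binomial distributions are fitted by moments and the implied return period of a winter with at least two top events is compared.

```python
import numpy as np
from scipy.stats import poisson, nbinom

# Hypothetical number of "top" storms per winter for 40 winters
# (synthetic counts for illustration only).
counts = np.array([0, 1, 0, 2, 0, 0, 1, 3, 0, 1, 0, 0, 2, 0, 1,
                   0, 0, 0, 4, 1, 0, 0, 1, 0, 2, 0, 0, 1, 0, 0,
                   3, 0, 1, 0, 0, 2, 0, 1, 0, 0])

mean, var = counts.mean(), counts.var(ddof=1)

# Poisson fit (method of moments): lambda = mean.
p_pois = poisson.sf(1, mu=mean)            # P(N >= 2)

# Negative binomial fit by moments (requires var > mean, i.e. overdispersion):
# n = mean^2 / (var - mean), p = mean / var.
n_nb = mean ** 2 / (var - mean)
p_nb = nbinom.sf(1, n_nb, mean / var)      # P(N >= 2)

print(f"Return period of a >=2-storm winter, Poisson:           {1 / p_pois:5.1f} winters")
print(f"Return period of a >=2-storm winter, negative binomial: {1 / p_nb:5.1f} winters")
```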

  11. Identification of Extreme Events Under Climate Change Conditions Over Europe and The Northwest-atlantic Region: Spatial Patterns and Time Series Characteristics

    NASA Astrophysics Data System (ADS)

    Leckebusch, G.; Ulbrich, U.; Speth, P.

    In the context of climate change and its possible impacts on socio-economic conditions for human activities, more severe consequences are expected from a changed occurrence of extreme events than from changes in the mean climate. Extreme events like floods, excessive heat and droughts or windstorms have impacts on human social and economic life in different sectors such as forestry, agriculture, energy use, tourism and the reinsurance business. Windstorms account for nearly 70% of all insured damages over Europe affecting the reinsurance business. The December 1999 French windstorms alone caused damages of about 10 billion. A new EU-funded project (MICE = Modelling the Impact of Climate Extremes) will focus on these impacts caused by changed occurrences of extreme events over Europe. Based upon the output of general circulation models as well as regional climate models, investigations are carried out with regard to time series characteristics as well as the spatial patterns of extremes under changed climate conditions. After the definition of specific thresholds for climate extremes, in this talk we will focus on the results of the analysis for the different data sets (HadCM3 and CGCMII GCMs and RCMs, re-analyses, observations) with regard to windstorm events. At first, the results of the model outputs are validated against re-analyses and observations. In particular, a comparison of the storm track (2.5 to 8 day bandpass filtered 500 hPa geopotential height), cyclone track, cyclone frequency and intensity is presented. Highly relevant to damages is the extreme wind near ground level, so the 10 m wind speed is investigated additionally. Of special interest to possible impacts is the changed spatial occurrence of wind speed maxima under 2xCO2-induced climate change.

  12. Increasing large scale windstorm damage in Western, Central and Northern European forests, 1951-2010

    NASA Astrophysics Data System (ADS)

    Gregow, H.; Laaksonen, A.; Alper, M. E.

    2017-04-01

    Using reports of forest losses caused directly by large scale windstorms (or primary damage, PD) from the European Forest Institute database (comprising 276 PD reports from 1951-2010), total growing stock (TGS) statistics of European forests and the daily North Atlantic Oscillation (NAO) index, we identify a statistically significant change in storm intensity in Western, Central and Northern Europe (17 countries). Using the validated set of storms, we found that the year 1990 represents a change-point at which the average intensity of the most destructive storms, indicated by PD/TGS > 0.08%, increased by more than a factor of three. A likelihood ratio test provides strong evidence that the change-point represents a real shift in the statistical behaviour of the time series. All but one of the seven catastrophic storms (PD/TGS > 0.2%) have occurred since 1990. Additionally, we detected a related decrease in September-November PD/TGS and an increase in December-February PD/TGS. Our analyses point to the possibility that the impact of climate change on the North Atlantic storms hitting Europe has started during the last two and a half decades.
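
    The change-point idea can be illustrated with a hedged sketch on a synthetic yearly intensity series (not the PD/TGS data): for each candidate year, the fit of a normal model with separate parameters before and after that year is compared against a single model, and the year maximizing the likelihood-ratio statistic is taken as the change-point.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical yearly storm-intensity series with a shift in 1990
# (synthetic values for illustration only).
years = np.arange(1951, 2011)
series = np.where(years < 1990,
                  rng.normal(0.04, 0.02, years.size),
                  rng.normal(0.12, 0.04, years.size))

def neg2loglik(x):
    """-2 log-likelihood of a normal model with MLE mean and variance
    (additive constants that cancel in the likelihood ratio are dropped)."""
    return x.size * np.log(np.maximum(x.var(), 1e-12)) + x.size

base = neg2loglik(series)
# Likelihood-ratio statistic for a shift at each candidate split index.
lr = np.array([base - (neg2loglik(series[:k]) + neg2loglik(series[k:]))
               for k in range(5, series.size - 5)])
best = np.argmax(lr)
print(f"Most likely change-point: {years[5 + best]} (LR statistic {lr[best]:.1f})")
```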

  13. Increasing large scale windstorm damage in Western, Central and Northern European forests, 1951–2010

    PubMed Central

    Gregow, H.; Laaksonen, A.; Alper, M. E.

    2017-01-01

    Using reports of forest losses caused directly by large scale windstorms (or primary damage, PD) from the European Forest Institute database (comprising 276 PD reports from 1951–2010), total growing stock (TGS) statistics of European forests and the daily North Atlantic Oscillation (NAO) index, we identify a statistically significant change in storm intensity in Western, Central and Northern Europe (17 countries). Using the validated set of storms, we found that the year 1990 represents a change-point at which the average intensity of the most destructive storms, indicated by PD/TGS > 0.08%, increased by more than a factor of three. A likelihood ratio test provides strong evidence that the change-point represents a real shift in the statistical behaviour of the time series. All but one of the seven catastrophic storms (PD/TGS > 0.2%) have occurred since 1990. Additionally, we detected a related decrease in September–November PD/TGS and an increase in December–February PD/TGS. Our analyses point to the possibility that the impact of climate change on the North Atlantic storms hitting Europe has started during the last two and a half decades. PMID:28401947

  14. Index insurance for pro-poor conservation of hornbills in Thailand.

    PubMed

    Chantarat, Sommarat; Barrett, Christopher B; Janvilisri, Tavan; Mudsri, Sittichai; Niratisayakul, Chularat; Poonswad, Pilai

    2011-08-23

    This study explores the potential of index insurance as a mechanism to finance community-based biodiversity conservation in areas where a strong correlation exists between natural disaster risk, keystone species populations, and the well-being of the local population. We illustrate this potential using the case of hornbill conservation in the Budo-Sungai Padi rainforests of southern Thailand, using 16-y hornbill reproduction data and 5-y household expenditures data reflecting local economic well-being. We show that severe windstorms cause both lower household expenditures and critical nest tree losses that directly constrain nesting capacity and so reduce the number of hornbill chicks recruited in the following breeding season. Forest residents' coping strategies further disturb hornbills and their forest habitats, compounding windstorms' adverse effects on hornbills' recruitment in the following year. The strong statistical relationship between wind speed and both hornbill nest tree losses and household expenditures opens up an opportunity to design wind-based index insurance contracts that could both enhance hornbill conservation and support disaster-affected households in the region. We demonstrate how such contracts could be written and operationalized and then use simulations to show the significant promise of unique insurance-based approaches to address weather-related risk that threatens both biodiversity and poor populations.
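
    As an illustration of how a wind-based index contract can be written, the sketch below uses a hypothetical strike/tick/cap payout structure; the parameter values are invented and do not represent the contract design developed in the study.

```python
def wind_index_payout(wind_speed_ms, strike=25.0, tick=1000.0, cap=20000.0):
    """Payout of a hypothetical wind-index contract.

    No payout below the strike wind speed; above it the payout grows by
    `tick` currency units per m/s of exceedance, capped at `cap`.
    All parameter values are illustrative assumptions.
    """
    return min(cap, max(0.0, wind_speed_ms - strike) * tick)

for v in (20.0, 28.0, 50.0):
    print(f"measured wind {v:4.1f} m/s -> payout {wind_index_payout(v):8.1f}")
```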

  15. Climate Change and Forest Disturbances

    Treesearch

    V. H. Dale; L. A. Joyce; S. McNulty; R. P. Neilson; M. P. Ayres; M. D. Flannigan; P. J. Hanson; L. C. Irland; A. E. Lugo; C. J. Peterson; D. Simberloff; F. J. Swanson; B. J. Stocks; B. M. Wotton

    2001-01-01

    Climate change can affect forests by altering the frequency, intensity, duration, and timing of fire, drought, introduced species, insect and pathogen outbreaks, hurricanes, windstorms, ice storms, or landslides

  16. A storm severity index based on return levels of wind speeds

    NASA Astrophysics Data System (ADS)

    Becker, Nico; Nissen, Katrin M.; Ulbrich, Uwe

    2015-04-01

    European windstorms related to extra-tropical cyclones cause considerable damage to infrastructure during the winter season. Leckebusch et al. (2008) introduced a storm severity index (SSI) based on exceedances of the local 98th percentile of wind speeds. The SSI rests on the assumption that (insured) damage usually occurs within the upper 2%-quantile of the local wind speed distribution (i.e. if the 98th percentile is exceeded). However, critical infrastructure, for example related to the power network or the transportation system, is usually designed to withstand wind speeds reaching the local 50-year return level, which is much higher than the 98th percentile. The aim of this work is to use the 50-year return level to develop a modified SSI that takes into account only the extreme wind speeds relevant to critical infrastructure. As a first step, we use the block maxima approach to estimate the spatial distribution of return levels by fitting the generalized extreme value (GEV) distribution to wind speeds retrieved from different reanalysis products. We show that the spatial distributions of the 50-year return levels derived from different reanalyses agree well within large parts of Europe. The differences between the reanalyses are largely within the range of the uncertainty intervals of the estimated return levels. As a second step, the exceedances of the 50-year return level are evaluated and compared to the exceedances of the 98th percentiles for different extreme European windstorms. The areas where the wind speeds exceed the 50-year return level in the reanalysis data largely agree with the areas where the largest damages were reported, e.g. France in the case of "Lothar" and "Martin" and Central Europe in the case of "Kyrill". Leckebusch, G. C., Renggli, D., & Ulbrich, U. (2008). Development and application of an objective storm severity measure for the Northeast Atlantic region. Meteorologische Zeitschrift, 17(5), 575-587.
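
    The block-maxima step described above can be sketched with synthetic annual-maximum gusts at a single grid point (not reanalysis data): a GEV is fitted with scipy and the 50-year return level is read off as the (1 - 1/50) quantile of the fitted annual-maximum distribution.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(11)

# Hypothetical annual maximum wind gusts (m/s) at one grid point;
# synthetic values for illustration only.
annual_max = rng.gumbel(loc=28.0, scale=4.0, size=60)

# Fit the generalized extreme value distribution (block maxima approach).
shape, loc, scale = genextreme.fit(annual_max)

# The T-year return level is the (1 - 1/T) quantile of the annual-maximum GEV.
T = 50
return_level_50 = genextreme.ppf(1.0 - 1.0 / T, shape, loc=loc, scale=scale)
print(f"Estimated 50-year return level: {return_level_50:.1f} m/s")

# Exceedance of the 50-year return level in a storm footprint would then be
# flagged wherever the storm's maximum gust exceeds this local threshold.
```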

  17. Natural hazard metaphors for financial crises

    NASA Astrophysics Data System (ADS)

    Woo, Gordon

    2001-02-01

    Linguistic metaphors drawn from natural hazards are commonly used at times of financial crisis. A brewing storm, a seismic shock, etc., evoke the abruptness and severity of a market collapse. If the language of windstorms, earthquakes and volcanic eruptions is helpful in illustrating a financial crisis, what about the mathematics of natural catastrophes? Already, earthquake prediction methods have been applied to economic recessions, and volcanic eruption forecasting techniques have been applied to market crashes. The purpose of this contribution is to survey broadly the mathematics of natural catastrophes, so as to convey the range of underlying principles, some of which may serve as mathematical metaphors for financial applications.

  18. 38 CFR 36.4369 - Correction of structural defects.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... (CONTINUED) LOAN GUARANTY Guaranty or Insurance of Loans to Veterans With Electronic Reporting § 36.4369... fire, earthquake, flood, windstorm, or waste, which seriously affects the livability of the dwelling...

  19. Lightning activity during the 1999 Superior derecho

    NASA Astrophysics Data System (ADS)

    Price, Colin G.; Murphy, Brian P.

    2002-12-01

    On 4 July 1999, a severe convective windstorm, known as a derecho, caused extensive damage to forested regions along the United States/Canada border, west of Lake Superior. There were 665,000 acres of forest destroyed in the Boundary Waters Canoe Area Wilderness (BWCAW) in Minnesota and Quetico Provincial Park in Canada, with approximately 12.5 million trees blown down. This storm resulted in additional severe weather before and after the occurrence of the derecho, with continuous cloud-to-ground (CG) lightning occurring for more than 34 hours during its path across North America. At the time of the derecho the percentage of positive cloud-to-ground (+CG) lightning measured by the Canadian Lightning Detection Network (CLDN) was greater than 70% for more than three hours, with peak values reaching 97% positive CG lightning. Such high ratios of +CG are rare, and may be useful indicators of severe weather.

  20. Lightning Activity During the 1999 Superior Derecho

    NASA Astrophysics Data System (ADS)

    Price, C. G.; Murphy, B. P.

    2002-12-01

    On 4 July 1999, a severe convective windstorm, known as a derecho, caused extensive damage to forested regions along the United States/Canada border, west of Lake Superior. There were 665,000 acres of forest destroyed in the Boundary Waters Canoe Area Wilderness (BWCAW) in Minnesota and Quetico Provincial Park in Canada, with approximately 12.5 million trees blown down. This storm resulted in additional severe weather before and after the occurrence of the derecho, with continuous cloud-to-ground (CG) lightning occurring for more than 34 hours during its path across North America. At the time of the derecho the percentage of positive cloud-to-ground (+CG) lightning measured by the Canadian Lightning Detection Network (CLDN) was greater than 70% for more than three hours, with peak values reaching 97% positive CG lightning. Such high ratios of +CG are rare, and may be useful indicators of severe weather.

  1. Increased wind risk from sting-jet windstorms with climate change

    NASA Astrophysics Data System (ADS)

    Martínez-Alvarado, Oscar; Gray, Suzanne L.; Hart, Neil C. G.; Clark, Peter A.; Hodges, Kevin; Roberts, Malcolm J.

    2018-04-01

    Extra-tropical cyclones dominate autumn and winter weather over western Europe. The strongest cyclones, often termed windstorms, have a large socio-economic impact at landfall due to strong surface winds and coastal storm surges. Climate model integrations have predicted a future increase in the frequency of, and potential damage from, European windstorms, and yet these integrations cannot properly represent localised jets, such as sting jets, that may significantly enhance damage. Here we present the first prediction of how the climatology of sting-jet-containing cyclones will change in a future warmer climate, considering the North Atlantic and Europe. A proven sting-jet precursor diagnostic is applied to 13-year present-day and future (~2100) climate integrations from the Met Office Unified Model in its Global Atmosphere 3.0 configuration. The present-day climate results are consistent with previously published results from a reanalysis dataset (with around 32% of cyclones exhibiting the sting-jet precursor), lending credibility to the analysis of the future-climate integration. The proportion of cyclones exhibiting the sting-jet precursor in the future-climate integration increases to 45%. Furthermore, while the proportion of explosively-deepening storms increases only slightly in the future climate, the proportion of those storms with the sting-jet precursor increases by 60%. The European resolved-wind risk associated with explosively-deepening storms containing a sting-jet precursor increases substantially in the future climate; in reality this wind risk is likely to be further enhanced by the release of localised moist instability, unresolved by typical climate models.

  2. Hydraulic and Wave Aspects of Novorossiysk Bora

    NASA Astrophysics Data System (ADS)

    Shestakova, Anna A.; Moiseenko, Konstantin B.; Toropov, Pavel A.

    2018-02-01

    Bora in Novorossiysk (a seaport on the Black Sea coast of the Caucasus) is one of the strongest and most prominent downslope windstorms on the territory of Russia. In this paper, we evaluate the applicability of the hydraulic and wave hypotheses, which are widely used for downslope winds around the world, to the Novorossiysk bora on the basis of observational data, reanalysis, and mesoscale numerical modeling with WRF-ARW. It is shown that the formation mechanism of the Novorossiysk bora is essentially mixed, which is expressed in the simultaneous presence of gravity wave breaking and a hydraulic jump, as well as in the significant variability of the contribution of wave processes to the windstorm dynamics. The effectiveness of each mechanism depends on the intensity of the elevated inversion and the height of the mean-state critical level. The most favorable conditions for both mechanisms acting together are a moderate or weak inversion and a high or absent critical level.

  3. AmeriFlux US-Slt Silas Little- New Jersey

    DOE Data Explorer

    Clark, Ken [USDA Forest Service

    2016-01-01

    This is the AmeriFlux version of the carbon flux data for the site US-Slt Silas Little- New Jersey. Site Description - Wildfires, prescribed fires, insect defoliation events and windstorms are the common disturbances in the NJ Pinelands. The oak-dominated forest at Silas Little Experimental Forest was most recently defoliated by Gypsy moth (Lymantria dispar L.) in 2006 to 2008, with complete defoliation occurring in 2007. Following this multi-year defoliation event, oak mortality was significant and resulted in the death of approximately 20% of the overstory oaks and a similar reduction in stand biomass. Previous disturbances have included windstorms and earlier Gypsy moth defoliation events in the 1990s. The last major wildfire to occur at and near the Experimental Forest was in 1963. Since then, a number of prescribed fires have been conducted in the vicinity of the Silas Little flux site.

  4. 7 CFR 1779.2 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Commitment for Guarantee. The Agency's written statement to the lender that the material submitted is..., windstorm, lightning, hail, explosion, riot, civil commotion, aircraft, vehicles, smoke, builder's risk... directly involved in the operation and management of the borrower. Protective advances. Advances made by...

  5. National Windstorm Impact Reduction Act Reauthorization of 2014

    THOMAS, 113th Congress

    Rep. Neugebauer, Randy [R-TX-19

    2013-04-26

    Senate - 07/15/2014 Received in the Senate and Read twice and referred to the Committee on Commerce, Science, and Transportation. (All Actions) Tracker: This bill has the status Passed House.

  6. Can we trust climate models to realistically represent severe European windstorms?

    NASA Astrophysics Data System (ADS)

    Trzeciak, Tomasz M.; Knippertz, Peter; Owen, Jennifer S. R.

    2014-05-01

    Despite the enormous advances made in climate change research, robust projections of the position and the strength of the North Atlantic storm track are not yet possible. In particular with respect to damaging windstorms, this uncertainty bears enormous risks for European societies and the (re)insurance industry. Previous studies have addressed the problem of climate model uncertainty through statistical comparisons of simulations of the current climate with (re-)analysis data and found that there is large disagreement between different climate models, different ensemble members of the same model, and observed climatologies of intense cyclones. One weakness of such statistical evaluations lies in the difficulty of separating influences of the climate model's basic state from the influence of fast processes on the development of the most intense storms. Compensating effects between the two might conceal errors and suggest higher reliability than there really is. A possible way to separate influences of fast and slow processes in climate projections is through a "seamless" approach of hindcasting historical, severe storms with climate models started from predefined initial conditions and run in a numerical weather prediction mode on the time scale of several days. Such a cost-effective case-study approach, which draws from and expands on the concepts of the Transpose-AMIP initiative, has recently been undertaken in the SEAMSEW project at the University of Leeds, funded by the AXA Research Fund. Key results from this work, focusing on 20 historical storms and using different lead times and horizontal and vertical resolutions, include: (a) Tracks are represented reasonably well by most hindcasts. (b) Sensitivity to vertical resolution is low. (c) There is a systematic underprediction of cyclone depth for a coarse resolution of T63, but surprisingly no systematic bias is found for higher-resolution runs using T127, showing that climate models are in fact able to represent the storm dynamics well, if given the correct initial conditions. Combined with a too low number of deep cyclones in many climate models, this points to an insufficient number of storm-prone initial conditions in free-running climate runs. This question will be addressed in future work.

  7. 7 CFR 3575.2 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... material submitted is approved subject to the completion of all conditions and requirements contained in.... Insurance. Fire, windstorm, lightning, hail, explosion, riot, civil commotion, aircraft, vehicles, smoke... borrower. Problem loan. A loan which is not complying with its terms and conditions. Protective advances...

  8. 7 CFR 3575.2 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... material submitted is approved subject to the completion of all conditions and requirements contained in.... Insurance. Fire, windstorm, lightning, hail, explosion, riot, civil commotion, aircraft, vehicles, smoke... borrower. Problem loan. A loan which is not complying with its terms and conditions. Protective advances...

  9. 7 CFR 3575.2 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... material submitted is approved subject to the completion of all conditions and requirements contained in.... Insurance. Fire, windstorm, lightning, hail, explosion, riot, civil commotion, aircraft, vehicles, smoke... borrower. Problem loan. A loan which is not complying with its terms and conditions. Protective advances...

  10. 7 CFR 4279.143 - Insurance.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... value of the collateral or the amount of the loan. Hazard insurance includes fire, windstorm, lightning... Regulations of the Department of Agriculture (Continued) RURAL BUSINESS-COOPERATIVE SERVICE AND RURAL... Insurance. (a) Hazard. Hazard insurance with a standard mortgage clause naming the lender as beneficiary...

  11. 7 CFR 4279.143 - Insurance.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... value of the collateral or the amount of the loan. Hazard insurance includes fire, windstorm, lightning... Regulations of the Department of Agriculture (Continued) RURAL BUSINESS-COOPERATIVE SERVICE AND RURAL... Insurance. (a) Hazard. Hazard insurance with a standard mortgage clause naming the lender as beneficiary...

  12. 7 CFR 4279.143 - Insurance.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... value of the collateral or the amount of the loan. Hazard insurance includes fire, windstorm, lightning... Regulations of the Department of Agriculture (Continued) RURAL BUSINESS-COOPERATIVE SERVICE AND RURAL... Insurance. (a) Hazard. Hazard insurance with a standard mortgage clause naming the lender as beneficiary...

  13. 7 CFR 4279.143 - Insurance.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... value of the collateral or the amount of the loan. Hazard insurance includes fire, windstorm, lightning... Regulations of the Department of Agriculture (Continued) RURAL BUSINESS-COOPERATIVE SERVICE AND RURAL... Insurance. (a) Hazard. Hazard insurance with a standard mortgage clause naming the lender as beneficiary...

  14. 78 FR 38702 - Brenda Wirkkala See; Notice of Application Accepted for Filing, Soliciting Comments, Motions To...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-27

    ... windstorm in December of 2007 which damaged the power line. The cost of restoring the project is too great... Documents: Any filing must (1) bear in all capital letters the title ``COMMENTS'', ``PROTEST'', or ``MOTION...

  15. Braving the Elements: Protecting Schools against Weather-Related Disasters.

    ERIC Educational Resources Information Center

    Breighner, Mary

    1997-01-01

    Discusses common weather-related hazards (floods, windstorms, and winter storms) and provides some steps administrators can take to protect their schools. Suggests administrators periodically assess their school's commitment to loss control, housekeeping, suitable building construction and reinforcement, sprinkler systems, water supply,…

  16. 24 CFR 3280.306 - Windstorm protection.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... URBAN DEVELOPMENT MANUFACTURED HOME CONSTRUCTION AND SAFETY STANDARDS Body and Frame Construction... frame structure to be used as the points for connection of diagonal ties, no specific connecting devices need be provided on the main frame structure. (b) Contents of instructions. (1) The manufacturer must...

  17. 24 CFR 3280.306 - Windstorm protection.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... URBAN DEVELOPMENT MANUFACTURED HOME CONSTRUCTION AND SAFETY STANDARDS Body and Frame Construction... frame structure to be used as the points for connection of diagonal ties, no specific connecting devices need be provided on the main frame structure. (b) Contents of instructions. (1) The manufacturer must...

  18. Windstorms and Insured Loss in the UK: Modelling the Present and the Future

    NASA Astrophysics Data System (ADS)

    Hewston, R.; Dorling, S.; Viner, D.

    2006-12-01

    Worldwide, the costs of catastrophic weather events have increased dramatically in recent years, with average annual insured losses rising from a negligible level in 1950 to over $10bn in 2005 (Munich Re 2006). When losses from non-catastrophic weather related events are included, this figure is doubled. A similar trend is exhibited in the UK, with claims totalling over £6bn for the period 1998-2003, more than twice the value for the previous five years (Dlugolecki 2004). More than 70% of this loss is associated with storms. Extratropical cyclones are the main source of wind damage in the UK. In this research, a windstorm model is constructed to simulate patterns of insured loss associated with wind damage in the UK. Observed daily maximum wind gust speeds and a variety of socioeconomic datasets are utilised in a GIS-generated model, which is verified against actual domestic property insurance claims data from two major insurance providers. The increased frequency and intensity of extreme events which are anticipated to accompany climate change in the UK will have a direct effect on general insurance, with the greatest impact expected to be on property insurance (Dlugolecki 2004). A range of experiments will be run using Regional Climate Model output data, in conjunction with the windstorm model, to simulate possible future losses resulting from climate change, assuming no alteration to the vulnerability of the building stock. Losses for the periods 2020-2050 and 2070-2100 will be simulated under the various IPCC emissions scenarios. Munich Re (2006). Annual Review: Natural Catastrophes 2005. Munich, Munich Re: 52. Dlugolecki, A. (2004). A Changing Climate for Insurance - A summary report for Chief Executives and Policymakers, Association of British Insurers
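
    One common way to turn gust observations and exposure data into a relative loss estimate, offered here only as a hedged sketch and not as the authors' GIS-based model, is a cubed-exceedance loss index aggregated over districts (all input values below are synthetic):

```python
import numpy as np

rng = np.random.default_rng(5)
n_districts = 100

# Hypothetical inputs per district: storm-day maximum gust, local 98th
# percentile gust climatology, and insured property value (all synthetic).
gust = rng.gamma(8.0, 3.5, n_districts)
gust98 = np.full(n_districts, 22.0)
sum_insured = rng.lognormal(mean=18.0, sigma=0.5, size=n_districts)

# Relative loss index: insured value times cubed relative gust exceedance,
# in the spirit of percentile-based storm-loss models.
rel_exceed = np.maximum(gust / gust98 - 1.0, 0.0)
loss_index = sum_insured * rel_exceed ** 3

print(f"Aggregate loss index: {loss_index.sum():.3e}")
print(f"Districts contributing to loss: {(loss_index > 0).sum()} of {n_districts}")
```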

  19. Ecosystem disturbances in Central European spruce forests: a multi-proxy integration of dendroecology and sedimentary records

    NASA Astrophysics Data System (ADS)

    Clear, Jennifer; Chiverrell, Richard; Kunes, Petr; Svoboda, Miroslav; Boyle, John

    2016-04-01

    Disturbance dynamics in forest ecosystems show signs of perturbation under changing climate regimes, with the frequency and intensity of events (e.g. pathogen outbreaks in North America and Central Europe) amplified, becoming more frequent and severe. The montane Norway spruce (Picea abies) dominated forests of Central Europe are a niche habitat and environment, situated outside the species' natural boreal distribution (e.g. Fenno-Scandinavia). These communities are at or near their ecological limits and are vulnerable to both short-term disturbances (e.g. fire, windstorm and pathogens) and longer-term environmental change (e.g. climate-induced stress and changing disturbance patterns). Researchers have linked negative impacts on spruce forest with both wind disturbance (wind-throw) and outbreaks of spruce bark beetle (Ips typographus), and there is growing evidence for co-association, with wind damage enhancing pathogenic outbreaks. Examples include, in the Bohemian Forest (Czech Republic), the mid-1990s spruce bark beetle outbreak and the 2007 windstorm and subsequent bark beetle outbreak. In the High Tatra Mountains (Slovakia) there is a further co-association of forest disturbance with windstorms (2004 and 2014) and an ongoing bark beetle outbreak. The scale and severity of these recent outbreaks of spruce bark beetle are unprecedented in the historical forest records. Here, we present findings from ongoing research developing and integrating data from dendroecological, sedimentary palaeoecological and geochemical time series to develop a longer-term perspective on forest dynamics in these regions. Tree-ring series from plots or forest stands (>500) are used alongside lake (5) and forest hollow (3) sediments from the Czech and Slovak Republics to explore the local, regional and biogeographical scale of forest disturbances. Dendroecological data showing tree-ring gap recruitment and post-suppression growth release highlight frequent disturbance events focused on tree or forest-stand spatial scales, but patchy in terms of recurrence. They also highlight elevated levels of disturbance in the late 19th century. Sediment records from lakes and forest hollows record variable pollen influx (beetle host / non-host ratios) and a stratigraphy that includes mineral in-wash events. μXRF scanning of lakes in the region with varying catchments and catchment-to-lake area ratios shows spikes in K, Zr and Ti concentrations reflecting frequent erosive episodes throughout the Holocene. Linking across the temporal scales inherent in dendroecological (0 to 250 years) and sedimentary (0 to 11,500 years) records is enhancing our understanding of disturbance dynamics. The identified recent and ongoing forest disturbances, coupled with well-evidenced events in the 19th century, highlight the need for the longer sedimentary perspective to assess whether contemporary climate warming has stretched, and continues to stretch, the resilience of these fragile ecosystems. Our data are informative to the ongoing land-management conflict between active forest management (harvesting valuable timber and salvage logging) and the forest conservation agenda encouraging natural forest dynamics and disturbance recovery.

  20. 7 CFR 3560.105 - Insurance and taxes.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ...) Windstorm Coverage. (ii) Earthquake Coverage. (iii) Sinkhole Insurance or Mine Subsidence Insurance. (3) For... the total insured value. (iv) Earthquake Coverage. In the event that the borrower obtains earthquake... insurance or mine subsidence insurance should be similar to what would be required for earthquake insurance...

  1. 7 CFR 3560.105 - Insurance and taxes.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ...) Windstorm Coverage. (ii) Earthquake Coverage. (iii) Sinkhole Insurance or Mine Subsidence Insurance. (3) For... the total insured value. (iv) Earthquake Coverage. In the event that the borrower obtains earthquake... insurance or mine subsidence insurance should be similar to what would be required for earthquake insurance...

  2. 7 CFR 3560.105 - Insurance and taxes.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ...) Windstorm Coverage. (ii) Earthquake Coverage. (iii) Sinkhole Insurance or Mine Subsidence Insurance. (3) For... the total insured value. (iv) Earthquake Coverage. In the event that the borrower obtains earthquake... insurance or mine subsidence insurance should be similar to what would be required for earthquake insurance...

  3. 7 CFR 3560.105 - Insurance and taxes.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ...) Windstorm Coverage. (ii) Earthquake Coverage. (iii) Sinkhole Insurance or Mine Subsidence Insurance. (3) For... the total insured value. (iv) Earthquake Coverage. In the event that the borrower obtains earthquake... insurance or mine subsidence insurance should be similar to what would be required for earthquake insurance...

  4. 7 CFR 3560.105 - Insurance and taxes.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...) Windstorm Coverage. (ii) Earthquake Coverage. (iii) Sinkhole Insurance or Mine Subsidence Insurance. (3) For... the total insured value. (iv) Earthquake Coverage. In the event that the borrower obtains earthquake... insurance or mine subsidence insurance should be similar to what would be required for earthquake insurance...

  5. 7 CFR 1951.877 - Loan agreements.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... property being insured or the amount of the loan. Hazard insurance includes fire, windstorm, lightning... Agriculture Regulations of the Department of Agriculture (Continued) RURAL HOUSING SERVICE, RURAL BUSINESS... agreements. (a) A loan agreement will have been executed by the RDLF intermediary and OCS or HHS for each...

  6. Matching current windstorms to historical analogues

    NASA Astrophysics Data System (ADS)

    Becker, Bernd; Maisey, Paul; Scannell, Claire; Vanvyve, Emilie; Mitchell, Lorna; Steptoe, Hamish

    2015-04-01

    European windstorms are capable of producing devastating socioeconomic impacts. They can cause power outages affecting millions of people, close transport networks, uproot trees, and cause walls, buildings and other structures to collapse, which in the worst cases has resulted in dozens of fatalities. In Europe windstorms present the greatest natural hazard risk for primary insurers and result in the greatest aggregate loss due to the high volume of claims. In the winter of 2013/2014 alone, storms Christian, Xaver, Dirk and Tini cost the insurance industry an estimated EUR 2.5 bn. Here we make use of a high resolution (4 km) historical storm footprint catalogue which contains over 6000 storms. This catalogue was created using the 35-year ERA-Interim model reanalysis dataset, downscaled to 12 km and then to 4.4 km. This approach was taken in order to provide a long-term, high-resolution data set consistent with Met Office high-resolution deterministic forecast capability for Europe. The footprints are defined as the maximum 3-second gust at each model grid point over a 72-hour period during each storm. Matches between current/forecast storm footprints and footprints from the historical catalogue are found using fingerprint identification techniques, by way of calculating image texture derived from the gray-level co-occurrence matrix (Haralick, 1973). The best match is found by firstly adding the current or forecast footprint to the stack of the historical storm catalogue. An "identical twin" or "best match" of this footprint is then sought from within this stack. This search is repeated for a set of measures (15 in total) including position of the strongest gusts, storm damage potential and 13 Haralick measures. Each time a candidate is found, the nearest neighbours are noted and a rank proximity measure is calculated. Finally, the Frobenius norm (the grid-point-averaged distance between the two fields) is calculated. This provides an independent assessment of the goodness of fit made by the rank proximity measure. Using this technique a series of potential historical footprints matching the current footprint is found. Each potential match is indexed according to its closeness to the current footprint, where an index rating of 0 is a perfect match or "identical twin". Such pattern matching of current and forecast windstorms against a historical archive can enable insurers to make a rapid estimate of likely loss and aid the timely deployment of staff and funds at the right level.
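
    A hedged sketch of the matching idea, using synthetic gust footprints rather than the 4.4 km catalogue: each footprint is quantised, a few Haralick-style texture measures are derived from its grey-level co-occurrence matrix (scikit-image >= 0.19 is assumed for graycomatrix/graycoprops), candidates are ranked by texture distance, and the Frobenius norm of the field difference serves as an independent check.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

rng = np.random.default_rng(9)

def texture_features(footprint, levels=16):
    """Haralick-style texture features of a gust footprint (2-D array, m/s)."""
    # Quantise the footprint to `levels` grey levels for the co-occurrence matrix.
    q = np.digitize(footprint, np.linspace(footprint.min(), footprint.max(), levels))
    q = np.clip(q - 1, 0, levels - 1).astype(np.uint8)
    glcm = graycomatrix(q, distances=[1], angles=[0, np.pi / 2],
                        levels=levels, symmetric=True, normed=True)
    props = ("contrast", "homogeneity", "energy", "correlation")
    return np.array([graycoprops(glcm, p).mean() for p in props])

# Synthetic current footprint and a small "historical catalogue" (illustrative only).
current = rng.gamma(6.0, 4.0, size=(80, 80))
catalogue = [rng.gamma(6.0, 4.0, size=(80, 80)) for _ in range(5)]

f_current = texture_features(current)
for i, hist in enumerate(catalogue):
    texture_dist = np.linalg.norm(texture_features(hist) - f_current)
    frob = np.linalg.norm(hist - current)   # independent grid-point check
    print(f"candidate {i}: texture distance {texture_dist:7.3f}, Frobenius norm {frob:8.1f}")
```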

  7. Carbon and nitrogen loss in windblown dust on the Columbia Plateau

    USDA-ARS?s Scientific Manuscript database

    Soil erosion from windstorms may lead to high nutrient loss in fields and cause environmental degradation as a result of suspension in the atmosphere or deposition in surface water systems. In particular, high wind weather events can emit particulates from tilled agricultural soils on the Columbia P...

  8. 24 CFR 3280.306 - Windstorm protection.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... vertical building projection, as horizontal wind load, and across the surface of the full roof structure... applied in the design of the tiedown system. The dead load of the structure may be used to resist these... manufacturer's installation instructions provide for the main frame structure to be used as the points for...

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    McDonald, J R; Murray, R

    High winds tend to pick up and transport various objects and debris, which are referred to as wind-borne missiles or tornado missiles, depending on the type of storm. Missiles cause damage by perforating the building envelope or by collapsing structural elements such as walls, columns or frames. The primary objectives of this study are as follows: (1) to provide a basis for wind-borne or tornado missile criteria for the design and evaluation of DOE facilities, and (2) to provide guidelines for the design and evaluation of impact-resistant missile barriers for DOE facilities. The first objective is accomplished through a synthesis of information from windstorm damage documentation experience and computer simulation of missile trajectories. The second objective is accomplished by reviewing the literature, which describes various missile impact tests, and by conducting a series of impact tests at a Texas Tech University facility to fill in missing information.

  10. Can we trust climate models to realistically represent severe European windstorms?

    NASA Astrophysics Data System (ADS)

    Trzeciak, Tomasz M.; Knippertz, Peter; Pirret, Jennifer S. R.; Williams, Keith D.

    2016-06-01

    Cyclonic windstorms are one of the most important natural hazards for Europe, but robust climate projections of the position and the strength of the North Atlantic storm track are not yet possible, bearing significant risks to European societies and the (re)insurance industry. Previous studies addressing the problem of climate model uncertainty through statistical comparisons of simulations of the current climate with (re-)analysis data show large disagreement between different climate models, different ensemble members of the same model and observed climatologies of intense cyclones. One weakness of such evaluations lies in the difficulty of separating influences of the climate model's basic state from the influence of fast processes on the development of the most intense storms, which could create compensating effects and therefore suggest higher reliability than there really is. This work aims to shed new light on this problem through a cost-effective "seamless" approach of hindcasting 20 historical severe storms with two global climate models, ECHAM6 and the GA4 configuration of the Met Office Unified Model, run in numerical weather prediction mode using different lead times and different horizontal and vertical resolutions. These runs are then compared to re-analysis data. The main conclusions from this work are: (a) objectively identified cyclone tracks are represented satisfactorily by most hindcasts; (b) sensitivity to vertical resolution is low; (c) cyclone depth is systematically under-predicted for a coarse resolution of T63 by both climate models; (d) no systematic bias is found for the higher resolution of T127 out to about three days, demonstrating that climate models are in fact able to represent the complex dynamics of explosively deepening cyclones well, if given the correct initial conditions; (e) an analysis using a recently developed diagnostic tool based on the surface pressure tendency equation points to too weak diabatic processes, mainly latent heating, as the main source of the under-prediction in the coarse-resolution runs. Finally, an interesting implication of these results is that the deficit of deep cyclones in many free-running climate simulations may therefore be related to an insufficient number of storm-prone initial conditions. This question will be addressed in future work.
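
    As a purely illustrative aside, the kind of statement made in conclusion (c) reduces to a simple bias statistic over the hindcast set; the central-pressure values below are synthetic placeholders, not the ECHAM6 or Unified Model results.

      # Python sketch: mean central-pressure bias of hindcasts vs. reanalysis
      import numpy as np

      # Minimum central pressure (hPa) of each storm in reanalysis and two hindcast sets.
      reanalysis    = np.array([965., 958., 972., 949., 961.])
      hindcast_t63  = np.array([974., 969., 979., 960., 970.])   # coarse resolution
      hindcast_t127 = np.array([966., 957., 973., 951., 960.])   # higher resolution

      for label, run in [("T63", hindcast_t63), ("T127", hindcast_t127)]:
          bias = np.mean(run - reanalysis)   # positive bias = cyclones too shallow
          print(f"{label}: mean central-pressure bias = {bias:+.1f} hPa")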

  11. 7 CFR 1753.48 - Procurement procedures.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... that is estimated to cost $250,000, or less, inclusive of labor and materials. (2) The procedures to be...) Materials on hand, until released to the contractor, shall be covered by fire and either wind-storm or... construction are necessary, and the cost of such changes or corrections is properly chargeable to the borrower...

  12. 7 CFR 1753.48 - Procurement procedures.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... that is estimated to cost $250,000, or less, inclusive of labor and materials. (2) The procedures to be...) Materials on hand, until released to the contractor, shall be covered by fire and either wind-storm or... construction are necessary, and the cost of such changes or corrections is properly chargeable to the borrower...

  13. 7 CFR 1753.48 - Procurement procedures.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... that is estimated to cost $250,000, or less, inclusive of labor and materials. (2) The procedures to be...) Materials on hand, until released to the contractor, shall be covered by fire and either wind-storm or... construction are necessary, and the cost of such changes or corrections is properly chargeable to the borrower...

  14. 7 CFR 1753.48 - Procurement procedures.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... that is estimated to cost $250,000, or less, inclusive of labor and materials. (2) The procedures to be...) Materials on hand, until released to the contractor, shall be covered by fire and either wind-storm or... construction are necessary, and the cost of such changes or corrections is properly chargeable to the borrower...

  15. Evaluating the vulnerability of Maine forests to wind damage

    Treesearch

    Thomas E. Perry; Jeremy S. Wilson

    2010-01-01

    Numerous factors, some of which cannot be controlled, are continually interacting with the forest resource, introducing risk to management, and making consistent predictable management outcomes uncertain. Included in these factors are threats or hazards such as windstorms and wildfire. Factors influencing the probability (risk) of windthrow or windsnap occurring can be...

  16. Menominee Tribal Enterprises forest regeneration efforts

    Treesearch

    Suzanne M. Beilfuss

    2002-01-01

    Menominee Tribal Enterprises (MTE) is located in northeastern Wisconsin on the Menominee Indian Reservation, which includes ten townships of mostly forested land. Past fires, windstorms, and logging all have affected the composition and structure of this forest, which brings us to why regeneration on the forest is very important. Stands are regenerated with tree...

  17. Gainesville's urban forest canopy cover

    Treesearch

    Francisco Escobedo; Jennifer A. Seitz; Wayne Zipperer

    2009-01-01

    Ecosystem benefits from trees are linked directly to the amount of healthy urban forest canopy cover. Urban forest cover is dynamic and changes over time due to factors such as urban development, windstorms, tree removals, and growth. The amount of a city's canopy cover depends on its land use, climate, and people's preferences. This fact sheet examines how...

  18. 24 CFR 3280.306 - Windstorm protection.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ..., and across the surface of the full roof structure, as uplift loading. For Wind Zones II and III, the... the structure may be used to resist these wind loading effects in all Wind Zones. (1) The provisions... frame structure to be used as the points for connection of diagonal ties, no specific connecting devices...

  19. 24 CFR 3280.306 - Windstorm protection.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ..., and across the surface of the full roof structure, as uplift loading. For Wind Zones II and III, the... the structure may be used to resist these wind loading effects in all Wind Zones. (1) The provisions... frame structure to be used as the points for connection of diagonal ties, no specific connecting devices...

  20. Ecological importance of intermediate windstorms rivals large, infrequent disturbances in the northern Great Lakes

    Treesearch

    Kirk M. Stueve; Charles H. (Hobie) Perry; Mark D. Nelson; Sean P. Healey; Andrew D. Hill; Gretchen G. Moisen; Warren B. Cohen; Dale D. Gormanson; Chengquan Huang

    2011-01-01

    Exogenous disturbances are critical agents of change in temperate forests capable of damaging trees and influencing forest structure, composition, demography, and ecosystem processes. Forest disturbances of intermediate magnitude and intensity receive relatively sparse attention, particularly at landscape scales, despite influencing most forests at least once per...

  1. 7 CFR 4274.338 - Loan agreements between the Agency and the intermediary.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... loan. Hazard insurance includes fire, windstorm, lightning, hail, business interruption, explosion... recipients. (B) These reports shall contain information only on the IRP revolving loan fund, or if other... an extra payment on the loan, any funds it has received and not used in accordance with the work plan...

  2. 7 CFR 4274.338 - Loan agreements between the Agency and the intermediary.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... loan. Hazard insurance includes fire, windstorm, lightning, hail, business interruption, explosion... recipients. (B) These reports shall contain information only on the IRP revolving loan fund, or if other... an extra payment on the loan, any funds it has received and not used in accordance with the work plan...

  3. 7 CFR 4274.338 - Loan agreements between the Agency and the intermediary.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... loan. Hazard insurance includes fire, windstorm, lightning, hail, business interruption, explosion... recipients. (B) These reports shall contain information only on the IRP revolving loan fund, or if other... an extra payment on the loan, any funds it has received and not used in accordance with the work plan...

  4. Major cause of unprecedented Arctic warming in January 2016: Critical role of an Atlantic windstorm

    PubMed Central

    Kim, Baek-Min; Hong, Ja-Young; Jun, Sang-Yoon; Zhang, Xiangdong; Kwon, Hataek; Kim, Seong-Joong; Kim, Joo-Hong; Kim, Sang-Woo; Kim, Hyun-Kyung

    2017-01-01

    In January 2016, the Arctic experienced an extremely anomalous warming event after an extraordinary increase in air temperature at the end of 2015. During this event, a strong intrusion of warm and moist air and an increase in downward longwave radiation, as well as a loss of sea ice in the Barents and Kara seas, were observed. Observational analyses revealed that the abrupt warming was triggered by the entry of a strong Atlantic windstorm into the Arctic in late December 2015, which brought enormous moist and warm air masses to the Arctic. Although the storm terminated at the eastern coast of Greenland in late December, it was followed by a prolonged blocking period in early 2016 that sustained the extreme Arctic warming. Numerical experiments indicate that the warming effect of sea ice loss and associated upward turbulent heat fluxes are relatively minor in this event. This result suggests the importance of the synoptically driven warm and moist air intrusion into the Arctic as a primary contributing factor of this extreme Arctic warming event. PMID:28051170

  5. Return period estimates for European windstorm clusters: a multi-model perspective

    NASA Astrophysics Data System (ADS)

    Renggli, Dominik; Zimmerli, Peter

    2017-04-01

    Clusters of storms over Europe can lead to very large aggregated losses. Realistic return period estimates for such clusters are therefore of vital interest to the (re)insurance industry. Such return period estimates are usually derived from historical storm activity statistics of the last 30 to 40 years. However, climate models provide an alternative source, potentially representing thousands of simulated storm seasons. In this study, we made use of decadal hindcast data from eight different climate models in the CMIP5 archive. We used an objective tracking algorithm to identify individual windstorms in the climate model data. The algorithm also computes a (population-density-weighted) Storm Severity Index (SSI) for each of the identified storms (on both a continental and a more regional basis). We derived return period estimates for the cluster seasons 1990, 1999, 2013/2014 and 1884 in the following way: For each climate model, we extracted two different exceedance frequency curves. The first describes the exceedance frequency (or the return period as the inverse of it) of a given SSI level due to an individual storm occurrence. The second describes the exceedance frequency of the seasonally aggregated SSI level (i.e. the sum of the SSI values of all storms in a given season). Starting from appropriate return period assumptions for each individual storm of a historical cluster (e.g. Anatol, Lothar and Martin in 1999) and using the first curve, we extracted the SSI levels at the corresponding return periods. Summing these SSI values results in the seasonally aggregated SSI value. Combining this with the second (aggregated) exceedance frequency curve results in a return period estimate of the historical cluster season. Since we do this for each model separately, we obtain eight different return period estimates for each historical cluster. In this way, we obtained the following return period estimates: 50 to 80 years for the 1990 season, 20 to 45 years for the 1999 season, 3 to 4 years for the 2013/2014 season, and 14 to 16 years for the 1884 season. More detailed results show substantial variation between five different regions (UK, France, Germany, Benelux and Scandinavia), as expected from the path and footprints of the different events. For example, the 1990 season is estimated to be well beyond a 100-year season for Germany and Benelux. The 1999 season clearly was extreme for France, whereas 1884 was very disruptive for the UK. Such return period estimates can be used as an independent benchmark for other approaches quantifying clustering of European windstorms. The study might also serve as an example of how to derive similar risk measures for other climate-related perils from a robust, publicly available data source.
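
    The two-curve procedure described above can be sketched in a few lines. The exceedance-frequency curves and the per-storm return periods below are hypothetical illustrations, not values from any of the eight CMIP5 models.

      # Python sketch: cluster-season return period from two exceedance curves
      import numpy as np

      # Curve 1: exceedance frequency (events per year) of individual-storm SSI levels.
      storm_ssi  = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 50.0])
      storm_freq = np.array([2.0, 1.0, 0.3, 0.10, 0.02, 0.005])

      # Curve 2: exceedance frequency (per year) of the seasonally aggregated SSI.
      season_ssi  = np.array([2.0, 5.0, 10.0, 20.0, 40.0, 80.0])
      season_freq = np.array([1.0, 0.5, 0.20, 0.05, 0.01, 0.002])

      def ssi_at_return_period(rp_years):
          """Invert curve 1: SSI level exceeded once every rp_years by a single storm."""
          freq = 1.0 / rp_years
          # Interpolate in log-frequency space; np.interp needs ascending x values.
          return np.interp(np.log(freq), np.log(storm_freq[::-1]), storm_ssi[::-1])

      def cluster_return_period(individual_rps):
          """Return period of a season whose storms have the given return periods."""
          aggregated = sum(ssi_at_return_period(rp) for rp in individual_rps)
          return 1.0 / np.interp(aggregated, season_ssi, season_freq)

      # e.g. a 1999-like cluster of three storms with assumed 10-, 20- and 5-year RPs
      print(f"{cluster_return_period([10.0, 20.0, 5.0]):.0f}-year cluster season")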

  6. Use of the historic range of variability to evaluate ecosystem sustainability [Chapter 24]

    Treesearch

    Carolyn B. Meyer; Dennis H. Knight; Greg K. Dillon

    2010-01-01

    Ecosystems are not static, having evolved with disturbances such as fire, windstorms, floods, disease, and animal activity. The natural variability imposed by such disturbances must be included when defining sustainability goals. One approach is to target the historic range of variability (HRV), determining if current management maintains the ecosystem within its HRV....

  7. Above- and below-ground characteristics associated with wind toppling in a young Populus plantation.

    Treesearch

    Constance A. Harrington; Dean S. DeBell

    1996-01-01

    Damage from a dormant-season windstorm in a 3-year-old Populus research trial differed among four clones and three spacings and between monoclonal and polyclonal plots. Clonal differences in susceptibility to toppling (or leaning) were associated with both above- and below-ground characteristics. Susceptible clones had less taper in the lower stem...

  8. [Pohoda no. II (delayed death--following music festival)].

    PubMed

    Stuller, F; Novomeský, F; Straka, L; Krajcovic, J

    2011-07-01

    A mass tragedy at Slovakia's biggest music festival, "POHODA", caused by a windstorm, shocked the whole society, even abroad. Many questions concerning the causality and circumstances of the incident arose immediately. The forensic autopsies of the victims (a 29-year-old man and a 19-year-old woman) represented a very special expert act in the police investigation of the case.

  9. Survival, Growth, and Ecosystem Dynamics of Displaced Bromeliads in a Montane Tropical Forest.

    Treesearch

    Jennifer Pett-Ridge; Whendee L. Silver

    2002-01-01

    Epiphytes generally occupy arboreal perches, which are inherently unstable environments due to periodic windstorms, branch falls, and treefalls. During high wind events, arboreal bromeliads are often knocked from the canopy and deposited on the forest floor. In this study, we used a common epiphytic tank bromeliad, Guzmania berteroniana (R. & S.) Mez, to determine...

  10. Facts About Derechos - Very Damaging Windstorms

    Science.gov Websites

    ...or bowed shape. The bow-shaped storms are called bow echoes. Bow echoes typically arise when thunderstorms... (typically from 40 miles to 250 miles in length) that may at times take the shape of a single bow... yield vastly different outcomes --- that is, a derecho or no derecho --- depending upon how the...

  11. Chapter 2: Climate, disturbance, and vulnerability to vegetation change in the Northwest Forest Plan Area

    Treesearch

    Matthew J. Reilly; Thomas A.  Spies; Ramona Butz Littell; John B. . Kim

    2018-01-01

    Climate change is expected to alter the composition, structure, and function of forested ecosystems in the United States (Vose et al. 2012). Increases in atmospheric concentrations of greenhouse gases (e.g., carbon dioxide [CO2]) and temperature, as well as altered precipitation and disturbance regimes (e.g., fire, insects, pathogens, and windstorms), are expected to...

  12. Case study of the 9 May 2003 windstorm in southwestern Slovakia

    NASA Astrophysics Data System (ADS)

    Kaňák, Ján; Benko, Martin; Simon, André; Sokol, Alois

    2007-02-01

    The thunderstorm of 9 May 2003 in southwestern Slovakia is considered one of the most severe convective events to have occurred in Slovakia during the past ten years. The majority of the reported damage was caused by very strong outflow winds and hail. The downburst (macroburst) nature of the event was confirmed by a damage survey carried out in the area hit by the thunderstorm. The supercell nature of the storm was inferred from radar measurements, with the fields of radar reflectivity and radial Doppler velocity showing typical supercell features (e.g. a bounded weak echo region, BWER). Satellite imagery (from METEOSAT 7) indicated a large-scale dry-air intrusion as a possible factor in downdraft enhancement. Aspects of the storm environment were inferred from soundings, numerical analysis from the ALADIN model and Velocity Azimuth Display data from radar. The results enable comparison of the outputs of several instability and shear indices, such as CAPE, DCAPE and Storm-Relative Environmental Helicity (SREH). Based on its structure and development, it was concluded that the storm showed many similarities to the so-called high-precipitation (HP) supercell type.
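
    For readers unfamiliar with SREH, the quantity is a vertical integral of streamwise vorticity in the storm-relative frame. The sketch below uses an idealised veering wind profile and an assumed storm motion, not the 9 May 2003 sounding.

      # Python sketch: 0-3 km storm-relative environmental helicity (SREH)
      import numpy as np

      def sreh(z, u, v, storm_u, storm_v, depth=3000.0):
          """SREH (m2 s-2) from heights z (m AGL), winds u, v (m s-1) and an
          assumed storm motion, using the standard discrete layer form."""
          mask = z <= depth
          ur = u[mask] - storm_u          # storm-relative wind components
          vr = v[mask] - storm_v
          # Sum over layers of (u_top * v_bottom - u_bottom * v_top)
          return float(np.sum(ur[1:] * vr[:-1] - ur[:-1] * vr[1:]))

      # Idealised profile: southerly near the surface veering to westerly at 3 km
      z = np.array([0., 500., 1000., 1500., 2000., 2500., 3000.])
      wdir = np.linspace(180., 270., z.size)     # direction the wind blows FROM (deg)
      wspd = np.linspace(8., 22., z.size)        # wind speed (m s-1)
      u = -wspd * np.sin(np.radians(wdir))
      v = -wspd * np.cos(np.radians(wdir))
      print(f"0-3 km SREH ~ {sreh(z, u, v, storm_u=12.0, storm_v=4.0):.0f} m2 s-2")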

  13. Fire chronology and windstorm effects on persistence of a disjunct oak-shortleaf pine community

    Treesearch

    Michael D. Jones; Marlin L. Bowles

    2012-01-01

    We investigated effects of a human-altered fire regime and wind storms on persistence of disjunct oak-shortleaf pine vegetation occurring along 5.5 km of xeric habitat on the east bluffs of the Mississippi River in Union County, IL. In 2009, we resampled vegetation transects established in seven stands in 1954 and obtained 26 cross sections containing fire scars from...

  14. Effects of salvage logging and pile-and-burn on fuel loading, potential fire behaviour, fuel consumption and emissions

    Treesearch

    Morris C. Johnson; Jessica E. Halofsky; David L. Peterson

    2013-01-01

    We used a combination of field measurements and simulation modelling to quantify the effects of salvage logging, and a combination of salvage logging and pile-and-burn surface fuel treatment (treatment combination), on fuel loadings, fire behaviour, fuel consumption and pollutant emissions at three points in time: post-windstorm (before salvage logging), post-...

  15. Internal Tidal Hydrodynamics and Ambient Characteristics of the Adriatic (ITHACA)

    DTIC Science & Technology

    2006-12-31

    document atmospheric conditions along the line extending from the coast to the open sea. 2.1. Measuring sites Position of meteorological measuring sites...idealized case, they are perpendicular to the straight shoreline. However, along the eastern Adriatic irregularly shaped coast there is a number of...Cvitan, 2003; Belusic et al., 2004; Belusic and Klaic, 2004, 2006), southwestward, downslope windstorm , which frequently blows over the Adriatic

  16. Large woody debris in a second-growth central Appalachian hardwood stand: volume, composition, and dynamics

    Treesearch

    M. B. Adams; T. M. Schuler; W. M. Ford; J. N. Kochenderfer

    2003-01-01

    We estimated the volume of large woody debris in a second-growth stand and evaluated the importance of periodic windstorms as disturbances in creating large woody debris. This research was conducted on a reference watershed (Watershed 4) on the Fernow Experimental Forest in West Virginia. The 38-ha stand on Watershed 4 was clearcut around 1911 and has been undisturbed...

  17. The influence of an extensive dust event on snow chemistry in the southern Rocky Mountains

    Treesearch

    Charles Rhoades; Kelly Elder; E. Greene

    2010-01-01

    In mid-February 2006, windstorms in Arizona, Utah, and western Colorado generated a dust cloud that distributed a layer of dust across the surface of the snowpack throughout much of the Colorado Rockies; it remained visible throughout the winter. We compared the chemical composition of snowfall and snowpack collected during and after the dust deposition event with pre-...

  18. After the blowdown: a resource assessment of the Boundary Waters Canoe Area Wilderness, 1999-2003

    Treesearch

    W. Keith Moser; Mark H. Hansen; Mark D. Nelson; Susan J. Crocker; Charles H. Perry; Bethany Schulz; Christopher W. Woodall

    2007-01-01

    The Boundary Waters Canoe Area Wilderness (BWCAW) was struck by a major windstorm on July 4, 1999. Estimated volume in blowdown areas was up to 29 percent less than in non-blowdown areas. Mean down woody fuel loadings were twice as high in blowdown areas as in non-blowdown areas. Overstory species diversity declined in blowdown areas, but understory diversity,...

  19. Modelling the economic losses of historic and present-day high-impact winter storms in Switzerland

    NASA Astrophysics Data System (ADS)

    Welker, Christoph; Stucki, Peter; Bresch, David; Dierer, Silke; Martius, Olivia; Brönnimann, Stefan

    2014-05-01

    Severe winter storms such as "Vivian" in February 1990 and "Lothar" in December 1999 are among the most destructive meteorological hazards in Switzerland. Disaster severity resulting from such windstorms is attributable, on the one hand, to hazardous weather conditions such as high wind gust speeds, and, on the other hand, to socio-economic factors such as population density, distribution of values at risk, and damage susceptibility. For present-day winter storms, the available data are generally sufficient to describe the meteorological development and wind forces as well as the associated socio-economic impacts. In contrast, the information on historic windstorms is sparse overall, and the available historic weather and loss reports mostly do not provide quantitative information. This study illustrates a promising technique to simulate the economic impacts of both historic and present winter storms in Switzerland since the end of the 19th century. Our approach makes use of the novel Twentieth Century Reanalysis (20CR) spanning 1871-present. The 2-degree spatial resolution of the global 20CR dataset is relatively coarse. Thus, the complex orography of Switzerland is not realistically represented, which has considerable ramifications for the representation of wind systems that are strongly influenced by the local orography, such as Föhn winds. Therefore, a dynamical downscaling of the 20CR to 3 km resolution using the Weather Research and Forecasting (WRF) model was performed for a total of 40 high-impact winter storms in Switzerland since 1871. Based on the downscaled wind gust speeds and the climada loss model, the estimated economic losses were calculated at municipality level for current economic and social conditions. With this approach, we can answer the question of what the economic losses of, for example, a hazardous Föhn storm that occurred in northern Switzerland in February 1925 would be today, i.e. under current socio-economic conditions. Encouragingly, the pattern of simulated losses for this specific storm is very similar to historic loss reports. A comparison of wind gust speeds with simulated storm losses for all of the highly damaging winter storms in Switzerland since the late 19th century considered in this study shows that storm losses have been related primarily to population density (and, correspondingly, the distribution of values at risk) rather than to hazardous wind speeds.
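
    The footprint-plus-vulnerability logic of such a loss calculation can be sketched as follows. The sigmoid-like damage function, its thresholds and the exposure values are assumptions for illustration only; they are not the climada calibration used in the study.

      # Python sketch: gust footprint x damage function x exposure = loss
      import numpy as np

      def damage_ratio(gust, threshold=35.0, half_damage=55.0):
          """Fraction of exposed value lost as a function of peak gust (m s-1);
          zero below `threshold`, 0.5 at `half_damage`, saturating towards 1."""
          excess = np.maximum(gust - threshold, 0.0)
          return excess ** 3 / (excess ** 3 + (half_damage - threshold) ** 3)

      # Per-municipality peak gusts (m s-1) and exposed values (million CHF), made up.
      gusts    = np.array([28.0, 42.0, 51.0, 63.0])
      exposure = np.array([500.0, 1200.0, 300.0, 80.0])

      losses = damage_ratio(gusts) * exposure
      print(losses.round(1), "-> total", round(float(losses.sum()), 1), "million CHF")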

  20. Wave Breaking Induced Surface Wakes and Jets Observed during a Bora Event

    DTIC Science & Technology

    2005-01-01

    terrain contours (interval = 200 m) superposed. The approximate NCAR Electra and NOAA P-3 flight tracks are indicated by bold and dotted straight lines ...Hz data. The red curves correspond to the COAMPS simulated fields obtained by interpolating the 1-km grid data to the straight line through the...Alpine Experiment (ALPEX) in 1982 [Smith, 1987]. These studies suggested that the bora flow shares some common characteristics with downslope windstorms

  1. Gravity Wave Breaking over the Central Alps: Role of Complex Terrain.

    NASA Astrophysics Data System (ADS)

    Jiang, Qingfang; Doyle, James D.

    2004-09-01

    The characteristics of gravity waves excited by the complex terrain of the central Alps during the intensive observational period (IOP) 8 of the Mesoscale Alpine Programme (MAP) are studied through the analysis of aircraft in situ measurements, GPS dropsondes, radiosondes, airborne lidar data, and numerical simulations. Mountain wave breaking occurred over the central Alps on 21 October 1999, associated with wind shear, wind turning, and a critical level with Richardson number less than unity just above the flight level (5.7 km) of the research aircraft NCAR Electra. The Electra flew two repeated traverses across the Ötztaler Alpen, during which localized turbulence was sampled. The observed maximum vertical motion was 9 m s-1, corresponding to a turbulent kinetic energy (TKE) maximum of 10.5 m2 s-2. Spectrum analysis indicates an inertial subrange up to 5-km wavelength and multiple energy-containing spikes corresponding to a wide range of wavelengths. Manual analysis of GPS dropsonde data indicates the presence of strong flow descent and a downslope windstorm over the lee slope of the Ötztaler Alpen. Farther downstream, a transition occurs across a deep hydraulic jump associated with the ascent of isentropes and local wind reversal. During the first traverse, the turbulent region is convectively unstable as indicated by a positive sensible heat flux within the turbulent portion of the segment. The TKE derived from the flight-level data indicates multiple narrow spikes, which match the patterns shown in the diagnosed buoyancy production rate of TKE. The turbulence is nonisotropic with the major TKE contribution from the -wind component. The convectively unstable zone is advected downstream during the second traverse and the turbulence becomes much stronger and more isotropic. The downslope windstorm, flow descent, and transition to turbulence through a hydraulic jump are captured by a real-data Coupled Ocean/Atmosphere Mesoscale Prediction System (COAMPS) simulation. Several idealized simulations are performed, motivated by the observations of multiscale waves forced by the complex terrain underneath. The simulations indicate that multiscale terrain promotes wave breaking, increases mountain drag, and enhances the downslope winds and TKE generation.
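
    As a side note, the flight-leg TKE quoted above is simply half the sum of the variances of the wind fluctuations over the averaging segment; the sketch below uses synthetic high-rate data, not the Electra measurements.

      # Python sketch: turbulent kinetic energy from flight-level wind components
      import numpy as np

      rng = np.random.default_rng(1)
      n = 25 * 120                              # two minutes of 25 Hz samples (synthetic)
      u = 15.0 + 2.5 * rng.standard_normal(n)   # wind components (m s-1) with noise
      v = -3.0 + 1.8 * rng.standard_normal(n)
      w = 0.0 + 2.0 * rng.standard_normal(n)

      def tke(u, v, w):
          """TKE = 0.5 * (var(u') + var(v') + var(w')) over the segment."""
          return 0.5 * sum(np.var(x) for x in (u, v, w))

      print(f"segment TKE ~ {tke(u, v, w):.1f} m2 s-2")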


  2. Beyond Line of Sight (BLOS) Command and Control (C2) Capability to Improve Disaster Response and Recovery

    DTIC Science & Technology

    2013-09-01

    on the possible threat of an electromagnetic pulse (EMP) and its potential consequences following the destructive “derecho” that hit Washington, DC...in 2012. Spanish for the word “straight,” a derecho is a term used to describe a widespread, long-lived, straight-line windstorm that is...emergency communications system and raised concern for future response. Both Hurricane Katrina and the Washington, DC, area derecho have subsequently

  3. Eastern Mediterranean Sea Spatial and Temporal Variability of Thermohaline Structure and Circulation Identified from Observational (T, S) Profiles

    DTIC Science & Technology

    2015-12-01

    effect of Etesian winds between the late May and early October. Although they are generally dry, cool and moderate; they may turn into a windstorm...very significant to provide the realization of ocean modeling and prediction. The Optimal Spectral Decomposition (OSD) method is an effective ...represents the potential density, by differentiating this equation with respect to z and multiplying with the coriolis parameter f, conservation of

  4. Quantifying the Stable Boundary Layer Structure and Evolution during T-REX 2006

    DTIC Science & Technology

    2014-09-30

    integrating surface observations, data from in-situ measurements, and a nested numerical model with two related topics was conducted in this project. the WRF ...as well as quantify differences at a fine scale model output using the different turbulent mixing/diffusion options in the WRF -ARW model; and (2... WRF model planetary boundary layer schemes were also conducted to study a downslope windstorm and rotors in Las Vegas valley. Two events (March 20

  5. Potential Seasonal Predictability for Winter Storms over Europe

    NASA Astrophysics Data System (ADS)

    Wild, Simon; Befort, Daniel J.; Leckebusch, Gregor C.

    2017-04-01

    Reliable seasonal forecasts of strong extra-tropical cyclones and windstorms would have great social and economic benefits, as these events are the most costly natural hazards over Europe. In a previous study we have shown good agreement of the spatial climatological distributions of extra-tropical cyclones and windstorms in state-of-the-art multi-member seasonal prediction systems with reanalysis. We also found significant seasonal prediction skill for extra-tropical cyclones and windstorms affecting numerous European countries. We continue this research by investigating the mechanisms and precursor conditions (primarily over the North Atlantic) on a seasonal time scale leading to enhanced extra-tropical cyclone activity and winter storm frequency over Europe. Our results regarding mechanisms show that an increased surface temperature gradient at the western edge of the North Atlantic can be related to enhanced winter storm frequency farther downstream, causing, for example, a greater number of storms over the British Isles, as observed in winter 2013-14. The so-called "Horseshoe Index", an SST tripole anomaly pattern over the North Atlantic in the summer months, can also cause a higher number of winter storms over Europe in the subsequent winter. We will show results of AMIP-type sensitivity experiments using an AGCM (ECHAM5), supporting this hypothesis. Finally, we will analyse whether existing seasonal forecast systems are able to capture these identified mechanisms and precursor conditions affecting the models' seasonal prediction skill.

  6. Can an earthquake prediction and warning system be developed?

    USGS Publications Warehouse

    N.N, Ambraseys

    1990-01-01

    Over the last 20 years, natural disasters have killed nearly 3 million people and disrupted the lives of over 800 million others. In 2 years there were more than 50 serious natural disasters, including landslides in Italy, France, and Colombia; a typhoon in Korea; wildfires in China and the United States; a windstorm in England; grasshopper plagues in Africa's horn and the Sahel; tornadoes in Canada; devastating earthquakes in Soviet Armenia and Tadzhikistan; infestations in Africa; landslides in Brazil; and tornadoes in the United States.

  7. Comparison of Two Windstorm Events During the Sierra Rotors Project and Terrain-Induced Rotor Experiment

    DTIC Science & Technology

    2010-06-01

    photograph of Owens Valley during the event. There is an isolated cloud with a leading edge over the center of the valley, consistent with a lenticular ...(bottom) Lenticular cloud over Owens Valley at 2100 UTC looking SE (photo by Alex Reinecke). ...vations and there is no separation of the surface westerlies...southwestern US. On the other hand, satellite also shows clouds organized in bands parallel to the mountains and lenticular clouds were present over Owens

  8. TRISTEN/FRAM II Cruise Report, East Arctic, April 1980.

    DTIC Science & Technology

    1981-04-13

    is not readily accessible by air from Alaska. The Eurasia Basin contains the Arctic Midoceanic Ridge, which extends in a straight line for 2000 km... Bottom Refraction - Shot Lines Overlain on FRAM II Positions... Waterfall Display of Successive Spectral Estimates of Single... Northeast leg of the array was oriented 341°T and the NW leg 304°T. After a windstorm and floe break-up on 16 April, hydrophones 11 and 12 and 21-24 were

  9. Forecasting challenges during the severe weather outbreak in Central Europe on 25 June 2008

    NASA Astrophysics Data System (ADS)

    Púčik, Tomáš; Francová, Martina; Rýva, David; Kolář, Miroslav; Ronge, Lukáš

    2011-06-01

    On 25 June 2008, severe thunderstorms caused widespread damage and two fatalities in the Czech Republic. Significant features of the storms included numerous downbursts on a squall line that exhibited a bow echo reflectivity pattern, with sustained wind gusts over 32 m/s at several reporting stations. Moreover, a tornado and several downbursts of F2 intensity occurred within the convective system, collocated with the development of mesovortices within the larger scale bow echo. The extent of the event was sufficient to call it a derecho, as the windstorm had affected Eastern Germany, Southern Poland, Slovakia, Austria and Northern Hungary as well. Ahead of the squall line, several well-organized isolated cells occurred, exhibiting supercellular characteristics, both from a radar and visual perspective. These storms produced large hail and also isolated severe wind gusts. This paper deals mostly with the forecasting challenges that were experienced by the meteorologist on duty during the evolution of this convective scenario. The main challenge of the day was to identify the region that would be most affected by severe convection, especially as the numerical weather prediction failed to anticipate the extent and the progress of the derecho-producing mesoscale convective systems (MCSs). Convective storms developed in an environment conducive to severe thunderstorms, with strong wind shear confined mostly to the lower half of the troposphere. These developments also were strongly influenced by mesoscale factors, especially a mesolow centered over Austria and its trough stretching to Eastern Bohemia. The paper demonstrates how careful mesoscale analysis could prove useful in dealing with such convective situations. Remote-sensing methods are also shown to be useful in such situations, especially when they can offer sufficient lead time to issue a warning, which is not always the case.

  10. A forensic re-analysis of one of the deadliest European tornadoes

    NASA Astrophysics Data System (ADS)

    Holzer, Alois M.; Schreiner, Thomas M. E.; Púčik, Tomáš

    2018-06-01

    Extremely rare events with high potential impact, such as violent tornadoes, are of strong interest for climatology and risk assessment. In order to obtain more knowledge about the most extreme events, it is vital to study historical cases. The purpose of this paper is twofold: (1) to demonstrate how a windstorm catastrophe that happened 100 years ago, such as the Wiener Neustadt, Lower Austria, tornado on 10 July 1916, can be successfully re-analyzed using a forensic approach, and (2) to propose a repeatable working method for assessing damage and reconstructing the path and magnitude of local windstorm and tornado cases with sufficient historical sources. Based on the results of the forensic re-analyses, a chronology of the tornado impact is presented, followed by a description of the key tornado characteristics: a maximum intensity rating of F4, a damage path length of 20 km and a maximum visible tornado diameter of 1 km. Compared to a historical scientific study published soon after the event, additional new findings are presented, namely the existence of two predecessor tornadoes and a higher number of fatalities: at least 34 instead of 32. While the storm-scale meteorology could not be reconstructed, rich damage data sources for the urban area of Wiener Neustadt facilitated a detailed analysis of damage tracks and wind intensities within the tornado. The authors postulate the requirement for an International Fujita Scale to rate tornadoes globally in a consistent way, based on comparable damage indicators.

  11. The observed clustering of damaging extratropical cyclones in Europe

    NASA Astrophysics Data System (ADS)

    Cusack, Stephen

    2016-04-01

    The clustering of severe European windstorms on annual timescales has substantial impacts on the (re-)insurance industry. Our knowledge of the risk is limited by large uncertainties in estimates of clustering from typical historical storm data sets covering the past few decades. Eight storm data sets are gathered for analysis in this study in order to reduce these uncertainties. Six of the data sets contain more than 100 years of severe storm information to reduce sampling errors, and observational errors are reduced by the diversity of information sources and analysis methods between storm data sets. All storm severity measures used in this study reflect damage, to suit (re-)insurance applications. The shortest storm data set of 42 years provides indications of stronger clustering with severity, particularly for regions off the main storm track in central Europe and France. However, clustering estimates have very large sampling and observational errors, exemplified by large changes in estimates in central Europe upon removal of one stormy season, 1989/1990. The extended storm records place 1989/1990 into a much longer historical context to produce more robust estimates of clustering. All the extended storm data sets show increased clustering between more severe storms from return periods (RPs) of 0.5 years to the longest measured RPs of about 20 years. Further, they contain signs of stronger clustering off the main storm track, and weaker clustering for smaller-sized areas, though these signals are more uncertain as they are drawn from smaller data samples. These new ultra-long storm data sets provide new information on clustering to improve our management of this risk.

  12. The effect of the United States Great Lakes on the maintenance of derecho-producing mesoscale convective systems.

    NASA Astrophysics Data System (ADS)

    Bentley, M.; Sparks, J.; Graham, R.

    2003-04-01

    The primary aim of this research is to investigate the influence of the United States Great Lakes on the intensity of mesoscale convective systems (MCSs). One of the greatest nowcast challenges during the warm season is anticipating the impact of the Great Lakes on severe convection, particularly MCSs capable of producing damaging widespread windstorms known as derechos. Since a major derecho activity corridor lies over the Great Lakes region, it is important to understand the effects of the Lakes on the intensity and propagation of severe wind producing MCSs. Specific objectives of the research include: 1) The development of a short-term climatology of MCS events that have impacted the Great Lakes region over the past seven years; 2) An analysis of radar, satellite, surface (including buoy and lighthouse observations), and lake surface temperature data to determine the environmental conditions impacting the evolution of MCSs passing over a Great Lake; 3) An examination of MCS initiation times and seasonal frequencies of occurrence to delineate temporal consistencies in MCS evolution due to changing lake surface temperatures; and 4) The development of conceptual and forecast models to help anticipate MCS intensity and morphology as these systems interact with the Great Lakes environment.

  13. Natural Hazards, Second Edition

    NASA Astrophysics Data System (ADS)

    Rouhban, Badaoui

    Natural disaster loss is on the rise, and the vulnerability of the human and physical environment to the violent forces of nature is increasing. In many parts of the world, disasters caused by natural hazards such as earthquakes, floods, landslides, drought, wildfires, intense windstorms, tsunami, and volcanic eruptions have caused the loss of human lives, injury, homelessness, and the destruction of economic and social infrastructure. Over the last few years, there has been an increase in the occurrence, severity, and intensity of disasters, culminating with the devastating tsunami of 26 December 2004 in South East Asia.Natural hazards are often unexpected or uncontrollable natural events of varying magnitude. Understanding their mechanisms and assessing their distribution in time and space are necessary for refining risk mitigation measures. This second edition of Natural Hazards, (following a first edition published in 1991 by Cambridge University Press), written by Edward Bryant, associate dean of science at Wollongong University, Australia, grapples with this crucial issue, aspects of hazard prediction, and other issues. The book presents a comprehensive analysis of different categories of hazards of climatic and geological origin.

  14. The need for the International Decade of Natural Hazard Reduction

    USGS Publications Warehouse

    Press, F.

    1990-01-01

    Over the last 20 years, natural disasters have killed nearly 3 million people and disrupted the lives of over 800 million others. In 2 years there were more than 50 serious natural disasters, including landslides in Italy, France, and Colombia; a typhoon in Korea; wildfires in China and the United States; a windstorm in England; grasshopper plagues in Africa's horn and the Sahel; tornadoes in Canada; devastating earthquakes in Soviet Armenia and Tadzhikistan; infestations in Africa; landslides in Brazil; and tornadoes in the United States.

  15. The observed clustering of damaging extra-tropical cyclones in Europe

    NASA Astrophysics Data System (ADS)

    Cusack, S.

    2015-12-01

    The clustering of severe European windstorms on annual timescales has substantial impacts on the re/insurance industry. Management of the risk is impaired by large uncertainties in estimates of clustering from historical storm datasets typically covering the past few decades. The uncertainties are unusually large because clustering depends on the variance of storm counts. Eight storm datasets are gathered for analysis in this study in order to reduce these uncertainties. Six of the datasets contain more than 100 years of severe storm information to reduce sampling errors, and the diversity of information sources and analysis methods between datasets samples the observational errors. All storm severity measures used in this study reflect damage, to suit re/insurance applications. It is found that the shortest storm dataset, of 42 years in length, provides estimates of clustering with very large sampling and observational errors. The dataset does provide some useful information: indications of stronger clustering for more severe storms, particularly for southern countries off the main storm track. However, substantially different results are produced by removal of one stormy season, 1989/1990, which illustrates the large uncertainties from a 42-year dataset. The extended storm records place 1989/1990 into a much longer historical context to produce more robust estimates of clustering. All the extended storm datasets show a greater degree of clustering with increasing storm severity and suggest clustering of severe storms is much more material than that of weaker storms. Further, they contain signs of stronger clustering in areas off the main storm track, and weaker clustering for smaller-sized areas, though these signals are smaller than uncertainties in actual values. Both the improvement of existing storm records and the development of new historical storm datasets would help to improve management of this risk.
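
    Because clustering is diagnosed from the variance of seasonal storm counts, a common summary statistic is the index of dispersion. The sketch below uses synthetic counts, not any of the eight storm datasets.

      # Python sketch: index of dispersion (var/mean) of seasonal storm counts
      import numpy as np

      def dispersion(counts):
          """var/mean of seasonal counts: ~1 for a Poisson process, >1 if clustered."""
          counts = np.asarray(counts, dtype=float)
          return counts.var(ddof=1) / counts.mean()

      rng = np.random.default_rng(2)
      poisson_like = rng.poisson(lam=2.0, size=120)               # unclustered reference
      clustered = rng.negative_binomial(n=2, p=0.5, size=120)     # overdispersed counts

      for label, counts in [("Poisson-like", poisson_like), ("clustered", clustered)]:
          print(f"{label}: dispersion = {dispersion(counts):.2f}")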

  16. Natural hazard fatalities in Switzerland from 1946 to 2015

    NASA Astrophysics Data System (ADS)

    Andres, Norina; Badoux, Alexandre; Techel, Frank

    2017-04-01

    Switzerland, located in the middle of the Alps, is prone to several different natural hazards which regularly cause fatalities. To explore temporal trends as well as demographic and spatial patterns in the number of natural hazard fatalities, a database comprising all natural hazard events causing fatalities was compiled for the years 1946 to 2015. The new database includes avalanche, flood, lightning, windstorm, landslide, debris flow, rockfall, earthquake and ice avalanche processes. Two existing databases were incorporated and the resulting dataset extended by a comprehensive newspaper search. In total the database contains 635 natural hazard events causing 1023 fatalities. The database does not include victims who deliberately exposed themselves to significant danger (e.g. high-risk sports). The most common causes of death were snow avalanches (37 %), followed by lightning (16 %), floods (12 %), windstorms (10 %), rockfall (8 %), landslides (7 %) and other processes (9 %). Around 14.6 fatalities occurred on average each year. A distinct decrease of natural hazard fatalities could be shown over the last 70 years, which was mostly due to the decline in the number of avalanche and lightning fatalities. Thus, nearly three times as many people were killed by natural hazard processes from 1946 to 1980 as from 1981 to 2015. Normalisation of fatality data by population resulted in a clearly declining annual crude mortality rate: 3.9 deaths per million persons for the first 35 years and 1.1 deaths per million persons for the second 35 years of the study period. The average age of the victims was approximately 36 years and about 75 % were male. Most people were killed in summer (JJA, 42 %) and winter (DJF, 32 %). Furthermore, almost two-thirds of the fatalities took place in the afternoon and evening. The spatial distribution of the natural hazard fatalities over Switzerland was quite homogeneous. However, mountainous parts of the country (Prealps, Alps) were somewhat more prone to fatal events compared to the Swiss Plateau and the Jura. It appears that the overall natural hazard mortality rate in Switzerland over the past 70 years has been relatively low in comparison to rates in other countries or rates of other types of fatal accidents in Switzerland. Nevertheless, the collected data provide a valuable basis for analysis and help authorities to better identify higher-risk demographic groups and regions, and accordingly target these to reduce the number of victims.

  17. On the episodic nature of derecho-producing convective systems in the United States

    NASA Astrophysics Data System (ADS)

    Ashley, Walker S.; Mote, Thomas L.; Bentley, Mace L.

    2005-11-01

    Convectively generated windstorms occur over broad temporal and spatial scales; however, one of the larger-scale and most intense of these windstorms has been given the name derecho. This study illustrates the tendency for derecho-producing mesoscale convective systems to group together across the United States - forming a derecho series. A derecho series is recognized as any succession of derechos that develop within a similar synoptic environment with no more than 72 h separating individual events. A derecho dataset for the period 1994-2003 was assembled to investigate the groupings of these extremely damaging convective wind events. Results indicate that over 62% of the derechos in the dataset were members of a derecho series. On average, nearly six series affected the United States annually. Most derecho series consisted of two or three events, although 14 series during the period of record contained four or more events. Two separate series involved nine derechos within a period of nine days. Analyses reveal that derecho series largely frequent regions of the Midwest, Ohio Valley, and the south-central Great Plains during May, June, and July. Results suggest that once a derecho occurred during May, June, or July, there was a 58% chance that this event was the first of a series of two or more, and about a 46% chance that this was the first of a derecho series consisting of three or more events. The derecho series climatology reveals that forecasters in regions frequented by derechos should be prepared for the probable regeneration of a derecho-producing convective system after an initial event occurs.
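
    The 72-hour grouping rule is easy to make concrete; the event times below are hypothetical, and the additional condition of a similar synoptic environment is not represented in this sketch.

      # Python sketch: group derecho start times into series (<= 72 h between events)
      from datetime import datetime, timedelta

      def group_into_series(event_times, max_gap_hours=72):
          """Split chronologically ordered events into series whenever consecutive
          events are separated by more than `max_gap_hours`."""
          events = sorted(event_times)
          series, current = [], [events[0]]
          for t in events[1:]:
              if t - current[-1] <= timedelta(hours=max_gap_hours):
                  current.append(t)
              else:
                  series.append(current)
                  current = [t]
          series.append(current)
          return series

      # Hypothetical events: three within a week, then an isolated one.
      events = [datetime(2003, 7, 2, 18), datetime(2003, 7, 4, 6),
                datetime(2003, 7, 6, 21), datetime(2003, 7, 20, 12)]
      for s in group_into_series(events):
          print(len(s), "event(s):", [t.strftime("%Y-%m-%d %HZ") for t in s])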

  18. New Observatory at the University of Tennessee at Martin

    NASA Astrophysics Data System (ADS)

    Crews, Lionel J.; Chrysler, R.; Turner, K.

    2010-01-01

    A new observatory has been completed at the University of Tennessee at Martin and is now open for student research, local teacher training, and public outreach. The telescope is a 16" Meade RCT on a Software Bisque Paramount ME mount, 10' HomeDome, and SBIG CCD camera. The project endured many delays from a necessary change in housing from roll-top roof to dome, to the shutter blowing off in a heavy windstorm. This project was funded primarily by a Tennessee Math-Science Partnership grant (PI: Dr. Michael Gibson, UT Martin) directed at secondary teacher training in sciences.

  19. North Atlantic explosive cyclones and large scale atmospheric variability modes

    NASA Astrophysics Data System (ADS)

    Liberato, Margarida L. R.

    2015-04-01

    Extreme windstorms are one of the major natural catastrophes in the extratropics, one of the most costly natural hazards in Europe and are responsible for substantial economic damages and even fatalities. During the last decades Europe witnessed major damage from winter storms such as Lothar (December 1999), Kyrill (January 2007), Klaus (January 2009), Xynthia (February 2010), Gong (January 2013) and Stephanie (February 2014) which exhibited uncommon characteristics. In fact, most of these storms crossed the Atlantic in direction of Europe experiencing an explosive development at unusual lower latitudes along the edge of the dominant North Atlantic storm track and reaching Iberia with an uncommon intensity (Liberato et al., 2011; 2013; Liberato 2014). Results show that the explosive cyclogenesis process of most of these storms at such low latitudes is driven by: (i) the southerly displacement of a very strong polar jet stream; and (ii) the presence of an atmospheric river (AR), that is, by a (sub)tropical moisture export over the western and central (sub)tropical Atlantic which converges into the cyclogenesis region and then moves along with the storm towards Iberia. Previous studies have pointed to a link between the North Atlantic Oscillation (NAO) and intense European windstorms. On the other hand, the NAO exerts a decisive control on the average latitudinal location of the jet stream over the North Atlantic basin (Woollings et al. 2010). In this work the link between North Atlantic explosive cyclogenesis, atmospheric rivers and large scale atmospheric variability modes is reviewed and discussed. Liberato MLR (2014) The 19 January 2013 windstorm over the north Atlantic: Large-scale dynamics and impacts on Iberia. Weather and Climate Extremes, 5-6, 16-28. doi: 10.1016/j.wace.2014.06.002 Liberato MRL, Pinto JG, Trigo IF, Trigo RM. (2011) Klaus - an exceptional winter storm over Northern Iberia and Southern France. Weather 66:330-334. doi:10.1002/wea.755 Liberato MLR, Pinto JG, Trigo RM, Ludwig P, Ordóñez P, Yuen D, Trigo IF (2013) Explosive development of winter storm Xynthia over the subtropical North Atlantic Ocean. Nat Hazards Earth Syst Sci 13:2239-2251. doi:10.5194/nhess-13-2239-2013 Woollings T, Hannachi A, Hoskins B (2010) Variability of the North Atlantic eddy-driven jet stream. Quart. J. Roy. Meteor. Soc., 136, 856-868, doi:10.1002/qj.625 Acknowledgements: This work was partially supported by FEDER (Fundo Europeu de Desenvolvimento Regional) funds through the COMPETE (Programa Operacional Factores de Competitividade) and by national funds through FCT (Fundação para a Ciência e a Tecnologia, Portugal) under project STORMEx FCOMP-01-0124-FEDER- 019524 (PTDC/AAC-CLI/121339/2010).

  20. An observational analysis of a derecho in South China

    NASA Astrophysics Data System (ADS)

    Xia, Rudi; Wang, Donghai; Sun, Jianhua; Wang, Gaili; Xia, Guancong

    2012-12-01

    Derechos occur frequently in Europe and the United States, but reports of derechos in China are scarce. In this paper, radar, satellite, and surface observation data are used to analyze a derecho event in South China on 17 April 2011. A derecho-producing mesoscale convective system formed in an environment with moderate convective available potential energy, strong vertical wind shear, and a dry layer in the middle troposphere, and progressed southward in tandem with a front and a surface wind convergence line. The windstorm can be divided into two stages according to differences in the characteristics of the radar echo and the causes of the gale. One stage was a supercell stage, in which the sinking rear inflow of a high-precipitation supercell with a bow-shaped radar echo induced a gale of Fujita F0 intensity. The other stage was a non-supercell stage (the echo was sequentially kidney-shaped, foot-shaped, and an ordinary single cell), in which downbursts induced a gale of Fujita F1 intensity. This derecho event had many similarities with derechos observed in western countries. For example, the windstorm was perpendicular to the mean flow, the gale was located in the bulging portion of the bow echo, and the derecho moved southward along with the surface front. Some differences were observed as well. The synoptic-scale forcing was weak in the absence of an advancing high-amplitude midlevel trough and an accompanying strong surface cyclone; however, the vertical wind shear was very strong, a characteristic typical of derechos associated with strong synoptic-scale forcing. Extremely high values of convective available potential energy and downdraft convective available potential energy have previously been considered necessary for the formation of weak-forcing archetype and hybrid derechos; however, these values were much less than 2000 J kg-1 during this derecho event.

  1. An overview of natural hazard impacts to railways and urban transportation systems

    NASA Astrophysics Data System (ADS)

    Bíl, Michal; Nezval, Vojtěch; Bílová, Martina; Andrášik, Richard; Kubeček, Jan

    2017-04-01

    We present an overview and two case studies of natural hazard impacts on rail transportation systems in the Czech Republic. Flooding, landsliding, heavy snowfall, windstorms and glaze (black ice) are the most common natural processes which occur in this region. Whereas flooding and landsliding usually cause direct damage to the transportation infrastructure, other hazards predominantly cause indirect losses. Railway and urban tramline networks are almost fully dependent on electricity, which is provided by a system of overhead lines (electric lines above the tracks). These lines are extremely susceptible to the formation of glaze, which blocks the conduction of electric current. A December 2014 glaze event caused significant indirect losses in the largest Czech cities and on the railways due to the above-mentioned process. Details of this event will be provided during the presentation. Windstorms usually cause tree falls, which can affect overhead lines and physically block railway tracks. Approximately 30 % of the Czech railway network lies within 50 m of the nearest forest. This presents significant potential for transport interruption due to falling trees. Complicated legal relations among the owners of the plots of land along railways, the environment (full-grown trees and related habitat), and the railway administrator are behind many traffic interruptions due to falling trees. We have registered 2040 tree falls between 2012 and 2015 on the railway network. A model of the fallen-tree hazard was created for the entire Czech railway network. Both above-mentioned case studies provide illustrative examples of the increased fragility of modern transportation systems that rely fully on electricity. Natural processes with a low destructive power are thereby able to cause network-wide service cut-offs.
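
    The "share of track within 50 m of forest" figure quoted above is essentially a buffer-and-intersect calculation; the toy geometries below are made up and stand in for the real railway and forest layers.

      # Python sketch: fraction of a railway line lying within 50 m of forest
      from shapely.geometry import LineString, Polygon

      # Hypothetical railway line and forest patch (planar coordinates in metres).
      track = LineString([(0, 0), (1000, 0), (2000, 300)])
      forest = Polygon([(400, 20), (900, 20), (900, 400), (400, 400)])

      near_forest = track.intersection(forest.buffer(50.0))  # segments within 50 m
      share = near_forest.length / track.length
      print(f"{share:.0%} of this track is closer than 50 m to forest")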

  2. Cloud based, Open Source Software Application for Mitigating Herbicide Drift

    NASA Astrophysics Data System (ADS)

    Saraswat, D.; Scott, B.

    2014-12-01

    The spread of herbicide-resistant weeds has resulted in the need for clearly marked fields. In response to this need, the University of Arkansas Cooperative Extension Service launched a program named Flag the Technology in 2011. This program uses color-coded flags as a visual alert of the herbicide trait technology within a farm field. The flag-based program also serves to help avoid herbicide misapplication and prevent herbicide drift damage between fields with differing crop technologies. The program has been endorsed by the Southern Weed Science Society of America and is attracting interest from across the USA, Canada, and Australia. However, flags risk being misplaced due to mischief or lost in severe windstorms and thunderstorms. This presentation will discuss the design and development of a cloud-based, free application built on open-source technologies, called Flag the Technology Cloud (FTTCloud), that allows agricultural stakeholders to color-code their farm fields to indicate herbicide-resistant technologies. The developed software utilizes modern web development practices, widely used design technologies, and basic geographic information system (GIS) based interactive interfaces for representing, color-coding, searching, and visualizing fields. The program has also been made compatible with devices of different sizes - smartphones, tablets, desktops and laptops - for wider usability.
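
    The core data model of such a field-flagging application can be as simple as GeoJSON polygons carrying a technology attribute that the map client translates into a flag colour. The sketch below is a minimal illustration; the technology labels and colour mapping are hypothetical and are not taken from the Flag the Technology programme or from FTTCloud.

        import json

        # Hypothetical mapping from herbicide-trait technology to a display colour.
        FLAG_COLOURS = {"technology_a": "#ff0000", "technology_b": "#00a0a0"}

        def field_feature(ring, technology):
            """Return a GeoJSON Feature for one farm field, coloured by technology."""
            return {
                "type": "Feature",
                "geometry": {"type": "Polygon", "coordinates": [ring]},
                "properties": {
                    "technology": technology,
                    "color": FLAG_COLOURS.get(technology, "#808080"),
                },
            }

        ring = [[-92.10, 34.70], [-92.10, 34.80], [-92.00, 34.80], [-92.00, 34.70], [-92.10, 34.70]]
        print(json.dumps(field_feature(ring, "technology_a"), indent=2))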

  3. Exceptional winter storms affecting Western Iberia and extremes: diagnosis, modelling and multi-model ensemble projection

    NASA Astrophysics Data System (ADS)

    Liberato, M. L. R.; Pinto, J. G.; Gil, V.; Ramos, A. M.; Trigo, R. M.

    2017-12-01

    Extratropical cyclones dominate autumn and winter weather over Western Europe and particularly over the Iberian Peninsula. Intense, high-impact storms are one of the major weather risks in the region, mostly due to the simultaneous occurrence of high winds and extreme precipitation events. These intense extratropical cyclones may result in windstorm damage, flooding and coastal storm surges, with large societal impacts. In Portugal, due to the extensive human use of coastal areas, the natural and built coastal environments have been amongst the most affected. In this work, several historical winter storms that adversely affected the Western Iberian Peninsula are studied in detail in order to contribute to an improved assessment of the characteristics of these events. The diagnosis has been performed based on instrumental daily precipitation and wind records, on satellite images, on reanalysis data and through model simulations. For several cases, the synoptic evolution and the upper-level dynamics of the physical processes controlling the life cycle of the extratropical storms that triggered the considered extreme events have also been analysed. Furthermore, the space-time variability of the exceptionally severe storms affecting Western Iberia over the last century and under three climate scenarios (the historical simulation and the RCP4.5 and RCP8.5 scenarios) is presented. These studies contribute to improving knowledge of the atmospheric dynamics controlling the life cycle of midlatitude storms associated with severe weather (precipitation and wind) in the Iberian Peninsula. Acknowledgements: This work is supported by the Portuguese Foundation for Science and Technology (FCT), Portugal, through project UID/GEO/50019/2013 - Instituto Dom Luiz. A. M. Ramos is also supported by an FCT postdoctoral grant (FCT/DFRH/SFRH/BPD/84328/2012).

  4. Growth response of oaks, beech and pine to Standardized Precipitation Index (SPI)

    NASA Astrophysics Data System (ADS)

    Stojanovic, Dejan; Levanič, Tom; Matović, Bratislav; Orlović, Saša

    2017-04-01

    Climate change may have various consequences for forests, from more frequent forest fires and windstorms to pest and disease outbreaks. The Standardized Precipitation Index (SPI) was chosen for evaluating the impact of climate change on radial forest growth, after comprehensive testing of different climate parameters from the CARPATCLIM database. SPI was calculated for periods between 3 and 36 months for different forest stands (lowland and mountainous parts of Serbia, Southeast Europe). The following tree species were studied: Quercus robur, Q. cerris, Fagus sylvatica and Pinus sylvestris. Bootstrapped Pearson correlations between monthly SPI indices and tree-ring widths were calculated and ranked for all species. We found that the 12-month SPI for summer months may be a good predictor of growth for different species at different sites. The strongest positive correlations between tree-ring width indices and SPI came from the year of growth, whereas the strongest negative correlations for all four species came exclusively from the year prior to growth. The strongest positive correlations were with the 12- to 14-month SPI from June to September, which suggests that high growth rates are expected when the autumn of the previous year and the winter, spring and summer of the current year receive high precipitation.
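
    A bootstrapped Pearson correlation between an SPI series and tree-ring width indices can be computed along the following lines; this is a generic sketch with synthetic data, not the CARPATCLIM-based analysis itself.

        import numpy as np

        rng = np.random.default_rng(42)
        n_years = 60
        spi_12 = rng.normal(size=n_years)                          # synthetic 12-month SPI series
        trw = 0.5 * spi_12 + rng.normal(scale=0.8, size=n_years)   # synthetic ring-width indices

        def bootstrap_pearson(x, y, n_boot=1000):
            """Pearson r with a bootstrap confidence interval (resampling year pairs)."""
            r_obs = np.corrcoef(x, y)[0, 1]
            idx = rng.integers(0, len(x), size=(n_boot, len(x)))
            r_boot = np.array([np.corrcoef(x[i], y[i])[0, 1] for i in idx])
            return r_obs, np.percentile(r_boot, [2.5, 97.5])

        r, ci = bootstrap_pearson(spi_12, trw)
        print(f"r = {r:.2f}, 95% CI = [{ci[0]:.2f}, {ci[1]:.2f}]")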

  5. A history of wind erosion prediction models in the United States Department of Agriculture prior to the Wind Erosion Prediction System

    NASA Astrophysics Data System (ADS)

    Tatarko, John; Sporcic, Michael A.; Skidmore, Edward L.

    2013-09-01

    The Great Plains experienced an influx of settlers from the late 1850s to 1900. Periodic drought was hard on both settlers and the soil and caused severe wind erosion. The period known as the Dirty Thirties, 1931-1939, produced many severe windstorms, and the resulting dusty sky over Washington, DC helped Hugh Hammond Bennett gain political support for the Soil Conservation Act of 1935 that established the USDA Soil Conservation Service (SCS). Austin W. Zingg and William S. Chepil began wind erosion studies at a USDA laboratory at Kansas State University in 1947. Neil P. Woodruff and Francis H. Siddoway published the first widely used model for wind erosion in 1965, called the Wind Erosion Equation (WEQ). The WEQ was solved using a series of charts and lookup tables. Subsequent improvements to WEQ included monthly magnitudes of the total wind, a computer version of WEQ programmed in FORTRAN, small-grain equivalents for range grasses, tillage systems, effects of residue management, crop row direction, cloddiness, monthly climate factors, and the weather. The SCS and the Natural Resources Conservation Service (NRCS) produced several computer versions of WEQ with the goal of standardizing and simplifying it for field personnel, including a standalone version developed in the late 1990s using Microsoft Excel. Although WEQ was a great advancement in the science of prediction and control of wind erosion on cropland, it had many limitations that prevented its use on many lands throughout the United States and the world. In response to these limitations, the USDA developed a process-based model known as the Wind Erosion Prediction System (WEPS). The USDA Agricultural Research Service has taken the lead in developing science and technology for wind erosion prediction.
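
    For context, the Wind Erosion Equation referred to above is conventionally written as a functional relationship (this general form is standard in the wind-erosion literature rather than quoted from this paper):

        E = f(I, K, C, L, V)

    where E is the potential average annual soil loss, I the soil erodibility index, K the soil ridge roughness factor, C the climatic factor, L the unsheltered field length along the prevailing wind erosion direction, and V the equivalent vegetative cover.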

  6. Index insurance for pro-poor conservation of hornbills in Thailand

    PubMed Central

    Chantarat, Sommarat; Barrett, Christopher B.; Janvilisri, Tavan; Mudsri, Sittichai; Niratisayakul, Chularat

    2011-01-01

    This study explores the potential of index insurance as a mechanism to finance community-based biodiversity conservation in areas where a strong correlation exists between natural disaster risk, keystone species populations, and the well-being of the local population. We illustrate this potential using the case of hornbill conservation in the Budo-Sungai Padi rainforests of southern Thailand, using 16-y hornbill reproduction data and 5-y household expenditures data reflecting local economic well-being. We show that severe windstorms cause both lower household expenditures and critical nest tree losses that directly constrain nesting capacity and so reduce the number of hornbill chicks recruited in the following breeding season. Forest residents’ coping strategies further disturb hornbills and their forest habitats, compounding windstorms’ adverse effects on hornbills’ recruitment in the following year. The strong statistical relationship between wind speed and both hornbill nest tree losses and household expenditures opens up an opportunity to design wind-based index insurance contracts that could both enhance hornbill conservation and support disaster-affected households in the region. We demonstrate how such contracts could be written and operationalized and then use simulations to show the significant promise of unique insurance-based approaches to address weather-related risk that threatens both biodiversity and poor populations. PMID:21873183
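
    A wind-based index contract of the kind discussed can be summarised by a payout function with a trigger and an exit wind speed. The sketch below is a generic illustration with hypothetical parameter values; it is not the contract design derived in the study.

        def index_payout(wind_speed, trigger=25.0, exit_speed=45.0, max_payout=10_000.0):
            """Linear index-insurance payout between a trigger and an exit wind speed (m/s).

            Nothing is paid below the trigger; the full sum insured is paid at or above
            the exit speed. All parameter values here are hypothetical.
            """
            if wind_speed <= trigger:
                return 0.0
            if wind_speed >= exit_speed:
                return max_payout
            return max_payout * (wind_speed - trigger) / (exit_speed - trigger)

        for w in (20.0, 30.0, 50.0):
            print(w, index_payout(w))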

  7. Breadfruit (Artocarpus altilis) gibberellin 2-oxidase genes in stem elongation and abiotic stress response.

    PubMed

    Zhou, Yuchan; Underhill, Steven J R

    2016-01-01

    Breadfruit (Artocarpus altilis) is a traditional staple tree crop in Oceania. Susceptibility to windstorm damage is a primary constraint on breadfruit cultivation. Significant tree losses due to intense tropical windstorms in past decades have driven widespread interest in developing breadfruit with a dwarf stature. Gibberellin (GA) is one of the most important determinants of plant height. GA 2-oxidase is a key enzyme regulating the flux of GA by deactivating biologically active GAs in plants. As a first step toward understanding the molecular mechanism of growth regulation in the species, we isolated a cohort of four full-length GA2-oxidase cDNAs, AaGA2ox1-AaGA2ox4, from breadfruit. Sequence analysis indicated that the deduced proteins encoded by these AaGA2oxs clustered together within the C19 GA2ox group. Transcripts of AaGA2ox1, AaGA2ox2 and AaGA2ox3 were detected in all plant organs, but exhibited the highest levels in source leaves and stems. In contrast, the AaGA2ox4 transcript was predominantly expressed in roots and flowers, and displayed very low expression in leaves and stems. AaGA2ox1, AaGA2ox2 and AaGA2ox3, but not AaGA2ox4, were subject to GA feedback regulation, where application of exogenous GA3 or the gibberellin biosynthesis inhibitor paclobutrazol was shown to manipulate first-internode elongation in breadfruit. Drought or high-salinity treatments increased the expression of AaGA2ox1, AaGA2ox2 and AaGA2ox4, whereas AaGA2ox3 was down-regulated under salt stress. The function of AaGA2oxs is discussed with particular reference to their role in stem elongation and their involvement in the abiotic stress response of breadfruit. Copyright © 2015 Elsevier Masson SAS. All rights reserved.

  8. A geostatistical extreme-value framework for fast simulation of natural hazard events

    PubMed Central

    Stephenson, David B.

    2016-01-01

    We develop a statistical framework for simulating natural hazard events that combines extreme value theory and geostatistics. Robust generalized additive model forms represent generalized Pareto marginal distribution parameters while a Student’s t-process captures spatial dependence and gives a continuous-space framework for natural hazard event simulations. Efficiency of the simulation method allows many years of data (typically over 10 000) to be obtained at relatively little computational cost. This makes the model viable for forming the hazard module of a catastrophe model. We illustrate the framework by simulating maximum wind gusts for European windstorms, which are found to have realistic marginal and spatial properties, and validate well against wind gust measurements. PMID:27279768
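
    The combination of generalized Pareto margins with Student's t spatial dependence can be illustrated schematically: draw spatially correlated multivariate-t variates, transform them to uniform margins via the t CDF, and map them through a GPD quantile function at each site. The parameters below are invented for illustration and do not represent the authors' calibrated model.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        n_sites, n_events, df = 3, 10_000, 5

        # Hypothetical spatial correlation matrix and GPD margin for gust exceedances.
        corr = np.array([[1.0, 0.6, 0.3], [0.6, 1.0, 0.5], [0.3, 0.5, 1.0]])
        gpd_shape, gpd_scale, threshold = 0.1, 4.0, 20.0   # exceedances above 20 m/s, say

        # Multivariate Student-t draws: correlated normals divided by sqrt(chi-square/df).
        z = rng.multivariate_normal(np.zeros(n_sites), corr, size=n_events)
        w = rng.chisquare(df, size=(n_events, 1)) / df
        t_draws = z / np.sqrt(w)

        # Transform to uniforms, then to GPD-distributed gusts at each site.
        u = stats.t.cdf(t_draws, df)
        gusts = threshold + stats.genpareto.ppf(u, c=gpd_shape, scale=gpd_scale)
        print(gusts.mean(axis=0))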

  9. Investigating the impact of surface wave breaking on modeling the trajectories of drifters in the northern Adriatic Sea during a wind-storm event

    USGS Publications Warehouse

    Carniel, S.; Warner, J.C.; Chiggiato, J.; Sclavo, M.

    2009-01-01

    An accurate numerical prediction of the oceanic upper layer velocity is a demanding requirement for many applications at sea and is a function of several near-surface processes that need to be incorporated in a numerical model. Among them, we assess the effects of vertical resolution, different vertical mixing parameterizations (the so-called Generic Length Scale (GLS) set of k-ε, k-ω and gen closures, and the Mellor-Yamada scheme), and surface roughness values on turbulent kinetic energy (k) injection from breaking waves. First, we modified the GLS turbulence closure formulation in the Regional Ocean Modeling System (ROMS) to incorporate the surface flux of turbulent kinetic energy due to wave breaking. Then, we applied the model to idealized test cases, exploring the sensitivity to the above-mentioned factors. Last, the model was applied to a realistic situation in the Adriatic Sea driven by numerical meteorological forcings and river discharges. In this case, numerical drifters were released during an intense episode of Bora winds that occurred in mid-February 2003, and their trajectories were compared to the displacement of satellite-tracked drifters deployed during the ADRIA02-03 sea-truth campaign. Results indicated that the inclusion of the wave breaking process helps improve the accuracy of the numerical simulations, subject to an increase in the typical value of the surface roughness z0. Specifically, the best performance was obtained with α_CH = 56,000 in the Charnock formula, the wave breaking parameterization activated, and a two-equation k-type GLS turbulence closure. With these options, the relative error with respect to the average distance of the drifters was about 25% (5.5 km/day). The most sensitive factor in the model was found to be the value of α_CH, enhanced with respect to its standard value, followed by the adoption of the wave breaking parameterization and the particular turbulence closure model selected. © 2009 Elsevier Ltd.
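
    For reference, the Charnock relation mentioned above links the surface roughness length to the friction velocity u_*; in its standard form (quoted for orientation, not derived from this study),

        z_0 = \alpha_{\mathrm{CH}}\,\frac{u_*^{2}}{g},

    so that the value α_CH = 56,000 reported above corresponds to a surface roughness z0 far larger than that given by classical Charnock constants of order 10^-2.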

  10. Nonlinear behaviors of FRP-wrapped tall trees subjected to high wind loads

    NASA Astrophysics Data System (ADS)

    Kang, J.; Yi, Z. Z.; Choi, S. G.

    2017-12-01

    This study investigated the mechanical stability of historical tall trees wrapped with fiber-reinforced polymer (FRP) laminates using finite element (FE) analysis. High wind loads were considered as the external loading condition, as they are one of the major threats to the structural stability of tall old trees. There have been several traditional practices to enhance the stability of tall trees exposed to high windstorms, such as tree supports and anchorages. However, these measures have sometimes caused negative effects when misused, because the application guidelines for them were not adequately studied or documented. Furthermore, the oldest known trees in the country should be protected from damage to their external surface as well as from spoiling of the landscape. The objective of this study was to evaluate the structural effects of FRP wraps applied to tall trees subjected to high wind loads. The anisotropic material properties of wood and FRP laminates were considered in the analysis, in addition to geometrically nonlinear behavior. This study revealed that FRP wrapping of tall trees could effectively reduce the deflections and maximum stresses of trees, resulting in enhanced stability. The optimum geometry and thicknesses of FRP wraps proposed in this study would provide fundamental guidelines for designing and constructing innovative FRP wraps on tall trees that are structurally unstable or should be preserved as national and historical heritage.
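
    As a back-of-the-envelope complement to the finite element analysis, a tree stem under wind can be idealised as a vertical cantilever of height H carrying a uniform wind load q per unit height; elementary beam theory (a simplified hand calculation, not the authors' FE model) then gives the base bending moment and the maximum bending stress for a circular cross-section of diameter d:

        M_{\mathrm{base}} = \frac{q H^{2}}{2}, \qquad \sigma_{\mathrm{max}} = \frac{M_{\mathrm{base}}\,(d/2)}{\pi d^{4}/64} = \frac{32\,M_{\mathrm{base}}}{\pi d^{3}}.

    An FRP wrap effectively enlarges and stiffens the load-bearing section, which is one way to rationalise the reduced deflections and stresses found in the simulations.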

  11. Usefulness of syndromic data sources for investigating morbidity resulting from a severe weather event.

    PubMed

    Baer, Atar; Elbert, Yevgeniy; Burkom, Howard S; Holtry, Rekha; Lombardo, Joseph S; Duchin, Jeffrey S

    2011-03-01

    We evaluated emergency department (ED) data, emergency medical services (EMS) data, and public utilities data for describing an outbreak of carbon monoxide (CO) poisoning following a windstorm. Syndromic ED data were matched against previously collected chart abstraction data. We ran detection algorithms on selected time series derived from all 3 data sources to identify health events associated with the CO poisoning outbreak. We used spatial and spatiotemporal scan statistics to identify geographic areas that were most heavily affected by the CO poisoning event. Of the 241 CO cases confirmed by chart review, 190 (78.8%) were identified in the syndromic surveillance data as exact matches. Records from the ED and EMS data detected an increase in CO-consistent syndromes after the storm. The ED data identified significant clusters of CO-consistent syndromes, including zip codes that had widespread power outages. Weak temporal gastrointestinal (GI) signals, possibly resulting from ingestion of food spoiled by lack of refrigeration, were detected in the ED data but not in the EMS data. Spatial clustering of GI-based groupings in the ED data was not detected. Data from this evaluation support the value of ED data for surveillance after natural disasters. Enhanced EMS data may be useful for monitoring a CO poisoning event, if these data are available to the health department promptly. ©2011 American Medical Association. All rights reserved.
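
    Aberration-detection algorithms of the kind run on these time series are often simple moving-baseline z-score tests. The sketch below is a generic member of that family with illustrative parameters; it is not necessarily the specific algorithm used in this evaluation.

        import numpy as np

        def flag_anomalies(counts, baseline_days=7, guard_days=2, threshold=3.0):
            """Flag days whose count exceeds the baseline mean + threshold * std.

            The baseline is the `baseline_days` window ending `guard_days` before the
            current day; all parameter values here are illustrative.
            """
            counts = np.asarray(counts, dtype=float)
            flags = np.zeros(len(counts), dtype=bool)
            for t in range(baseline_days + guard_days, len(counts)):
                base = counts[t - baseline_days - guard_days : t - guard_days]
                mu, sd = base.mean(), base.std(ddof=1)
                if sd > 0 and counts[t] > mu + threshold * sd:
                    flags[t] = True
            return flags

        daily_co_visits = [1, 0, 2, 1, 1, 0, 1, 2, 1, 0, 1, 14, 22, 9]   # synthetic counts
        print(np.where(flag_anomalies(daily_co_visits))[0])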

  12. Tree stability under wind: simulating uprooting with root breakage using a finite element method.

    PubMed

    Yang, Ming; Défossez, Pauline; Danjon, Frédéric; Fourcaud, Thierry

    2014-09-01

    Windstorms are the major natural hazard affecting European forests, causing tree damage and timber losses. Modelling tree anchorage mechanisms has progressed with advances in plant architectural modelling, but it is still limited in terms of estimation of anchorage strength. This paper aims to provide a new model for root anchorage, including the successive breakage of roots during uprooting. The model was based on the finite element method. The breakage of individual roots was taken into account using a failure law derived from previous work carried out on fibre metal laminates. Soil mechanical plasticity was considered using the Mohr-Coulomb failure criterion. The mechanical model for roots was implemented in the numerical code ABAQUS using beam elements embedded in a soil block meshed with 3-D solid elements. The model was tested by simulating tree-pulling experiments previously carried out on a tree of Pinus pinaster (maritime pine). Soil mechanical parameters were obtained from laboratory tests. Root system architecture was digitized and imported into ABAQUS while root material properties were estimated from the literature. Numerical simulations of tree-pulling tests exhibited realistic successive root breakages during uprooting, which could be seen in the resulting response curves. Broken roots could be visually located within the root system at any stage of the simulations. The model allowed estimation of anchorage strength in terms of the critical turning moment and accumulated energy, which were in good agreement with in situ measurements. This study provides the first model of tree anchorage strength for P. pinaster derived from the mechanical strength of individual roots. The generic nature of the model permits its further application to other tree species and soil conditions.
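
    For reference, the Mohr-Coulomb failure criterion adopted for the soil can be stated in its usual form (standard soil-mechanics notation, not reproduced from the paper):

        \tau_{f} = c + \sigma_{n}\tan\varphi,

    where \tau_{f} is the shear strength on the failure plane, c the soil cohesion, \sigma_{n} the normal stress acting on that plane and \varphi the angle of internal friction.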

  13. Tree stability under wind: simulating uprooting with root breakage using a finite element method

    PubMed Central

    Yang, Ming; Défossez, Pauline; Danjon, Frédéric; Fourcaud, Thierry

    2014-01-01

    Background and Aims Windstorms are the major natural hazard affecting European forests, causing tree damage and timber losses. Modelling tree anchorage mechanisms has progressed with advances in plant architectural modelling, but it is still limited in terms of estimation of anchorage strength. This paper aims to provide a new model for root anchorage, including the successive breakage of roots during uprooting. Methods The model was based on the finite element method. The breakage of individual roots was taken into account using a failure law derived from previous work carried out on fibre metal laminates. Soil mechanical plasticity was considered using the Mohr–Coulomb failure criterion. The mechanical model for roots was implemented in the numerical code ABAQUS using beam elements embedded in a soil block meshed with 3-D solid elements. The model was tested by simulating tree-pulling experiments previously carried out on a tree of Pinus pinaster (maritime pine). Soil mechanical parameters were obtained from laboratory tests. Root system architecture was digitized and imported into ABAQUS while root material properties were estimated from the literature. Key Results Numerical simulations of tree-pulling tests exhibited realistic successive root breakages during uprooting, which could be seen in the resulting response curves. Broken roots could be visually located within the root system at any stage of the simulations. The model allowed estimation of anchorage strength in terms of the critical turning moment and accumulated energy, which were in good agreement with in situ measurements. Conclusions This study provides the first model of tree anchorage strength for P. pinaster derived from the mechanical strength of individual roots. The generic nature of the model permits its further application to other tree species and soil conditions. PMID:25006178

  14. CFD Prediction on the Pressure Distribution and Streamlines around an Isolated Single-Storey House Considering the Effect of Topographic Characteristics

    NASA Astrophysics Data System (ADS)

    Abdullah, J.; Zaini, S. S.; Aziz, M. S. A.; Majid, T. A.; Deraman, S. N. C.; Yahya, W. N. W.

    2018-04-01

    Single-storey houses are classified as low-rise buildings and are vulnerable to damage during windstorm events. This study was carried out with the aim of investigating the pressure distribution and streamlines around an isolated house while considering the effect of terrain characteristics. Topographic features such as flat terrain, depressions, ridges and valleys are considered in this study. The simulations were performed with the Ansys FLUENT 14.0 software package. The results showed that the topographic characteristics influence the pressure coefficient values and streamlines, especially when the house was located on a ridge. The findings strongly suggest that wind analyses should include topographic features in order to establish the true wind forces exerted on a structure.
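
    The pressure coefficient reported from these simulations is conventionally defined as (standard definition, not specific to this study)

        C_{p} = \frac{p - p_{\infty}}{\tfrac{1}{2}\,\rho\,U_{\infty}^{2}},

    where p is the surface pressure on the house, p_\infty and U_\infty are the free-stream reference pressure and wind speed, and \rho is the air density.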

  15. Hydro-meteorological extreme events in the 18th century in Portugal

    NASA Astrophysics Data System (ADS)

    Fragoso, Marcelo; João Alcoforado, Maria; Taborda, João Paulo

    2013-04-01

    The present work is carried out in the framework of the KLIMHIST project ("Reconstruction and model simulations of past climate in Portugal using documentary and early instrumental sources, 17th-19th century") and is devoted to the study of hydro-meteorological extreme events during the last 350 years, in order to understand how they have changed in time and to compare them with current analogues. More specifically, the results selected for this presentation focus on hydro-meteorological extreme events of the 18th century, such as severe droughts, heavy precipitation episodes and windstorms. One of the most noteworthy events was the winter storm Bárbara (3rd to 6th December 1739), already studied in prior investigations (Taborda et al., 2004; Pfister et al., 2010), a devastating storm with strong impacts in Portugal caused by violent winds and heavy rainfall. Several other extreme events were detected by searching different documentary archives, including individual, administrative and ecclesiastic sources. Moreover, a more detailed analysis of the 1783-1787 period will be made for the Lisbon region, taking into consideration the availability of daily meteorological observations as well as documentary evidence, such as descriptions from the Gazeta de Lisboa, the periodical with the most continuous publication in the 18th century. Key-words: Instrumental data, Documentary data, Extreme events, Klimhist Project, Portugal References Pfister, C., Garnier, E., Alcoforado, M.J., Wheeler, D., Luterbacher, J., Nunes, M.F., Taborda, J.P. (2010) The meteorological framework and the cultural memory of three severe winter-storms in early eighteenth-century Europe, Climatic Change, 101, 1-2, 281-310. Taborda, J.P., Alcoforado, M.J. and Garcia, J.C. (2004) O Clima do Sul de Portugal no Séc. XVIII, Centro de Estudos Geográficos, Área de Investigação de Geo-Ecologia, relatório no. 2

  16. Use of Citizen Science and Social Media to Improve Wind Hazard and Damage Characterization

    NASA Astrophysics Data System (ADS)

    Lombardo, F.; Meidani, H.

    2017-12-01

    Windstorm losses are significant in the U.S. annually and cause damage worldwide. A large percentage of losses are caused by localized events (e.g., tornadoes). In order to better mitigate these losses, improvement is needed in understanding the hazard characteristics and physical damage. However, due to the small-scale nature of these events, the resolution of the dedicated measuring network does not capture most occurrences. As a result, damage-based assessments are sometimes used to gauge intensity. These damage assessments often suffer from a lack of available manpower, an inability to arrive at the scene rapidly, and difficulty accessing a damaged site. The use and rapid dissemination of social media, the power of crowds engaged in scientific endeavors, and the public's awareness of their vulnerabilities point to a paradigm shift in how hazards can be sensed in a rapid manner. In this way, `human-sensor' data have the potential to radically improve fundamental understanding of hazards and disasters and resolve some of the existing challenges in wind hazard and damage characterization. Data from social media outlets such as Twitter have been used to aid in damage assessments for hazards such as floods and earthquakes; however, the reliability and uncertainty of participatory sensing has been questioned and has been called the `biggest challenge' for its sustained use. This research proposes to investigate the efficacy of both citizen science applications and social media data for representing wind hazards and associated damage. Research has focused on a two-phase approach: 1) having citizen scientists perform their own `damage survey' (i.e., questionnaire) on known damage to assess uncertainty in estimation, and 2) downloading and analyzing social media text and imagery streams to ascertain the possibility of performing `unstructured damage surveys'. Early results have shown that the untrained public can estimate tornado damage levels in residential structures with some accuracy. In addition, valuable windstorm hazard and damage information in both text and imagery can be extracted and archived from Twitter in an automated fashion. Information extracted from these sources will feed into advances in hazard and disaster modeling, social-cognitive theories of human behavior, and decision-making for hazard mitigation.

  17. Ecosystem Disturbances in Central European Spruce Forests: a Multi-proxy Integration of Dendroecology and Sedimentary Records

    NASA Astrophysics Data System (ADS)

    Clear, J.; Chiverrell, R. C.; Kunes, P.; Boyle, J.; Kuosmanen, N.; Carter, V.

    2016-12-01

    The montane Norway spruce (Picea abies) dominated forests of Central Europe are a niche environment; situated outside their natural boreal distribution, they are vulnerable to both short-term disturbances (e.g. floods, avalanches, fire, windstorms and pathogens) and longer-term environmental change (e.g. climate-induced stress, snow regimes). Holocene sediment records from lakes in the High Tatra (Slovakia) and Bohemian (Czech) Mountains show repeated disturbances of the pristine Picea abies-dominated forests as sharp, well-defined minerogenic in-wash horizons that punctuate the accumulation of organic gyttja. These event horizons span a process continuum from lakes with restricted catchments and limited inflow (e.g. Prazilske Lake, Czech Republic) to more catchment-process dominated lakes with large catchments (e.g. Popradske Lake, Slovakia). The events include complex responses to a global climatic downturn at 8.2 ka, to other cooler episodes at 3.5, 1.6 and 0.5 ka, and to recent discrete windstorms and pathogen outbreaks. We develop a typology for disturbance events using sediment geochemistry, particle size, mineral magnetism, charcoal and palaeoecology to assess likely drivers of disturbance. For the recent past, the integration of dendroecological and sedimentary data is used to calibrate our longer-term perspective on forest dynamics. Tree-ring series from plots or forest stands are used alongside lake and forest hollow sediments to explore the local, regional and biogeographical scales of forest disturbance. Dendroecological data showing tree-ring gap recruitment and post-suppression growth release highlight frequent disturbance events focused on tree or forest-stand spatial scales, but these are patchy in terms of their recurrence. However, they highlight elevated levels of disturbance in the late 19th century, and in parallel, lake and forest hollow sediments record variable pollen influx (beetle host / non-host ratios) and stratigraphies that include mineral in-wash events. The identified recent and ongoing forest disturbances, coupled with well-evidenced events in the 19th century, highlight the need for the longer sedimentary perspective to assess whether contemporary climate warming has stretched, and continues to stretch, the resilience of these fragile ecosystems.

  18. An analysis of the characteristics of extratropical cyclone Klaus

    NASA Astrophysics Data System (ADS)

    Gómara, Iñigo; Rodriguez-Puebla, Concepcion; Yague, Carlos

    2010-05-01

    Klaus was a very destructive extratropical cyclone that affected the south-west of Europe from the 23rd to the 25th of January 2009. In particular, it impacted northern Spain, southern France and Italy, where losses totalled billions of Euros and the death toll was 31. The extreme strength of the wind gusts generated by the storm was the main reason for the damage caused. Klaus had the properties of a cyclonic "bomb", and a brief meteorological description of the windstorm will be presented based on surface and upper-air reanalysis data. The analysis procedure is based on earlier research carried out in this field by F. Sanders and J. R. Gyakum (1980), Lance F. Bosart (1984) and R. J. Reed (1986). Klaus formed under very favourable growing conditions in the North Atlantic Ocean: a high level of atmospheric baroclinicity due to strong horizontal gradients of temperature and absolute humidity, together with strong upper-air winds. In addition, the surface low that started as a stationary front interacted with a mobile upper trough located at an altitude of 9000 m near the surface low on the 23rd of January. A strong polar jet stream region above the incipient surface low was also located in the same region of the storm's growth, around 40°W, 42.5°N. After its formation and interaction with the mobile upper trough, Klaus moved very fast eastwards until it reached land in France on the 24th. We will discuss some social and economic impacts of the storm and the intervention of governments and weather services before, during and after the windstorm. References Sanders, F. and J. R. Gyakum, 1980: Synoptic-Dynamic Climatology of the "bomb". Monthly Weather Review, 108, 1589-1606. Bosart, L. F. and S. C. Lin, 1984: A diagnostic analysis of the Presidents' Day Storm of February 1979. Monthly Weather Review, 112, 2148-2177. Reed, R. J. and M. D. Albright, 1986: A case study of explosive cyclogenesis in the eastern Pacific. Monthly Weather Review, 114, 2297-2319.
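
    The "bomb" classification follows the explosive-cyclogenesis criterion given in the 1980 Monthly Weather Review paper cited below: a central-pressure fall of at least 24 hPa in 24 hours, geostrophically scaled to the cyclone's latitude \varphi,

        \Delta p_{24\,\mathrm{h}} \;\geq\; 24\ \mathrm{hPa} \times \frac{\sin\varphi}{\sin 60^{\circ}}.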

  19. International decade for natural disaster reduction

    USGS Publications Warehouse

    Hays, W. W.

    1990-01-01

    Throughout history, humanity has found itself in conflict with naturally occurring events of geologic, hydrologic, and atmospheric origin. This conflict has been demonstrated repeatedly when people build urban centers at the water's edge, in or near active fault systems capable of generating earthquakes, on steep slopes, near active volcanoes, or at the urban-wilderness interface prone to wildfires. Naturally occurring, recurrent events such as floods, windstorms, tsunamis, earthquakes, landslides, volcanic eruptions, and wildfires have tested human-engineered works many times and have often found them unable to withstand the forces generated by the event. In the past 20 years, for example, events like these throughout the world have claimed more than 2.8 million lives and adversely affected 820 million people; single disasters have caused economic losses of billions of dollars. Industrialized countries like the United States and Japan have been able to absorb the socioeconomic losses of past natural disasters, but the economies of many developing countries have been devastated by losses equal to a large percentage of their gross national product. Furthermore, the magnitude of the losses is increasing at a rapid rate as the built wealth of nations is expanded to meet the needs of rapidly increasing populations, often without adequate consideration of the potential threat posed by recurrent natural hazards and without implementing effective loss-reduction measures because of lack of knowledge or lack of technical capability.

  20. A Climatology of Derecho-Producing Mesoscale Convective Systems in the Central and Eastern United States, 1986-95. Part I: Temporal and Spatial Distribution.

    NASA Astrophysics Data System (ADS)

    Bentley, Mace L.; Mote, Thomas L.

    1998-11-01

    In 1888, Iowa weather researcher Gustavus Hinrichs gave widespread convectively induced windstorms the name "derecho". Refinements to this definition have evolved after numerous investigations of these systems; however, to date, a derecho climatology has not been conducted. This investigation examines spatial and temporal aspects of derechos and their associated mesoscale convective systems that occurred from 1986 to 1995. The spatial distribution of derechos revealed four activity corridors during the summer, five during the spring, and two during the cool season. Evidence suggests that the primary warm season derecho corridor is located in the southern Great Plains. During the cool season, derecho activity was found to occur in the southeast states and along the Atlantic seaboard. Temporally, derechos are primarily late evening or overnight events during the warm season and are more evenly distributed throughout the day during the cool season.

  1. Wind vs Water in Hurricanes: The Challenge of Multi-peril Hazard Modeling

    NASA Astrophysics Data System (ADS)

    Powell, M. D.

    2017-12-01

    With the advancing threat of sea level rise, much of the U.S. is in danger of falling into the "protection gap". Residential property flood risk is not yet covered by the insurance market. Many coastal properties are not paying into the National Flood Insurance Program (NFIP) at premiums commensurate with the risk. This is exacerbated by the program being deep in debt, despite only covering a fraction of the potential loss, while windstorm insurance covers up to replacement value. This results in a battle that benefits nobody. Any significant hurricane will include both wind and storm surge perils at the same time, and any coastal property has to contend with the risk of damage by both. If you have extensive flood damage, your windstorm policy might deny your claim and your flood policy (if you even have one) will in most cases be constrained to a $250,000 limit. Bring on the litigators! Some homeowners will claim that the wind destroyed the home first and then it was carried away by flood waters or pulverized by waves. Insurers might respond that the storm surge did all the damage and deny the claim. We have seen this already following Hurricane Katrina in 2005 and Hurricane Ike in 2008, with thousands of litigation claims and a cottage industry of scientists serving as expert witnesses on both sides of the aisle. Congress responded in 2012 with the Coastal Act, which provided an "unfunded mandate" directing NOAA to provide wind and water level data to FEMA for input to their "Coastal Formula" for attributing loss to wind and water. The results of the formula would then limit the amount paid by the NFIP by subtracting out the wind loss portion. The Texas Windstorm Insurance Association (TWIA) went further by assembling a panel of experts to recommend guidelines for how the state should respond to future hurricanes impacting properties on the Texas coast. The expert panel report was released in April 2016, and TWIA is currently developing a comprehensive operational solution to collect wind and water level measurements and to conduct observation-based modeling of wind and water impacts. My presentation will discuss some of the challenges of wind and water hazard monitoring and modeling.

  2. Short-term variability in particle flux: Storms, blooms and river discharge in a coastal sea

    NASA Astrophysics Data System (ADS)

    Johannessen, Sophia C.; Macdonald, Robie W.; Wright, Cynthia A.; Spear, David J.

    2017-07-01

    The flux and composition of particles sinking in the surface ocean vary on a wide range of time scales. This variability is a component of underwater weather that is analogous to rain. The rain of particles in the coastal ocean is affected by atmospheric events, such as rainstorms and windstorms; by events on land, such as peaks in river discharge or coastal erosion; and by events within the surface ocean, such as phytoplankton blooms. Here, we use a four-year record of sinking particles collected using sediment traps moored at 50 m depth at two locations in the Strait of Georgia, a coastal sea off the west coast of Canada, to determine the relative importance of short-term events to particle flux. We identify four dominant types of particle-flux events: those associated with 1) summer freshet of the Fraser River, 2) rainstorms, 3) phytoplankton blooms, and 4) a jellyfish bloom. The relative importance of these events differs between the southern Strait, where the Fraser River freshet dominates flux and variability, and the northern Strait, where the effects of phytoplankton blooms, rainstorms and small local rivers are more evident. During 2008-2012, half of each year's total flux accumulated over 10-26% of the year in the southern Strait, mainly during the Fraser River freshet. In the northern Strait half of the annual flux accumulated over 22-36% of the year, distributed among small events during spring to fall. The composition of the sinking particulate matter also varied widely, with organic carbon and biogenic silica ranging over 0.70-5.7% (excluding one event) and 0.4-14%, respectively, in the south, compared with 0.17-22% and 0.31-33% in the north. Windstorms had no immediate effect on particle flux in either basin. A large phytoplankton bloom in April 2011, in the northern Strait contributed 25% of the year's organic carbon at that site and 53% of the biogenic silica. A jellyfish bloom in July 2008 contributed 16% of the year's nitrogen and 12% of the year's organic carbon during a single collection interval (12 days). As short-term climate variability increases in a warming climate, the importance of these sorts of events is likely to increase in the future, particularly in coastal waters that are strongly influenced by river discharge.

  3. The New Multi-HAzard and MulTi-RIsK Assessment MethodS for Europe (MATRIX) Project - An overview of its major findings

    NASA Astrophysics Data System (ADS)

    Fleming, Kevin; Zschau, Jochen; Gasparini, Paolo

    2014-05-01

    Recent major natural disasters, such as the 2011 Tōhoku earthquake, tsunami and subsequent Fukushima nuclear accident, have raised awareness of the frequent and potentially far-reaching interconnections between natural hazards. Such interactions occur at the hazard level, where an initial hazard may trigger other events (e.g., an earthquake triggering a tsunami) or several events may occur concurrently (or nearly so), e.g., severe weather around the same time as an earthquake. Interactions also occur at the vulnerability level, where the initial event may make the affected community more susceptible to the negative consequences of another event (e.g., an earthquake weakens buildings, which are then damaged further by windstorms). There is also a temporal element involved, where changes in exposure may alter the total risk to a given area. In short, there is the likelihood that the total risk estimated when considering multiple hazards and risks and their interactions is greater than the sum of their individual parts. It is with these issues in mind that the European Commission, under their FP7 program, supported the New Multi-HAzard and MulTi-RIsK Assessment MethodS for Europe or MATRIX project (10.2010 to 12.2013). MATRIX set out to tackle multiple natural hazards (i.e., those of concern to Europe, namely earthquakes, landslides, volcanoes, tsunamis, wildfires, storms and fluvial and coastal flooding) and risks within a common theoretical framework. The MATRIX work plan proceeded from an assessment of single-type risk methodologies (including how uncertainties should be treated), cascade effects within a multi-hazard environment, time-dependent vulnerability, decision making and support for multi-hazard mitigation and adaptation, and an assessment of how the multi-hazard and risk viewpoint may be integrated into current decision making and risk mitigation programs, considering the existing single-hazard and risk focus. Three test sites were considered during the project: Naples, Cologne, and the French West Indies. In addition, a software platform, the MATRIX-Common IT sYstem (MATRIX-CITY), was developed to allow the evaluation of characteristic multi-hazard and risk scenarios in comparison to single-type analyses. This presentation therefore outlines the more significant outcomes of the project, in particular those dealing with the harmonization of single-type hazards, cascade event analysis, time-dependent vulnerability changes and the response of the disaster management community to the MATRIX point of view.

  4. Carpathian mountain forest vegetation and its responses to climate stressors

    NASA Astrophysics Data System (ADS)

    Zoran, Maria A.; Savastru, Roxana S.; Savastru, Dan M.; Tautan, Marina N.; Baschir, Laurentiu V.; Dida, Adrian I.

    2017-10-01

    Due to anthropogenic and climatic changes, Carpathian mountain forests in Romania are experiencing environmental degradation. As a result of global climate change, there is growing evidence that some of the most severe weather events could become more frequent in Romania over the next 50 to 100 years. In the case of Carpathian mountain forests, winter storms and heat waves are considered key climate risks, particularly in prealpine and alpine areas. Effects of climate extremes on forests can have both short-term and long-term implications for standing biomass, tree health and species composition. The preservation and enhancement of mountain forest vegetation cover in natural and semi-natural forest ecosystems is an essential factor in sustaining environmental health and averting natural hazards. This paper aims to: (i) describe observed trends and scenarios for summer heat waves, windstorms and heavy precipitation, based on NDVI and LAI data from NOAA AVHRR, MODIS Terra/Aqua and Landsat TM/ETM+/OLI satellite time series recorded during the 2000-2016 period, correlated with meteorological parameters, regional climate models and other downscaling procedures, and (ii) discuss potential impacts of climate change and extreme events on the Carpathian mountain forest system in Romania. The response of forest land cover vegetation in the Romanian Carpathians to climatic factors varies between seasons, the diverse vegetation feedbacks to climate change being related to different vegetation characteristics and meteorological conditions. Based on an integrated analysis of satellite and field data, it was concluded that forest ecosystem functions are responsible for the relationships between mountain-specific vegetation and climate.

  5. The Effects of Revealed Information on Catastrophe Loss Projection Models' Characterization of Risk: Damage Vulnerability Evidence from Florida.

    PubMed

    Karl, J Bradley; Medders, Lorilee A; Maroney, Patrick F

    2016-06-01

    We examine whether the risk characterization estimated by catastrophic loss projection models is sensitive to the revelation of new information regarding risk type. We use commercial loss projection models from two widely employed modeling firms to estimate the expected hurricane losses of Florida Atlantic University's building stock, both including and excluding secondary information regarding hurricane mitigation features that influence damage vulnerability. We then compare the results of the models without and with this revealed information and find that the revelation of additional, secondary information influences modeled losses for the windstorm-exposed university building stock, primarily evidenced by meaningful percent differences in the loss exceedance output indicated after secondary modifiers are incorporated in the analysis. Secondary risk characteristics for the data set studied appear to have substantially greater impact on probable maximum loss estimates than on average annual loss estimates. While it may be intuitively expected for catastrophe models to indicate that secondary risk characteristics hold value for reducing modeled losses, the finding that the primary value of secondary risk characteristics is in reduction of losses in the "tail" (low probability, high severity) events is less intuitive, and therefore especially interesting. Further, we address the benefit-cost tradeoffs that commercial entities must consider when deciding whether to undergo the data collection necessary to include secondary information in modeling. Although we assert the long-term benefit-cost tradeoff is positive for virtually every entity, we acknowledge short-term disincentives to such an effort. © 2015 Society for Risk Analysis.
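
    The distinction drawn between average annual loss and probable maximum loss can be made concrete with a toy event-loss table: the average annual loss is the rate-weighted sum of event losses, while a probable maximum loss is read off the loss-exceedance curve at a chosen return period. The numbers below are invented for illustration and have no connection to the commercial models or the university portfolio studied.

        import numpy as np

        # Hypothetical event-loss table: annual occurrence rate and loss per event.
        rates = np.array([0.10, 0.04, 0.01, 0.002])    # events per year
        losses = np.array([1e6, 5e6, 2e7, 8e7])        # loss if the event occurs

        aal = np.sum(rates * losses)                   # average annual loss

        # Exceedance rate of each loss level (rate of events at or above that loss).
        order = np.argsort(losses)[::-1]
        sorted_losses = losses[order]
        exceed_rate = np.cumsum(rates[order])

        # Crude 100-year PML: smallest tabulated loss whose exceedance rate is <= 1/100.
        pml_100 = sorted_losses[exceed_rate <= 0.01].min()

        print(f"AAL = {aal:,.0f}; approximate 100-year PML = {pml_100:,.0f}")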

  6. NHERI: Advancing the Research Infrastructure of the Multi-Hazard Community

    NASA Astrophysics Data System (ADS)

    Blain, C. A.; Ramirez, J. A.; Bobet, A.; Browning, J.; Edge, B.; Holmes, W.; Johnson, D.; Robertson, I.; Smith, T.; Zuo, D.

    2017-12-01

    The Natural Hazards Engineering Research Infrastructure (NHERI), supported by the National Science Foundation (NSF), is a distributed, multi-user national facility that provides the natural hazards research community with access to an advanced research infrastructure. Components of NHERI comprise a Network Coordination Office (NCO), a cloud-based cyberinfrastructure (DesignSafe-CI), a computational modeling and simulation center (SimCenter), and eight Experimental Facilities (EFs), including a post-disaster, rapid response research facility (RAPID). Ultimately, NHERI enables researchers to explore and test ground-breaking concepts to protect homes, businesses and infrastructure lifelines from earthquakes, windstorms, tsunamis, and surge, enabling innovations to help prevent natural hazards from becoming societal disasters. When coupled with education and community outreach, NHERI will facilitate research and educational advances that contribute knowledge and innovation toward improving the resiliency of the nation's civil infrastructure to withstand natural hazards. The unique capabilities and coordinating activities over Year 1 between NHERI's DesignSafe-CI, the SimCenter, and individual EFs will be presented. Basic descriptions of each component are also found at https://www.designsafe-ci.org/facilities/. Additionally to be discussed are the various roles of the NCO in leading development of a 5-year multi-hazard science plan, coordinating facility scheduling and fostering the sharing of technical knowledge and best practices, leading education and outreach programs such as the recent Summer Institute and multi-facility REU program, ensuring a platform for technology transfer to practicing engineers, and developing strategic national and international partnerships to support a diverse multi-hazard research and user community.

  7. Natural hazard fatalities in Switzerland from 1946 to 2015

    NASA Astrophysics Data System (ADS)

    Badoux, Alexandre; Andres, Norina; Techel, Frank; Hegg, Christoph

    2016-12-01

    A database of fatalities caused by natural hazard processes in Switzerland was compiled for the period between 1946 and 2015. Using information from the Swiss flood and landslide damage database and the Swiss destructive avalanche database, the data set was extended back in time and more hazard processes were added by conducting an in-depth search of newspaper reports. The new database now covers all natural hazards common in Switzerland, categorised into seven process types: flood, landslide, rockfall, lightning, windstorm, avalanche and other processes (e.g. ice avalanches, earthquakes). Included were all fatal accidents associated with natural hazard processes in which victims did not expose themselves to an important danger on purpose. The database contains information on 635 natural hazard events causing 1023 fatalities, which corresponds to a mean of 14.6 victims per year. The most common causes of death were snow avalanches (37 %), followed by lightning (16 %), floods (12 %), windstorms (10 %), rockfall (8 %), landslides (7 %) and other processes (9 %). About 50 % of all victims died in one of the 507 single-fatality events; the other half were killed in the 128 multi-fatality events. The number of natural hazard fatalities that occurred annually during our 70-year study period ranged from 2 to 112 and exhibited a distinct decrease over time. While the number of victims in the first three decades (until 1975) ranged from 191 to 269 per decade, it ranged from 47 to 109 in the four following decades. This overall decrease was mainly driven by a considerable decline in the number of avalanche and lightning fatalities. About 75 % of victims were males in all natural hazard events considered together, and this ratio was roughly maintained in all individual process categories except landslides (lower) and other processes (higher). The ratio of male to female victims was most likely to be balanced when deaths occurred at home (in or near a building), a situation that mainly occurred in association with landslides and avalanches. The average age of victims of natural hazards was 35.9 years and, accordingly, the age groups with the largest number of victims were the 20-29 and 30-39 year-old groups, which in combination represented 34 % of all fatalities. It appears that the overall natural hazard mortality rate in Switzerland over the past 70 years has been relatively low in comparison to rates in other countries or rates of other types of fatal accidents in Switzerland. However, a large variability in mortality rates was observed within the country with considerably higher rates in Alpine environments.

  8. Orographic influence on storm damage to forests in mountain areas by the example of windstorm 'Lothar'

    NASA Astrophysics Data System (ADS)

    Schmoeckel, J.; Kottmeier, Ch.

    2003-04-01

    The extraordinarily strong storm 'Lothar' on December 26, 1999 caused large damage in the forests of France, Switzerland and Germany. In Germany, especially the Black Forest (Schwarzwald) was affected. In this contribution, an empirical analysis of the storm damage in the northern Black Forest is given. The aim is to derive the orographic influence on the wind field from the damage pattern. The damage was recorded approximately 5 months after the disaster by an airborne survey with a digital line scanner. From these data, highly resolved, georeferenced distributions of the vegetation index are calculated (2 m x 2 m pixel size). The damaged forest areas appear with a lower vegetation index than areas with intact vegetation. Demarcation between damaged forest areas and populated or differently used areas is provided by a land-use model. Mapping of the storm damage and its combination with a digital elevation model and land-use data is performed in a GIS. It is shown that the damage pattern is significantly affected by orographic factors. Large damage occurred, e.g., at saddles between single mountains, on mountain flanks facing north and northwest, and on the windward (west) flanks of extended mountain ridges. Little damage is found in areas that presumably were protected against the wind, i.e. on the leeward (eastern) mountain flanks, in dells and niches, as well as in valleys perpendicular to the mean west to southwest winds. To explain the spatially complex distribution of damage more fully, an analysis is made in which characteristics of the forest and of the soil are taken into account. The knowledge gained can be valuable for future afforestation in mountain areas to stabilize forests against severe storms.
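
    Vegetation indices of the kind used here are typically band ratios of near-infrared and red reflectance; the most widely used, the NDVI, is defined as (standard remote-sensing definition; the abstract does not state which index was computed)

        \mathrm{NDVI} = \frac{\rho_{\mathrm{NIR}} - \rho_{\mathrm{RED}}}{\rho_{\mathrm{NIR}} + \rho_{\mathrm{RED}}},

    so windthrown stands, with little green canopy left, show markedly lower values than intact forest.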

  9. Building the framework for climate change adaptation in the urban areas using participatory approach: the Czech Republic experience

    NASA Astrophysics Data System (ADS)

    Emmer, Adam; Hubatová, Marie; Lupač, Miroslav; Pondělíček, Michael; Šafařík, Miroslav; Šilhánková, Vladimíra; Vačkář, David

    2016-04-01

    The Czech Republic has experienced numerous extreme hydrometeorological/climatological events such as floods (significant ones in 1997, 2002, 2010, 2013), droughts (2013, 2015), heat waves (2015) and windstorms (2007) during past decades. These events are generally attributed to ongoing climate change and have caused loss of lives and significant material damage (up to several % of GDP in some years), especially in urban areas. To initiate the adaptation process in urban areas, the main objective was to prepare a framework for creating climate change adaptation strategies for individual cities, reflecting the physical-geographical and socioeconomic conditions of the Czech Republic. Three pilot cities (Hradec Králové, Žďár nad Sázavou, Dobruška) were used to optimize the entire procedure. Two sets of participatory seminars were organised in order to involve all key stakeholders (the city council, department of the environment, department of crisis management, hydrometeorological institute, local experts, ...) in the process of creating the adaptation strategy from its early stage. Lessons learned for the framework were related especially to its applicability at the local level, which is largely a matter of the understandability of the concept. Finally, this illustrative and widely applicable framework (the so-called 'road map to adaptation strategy') includes five steps: (i) analysis of existing strategies and plans at national, regional and local levels; (ii) analysing climate-change-related hazards and key vulnerabilities; (iii) identification of adaptation needs, evaluation of existing adaptation capacity and formulation of future adaptation priorities; (iv) identification of limits and barriers to adaptation (economic, environmental, ...); and (v) selection of specific types of adaptation measures reflecting the identified adaptation needs and formulated adaptation priorities. Keywords: climate change adaptation (CCA); urban areas; participatory approach; road map

  10. Wet trend continues for lakes

    NASA Astrophysics Data System (ADS)

    Katzoff, Judith A.

    About 20% of the United States, including the regions of the Great Lakes and the Great Salt Lake, has entered a fourth year of record and near-record streamflow and lake levels, according to the U.S. Geological Survey (USGS). From June 3 until June 8, 1986, the Great Salt Lake stood at 1283.77 m above sea level, 0.076 m above the previous record, which was set in 1873. (Records have been kept for the lake since 1847.) On June 8, a dike south of the lake gave way during a windstorm, causing flooding of evaporation ponds used for mineral recovery. As a result of the breach, the lake's level dropped to 1283.65 m above sea level by June 10 but rose to 1283.68 m by June 20. The latest official reading, made on June 30, showed that the lake's level had dropped to 1283.63 m above sea level. According to Tom Ross, chief of the Current Water Conditions Group at the USGS National Center in Reston, Va., this drop represents “a normal seasonal decline brought on by evaporation.”

  11. Assessing the short-term effects of an extreme storm on Mediterranean forest raptors

    NASA Astrophysics Data System (ADS)

    Martínez, José E.; Jiménez-Franco, María V.; Zuberogoitia, Iñigo; León-Ortega, Mario; Calvo, José F.

    2013-04-01

    Different species show different responses to natural disturbances, depending on their capacity to exploit the altered environment and occupy new niches. In the case of semi-arid Mediterranean areas, there is no information available on the response of bird communities to disturbance caused by extreme weather events. Here, we evaluate the short-term effects of a heavy snowfall and strong winds on three long-lived species of forest-dwelling raptor in a semi-arid Mediterranean region situated in the south-east of Spain. The loss of nests was significantly higher in the first and second years following the disturbance than in the third year. The three species studied exhibited great tolerance to the short-term effects of the storm since we found no differences in density or reproductive parameters between the nine breeding seasons prior to the disturbance and the three which immediately followed it. We suggest that the tolerance shown by these three species to windstorms in semi-arid Mediterranean zones could be an adaptive response, resulting from the climatic and human pressures which have prevailed from the Bronze Age to the present day.

  12. Evaluating the Benefits of Adaptation of Critical Infrastructures to Hydrometeorological Risks.

    PubMed

    Thacker, Scott; Kelly, Scott; Pant, Raghav; Hall, Jim W

    2018-01-01

    Infrastructure adaptation measures provide a practical way to reduce the risk from extreme hydrometeorological hazards, such as floods and windstorms. The benefit of adapting infrastructure assets is evaluated as the reduction in risk relative to the "do nothing" case. However, evaluating the full benefits of risk reduction is challenging because of the complexity of the systems, the scarcity of data, and the uncertainty of future climatic changes. We address this challenge by integrating methods from the study of climate adaptation, infrastructure systems, and complex networks. In doing so, we outline an infrastructure risk assessment that incorporates interdependence, user demands, and potential failure-related economic losses. Individual infrastructure assets are intersected with probabilistic hazard maps to calculate expected annual damages. Protection measure costs are integrated to calculate risk reduction and associated discounted benefits, which are used to explore the business case for investment in adaptation. A demonstration of the methodology is provided for flood protection of major electricity substations in England and Wales. We conclude that the ongoing adaptation program for major electricity assets is highly cost beneficial. © 2017 Society for Risk Analysis.
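
    The risk-reduction logic described above can be illustrated with a minimal sketch: expected annual damage (EAD) is obtained by integrating a loss-probability curve, and the discounted difference between the "do nothing" and adapted EADs gives the benefit used in a benefit-cost ratio. All numbers, the discount rate, the horizon and the helper names below are hypothetical, not values from the study.

    ```python
    import numpy as np

    def expected_annual_damage(return_periods, damages):
        """Integrate the loss-probability curve (trapezoidal rule) to get EAD."""
        p = 1.0 / np.asarray(return_periods, dtype=float)   # annual exceedance probabilities
        d = np.asarray(damages, dtype=float)
        order = np.argsort(p)
        return np.trapz(d[order], p[order])

    def discounted_benefit(ead_baseline, ead_adapted, rate=0.035, horizon=50):
        """Present value of the annual risk reduction over the appraisal horizon."""
        annual_benefit = ead_baseline - ead_adapted
        years = np.arange(1, horizon + 1)
        return np.sum(annual_benefit / (1.0 + rate) ** years)

    # Illustrative flood-protection example for a single substation
    rp = [10, 50, 100, 500, 1000]                 # return periods (years)
    dmg_no_action = [0.0, 2.0, 5.0, 12.0, 20.0]   # losses in million GBP (hypothetical)
    dmg_protected = [0.0, 0.0, 1.0, 6.0, 12.0]    # losses after raising flood defences

    ead0 = expected_annual_damage(rp, dmg_no_action)
    ead1 = expected_annual_damage(rp, dmg_protected)
    benefit = discounted_benefit(ead0, ead1)
    cost = 3.0                                    # hypothetical capital cost, million GBP
    print(f"EAD baseline {ead0:.2f}, adapted {ead1:.2f}, BCR {benefit / cost:.1f}")
    ```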

  13. Documentary evidence of economic character as a source for the study of hydrometeorological extremes

    NASA Astrophysics Data System (ADS)

    Chromá, K.; Brázdil, R.; Valášek, H.

    2009-04-01

    Various human activities, such as agriculture, forestry and water management, have always been influenced by climate variability and hydrometeorological extremes. For this reason, historical economic records often include information about contemporaneous weather as well as descriptions of its impacts. This study deals with the interpretation of hydrometeorological extremes for the territory of Moravia (eastern part of the Czech Republic) derived from taxation records and reports of domain and estate administrators. The information obtained reflects the occurrence of floods, convective storms (including hailstorms), windstorms, and late spring and early autumn frosts. Based on data from eight domains or estates, frequency series of floods and convective storms (including hailstorms) were compiled for the period 1650-1849. A detailed analysis of the disastrous weather event of 10 August 1694 in the Pernštejn domain is used to demonstrate the potential of such data for the study of hydrometeorological extremes and their impacts on human activity. Another example is the analysis of data on tax reductions granted for hailstorm damage to agricultural crops in Moravia in the period 1896-1906.

  14. Protection forest resilience after a fire event: a case study in Vallis, Switzerland

    NASA Astrophysics Data System (ADS)

    Vergani, Chiara; Werlen, Mario; Schwarz, Massimiliano

    2016-04-01

    Forests are well known to protect against natural hazards such as landslides, rockfall and floods. Nevertheless, they are dynamic ecosystems exposed to a variety of disturbances such as windstorms, fires, and bark beetle and pathogen outbreaks. Catastrophic disturbances like windstorms and fires usually remove large portions of the canopy, starting a succession process which leads to complete stand regeneration. Disturbances belong to the natural dynamics of forests; however, they are highly undesirable where forests protect infrastructure or settlements. Quantifying the decay and recovery of the protective effect of forests after disturbances is therefore important to evaluate risks and implement appropriate management techniques, when needed. This work analyzes the dynamics of a Scots pine (Pinus sylvestris) protection forest near Visp (Vallis) after a fire event, focusing on root reinforcement, which is the key factor in preventing shallow landslides. Forest cover, root distribution and root mechanical properties were analyzed 4 years after the fire event, and the root reinforcement was quantified. Furthermore, the contribution of natural regeneration was evaluated. Results show that the root reinforcement of Scots pine has declined massively in the forest fire area. At a distance of 1.5 m from the tree stem there is a reduction of 60% compared with the live stand. With increasing distance from the stem, the reduction in reinforcement is even larger: at a distance of 2.5 m only 12%, and at 3.5 m only 5%, of the original root reinforcement remains. This decrease is due to the decomposition of roots and the associated change in the mechanical properties of the wood. The reinforcement of the dead roots in the forest area is estimated between 0.36 kPa and 2.64 kPa. The contribution of the emerging regeneration is estimated at 0.01 kPa on average. Overall the stand provides a reinforcement between 0.37 kPa and 2.65 kPa. From the results it can be concluded that the dying roots can still provide a certain root reinforcement; however, the contribution of regeneration is too small to compensate for the continuously decreasing protective effect in the coming years. The time a forest needs to return to its initial state therefore plays a decisive role in counteracting the formation of landslides, which after a forest fire can be triggered by smaller precipitation events. The results obtained now need to be implemented in slope stability analyses to compare the protective effect of vegetation before and after the disturbance. This work provides a first framework to evaluate the efficiency of protection forests before and after a catastrophic event, in order to support risk evaluation and plan possible management actions.
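
    To make the reported distance dependence concrete, the sketch below fits a simple exponential decay to the remaining-reinforcement fractions quoted in the abstract (about 40%, 12% and 5% at 1.5, 2.5 and 3.5 m from the stem). The exponential form and the fitted curve are an illustrative assumption, not the authors' root-reinforcement model.

    ```python
    import numpy as np

    # Remaining fraction of live-stand root reinforcement at three distances from the
    # stem, four years after the fire (values taken from the abstract above).
    distance_m = np.array([1.5, 2.5, 3.5])
    remaining = np.array([0.40, 0.12, 0.05])

    # Fit r(d) = r0 * exp(-k * d) by linear regression on log(r); the exponential
    # form is assumed here purely for illustration.
    slope, intercept = np.polyfit(distance_m, np.log(remaining), 1)
    k, r0 = -slope, np.exp(intercept)

    def remaining_fraction(d):
        return r0 * np.exp(-k * d)

    for d in (1.0, 2.0, 3.0, 4.0):
        print(f"{d:.1f} m from stem: ~{remaining_fraction(d) * 100:.0f}% of live-stand reinforcement")
    ```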

  15. Modeling, design, and testing of a proof-of-concept prototype damper with friction and eddy current damping effects

    NASA Astrophysics Data System (ADS)

    Amjadian, Mohsen; Agrawal, Anil K.

    2018-01-01

    Friction is considered as one of the most reliable mechanisms of energy dissipation that has been utilized extensively in passive damping devices to mitigate vibration of civil engineering structures subjected to extreme natural hazards such as earthquakes and windstorms. However, passive friction dampers are well-known for having a highly nonlinear hysteretic behavior caused by stick-slip motion at low velocities, a phenomenon that is inherent in friction and increases the acceleration response of the structure under control unfavorably. The authors have recently proposed the theoretical concept of a new type of damping device termed as "Passive Electromagnetic Eddy Current Friction Damper" (PEMECFD) in which an eddy current damping mechanism was utilized not only to decrease the undesirable effects of stick-slip motion, but also to increase the energy dissipation capacity of the damping device as a whole. That study was focused on demonstration of the theoretical performance of the proposed damping device through numerical simulations. This paper further investigates the influence of eddy current damping on energy dissipation due to friction through modeling, design, and testing of a proof-of-concept prototype damper. The design of this damper has been improved over the design in the previous study. The normal force in this damper is produced by the repulsive magnetic force between two cuboidal permanent magnets (PMs) magnetized in the direction normal to the direction of the motion. The eddy current damping force is generated because of the motion of the two PMs and two additional PMs relative to a copper plate in their vicinity. The dynamic models for the force-displacement relationship of the prototype damper are based on LuGre friction model, electromagnetic theory, and inertial effects of the prototype damper. The parameters of the dynamic models have been identified through a series of characterization tests on the prototype damper under harmonic excitations of different frequencies in the laboratory. Finally, the identified dynamic models have been validated by subjecting the prototype damper to two different random excitations. The results indicate that the proposed dynamic models are capable of representing force-displacement behavior of the new type of passive damping device for a wide range of operating conditions.
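
    A minimal sketch of the kind of force model described above is given below: a LuGre friction element integrated with forward Euler, plus an equivalent viscous term standing in for the eddy-current contribution, evaluated under harmonic motion. All parameter values, including the eddy-current coefficient, are invented for illustration and are not the identified parameters of the prototype damper.

    ```python
    import numpy as np

    # Illustrative parameters (hypothetical, not the identified values from the paper)
    sigma0, sigma1, sigma2 = 1.0e5, 300.0, 2.0   # bristle stiffness, micro-damping, viscous term
    Fc, Fs, vs = 40.0, 60.0, 0.01                # Coulomb force (N), stiction force (N), Stribeck velocity (m/s)
    c_eddy = 150.0                               # equivalent eddy-current damping coefficient (N s/m)

    def g(v):
        """Stribeck curve g(v): velocity-dependent friction level used in the bristle state equation."""
        return Fc + (Fs - Fc) * np.exp(-(v / vs) ** 2)

    def damper_force(v, dt):
        """Integrate the LuGre bristle state z (forward Euler) and add eddy-current damping."""
        z, force = 0.0, np.zeros_like(v)
        for i, vi in enumerate(v):
            zdot = vi - sigma0 * abs(vi) * z / g(vi)
            z += zdot * dt
            force[i] = sigma0 * z + sigma1 * zdot + sigma2 * vi + c_eddy * vi
        return force

    # Harmonic excitation: 1 Hz, 10 mm amplitude; plotting force against x gives the hysteresis loop
    dt = 1e-4
    t = np.arange(0.0, 3.0, dt)
    x = 0.01 * np.sin(2 * np.pi * t)
    v = 0.01 * 2 * np.pi * np.cos(2 * np.pi * t)
    print(f"peak damper force ~{damper_force(v, dt).max():.1f} N")
    ```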

  16. The 2007 southern California wildfires: Lessons in complexity

    USGS Publications Warehouse

    Keeley, J.E.; Safford, H.; Fotheringham, C.J.; Franklin, J.; Moritz, M.

    2009-01-01

    The 2007 wildfire season in southern California burned over 1,000,000 ac (400,000 ha) and included several megafires. We use the 2007 fires as a case study to draw three major lessons about wildfires and wildfire complexity in southern California. First, the great majority of large fires in southern California occur in the autumn under the influence of Santa Ana windstorms. These fires also cost the most to contain and cause the most damage to life and property, and the October 2007 fires were no exception: thousands of homes were lost and seven people were killed. Pushed by wind gusts of over 100 kph, the 2007 fires met little resistance from young fuels as they reburned considerable portions of the area burned in the historic 2003 fire season. Adding to the size of these fires was the historic 2006-2007 drought that contributed to high dead fuel loads and long distance spotting. As in 2003, young chaparral stands and fuel treatments were not reliable barriers to fire in October 2007. Second, the Zaca Fire in July and August 2007 showed that other factors besides high winds can sometimes combine to create conditions for large fires in southern California. Spring and summer fires in southern California chaparral are usually easily contained because of higher fuel moisture and the general lack of high winds. However, the Zaca Fire burned in a remote wilderness area of rugged terrain that made access difficult. In addition, because of its remoteness, anthropogenic ignitions had been low and stand age and fuel loads were high. Coupled with this was the severe drought that year, which generated fuel moisture levels considerably below normal for early summer. A third lesson comes from 2007 conifer forest fires in the southern California mountains. In contrast to lower elevation chaparral, fire suppression has led to major increases in conifer forest fuels that can lead to unnaturally severe fires when ignitions escape control. The Slide and Grass Valley Fires of October 2007 occurred in forests that had been subject to extensive fuel treatment, but fire control was complicated by a patchwork of untreated private properties and mountain homes built of highly flammable materials. In a fashion reminiscent of other recent destructive conifer fires in California, burning homes themselves were a major source of fire spread. These lessons suggest that the most important advances in fire safety in this region are likely to come from advances in fire prevention, fire preparedness, and land-use planning that includes fire hazard patterns.

  17. Predicting the Texas Windstorm Insurance Association claim payout of commercial buildings from Hurricane Ike

    NASA Astrophysics Data System (ADS)

    Kim, J. M.; Woods, P. K.; Park, Y. J.; Son, K.

    2013-08-01

    Following growing public awareness of the danger from hurricanes and tremendous demands for analysis of loss, many researchers have conducted studies to develop hurricane damage analysis methods. Although researchers have identified the significant indicators, there currently is no comprehensive research for identifying the relationship among the vulnerabilities, natural disasters, and economic losses associated with individual buildings. To address this lack of research, this study will identify vulnerabilities and hurricane indicators, develop metrics to measure the influence of economic losses from hurricanes, and visualize the spatial distribution of vulnerability to evaluate overall hurricane damage. This paper has utilized the Geographic Information System to facilitate collecting and managing data, and has combined vulnerability factors to assess the financial losses suffered by Texas coastal counties. A multiple linear regression method has been applied to develop hurricane economic damage predicting models. To reflect the pecuniary loss, insured loss payment was used as the dependent variable to predict the actual financial damage. Geographical vulnerability indicators, built environment vulnerability indicators, and hurricane indicators were all used as independent variables. Accordingly, the models and findings may possibly provide vital references for government agencies, emergency planners, and insurance companies hoping to predict hurricane damage.
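
    A minimal sketch of the multiple linear regression step is shown below, with a hypothetical design matrix of geographical, built-environment and hurricane indicators and insured payouts as the response. The variables, values and ordinary-least-squares fit are illustrative only and do not reproduce the paper's model.

    ```python
    import numpy as np

    # Hypothetical rows: distance to coast (km), insured value (M$), year built, peak gust (m/s)
    X = np.array([
        [ 2.0, 1.5, 1985, 45.0],
        [ 8.0, 3.2, 1999, 41.0],
        [ 1.0, 0.9, 1975, 49.0],
        [15.0, 2.4, 2005, 35.0],
        [ 4.0, 5.0, 1990, 44.0],
        [ 6.0, 2.0, 1982, 43.0],
    ])
    y = np.array([310.0, 120.0, 420.0, 20.0, 280.0, 190.0])  # insured loss payout (k$), invented

    # Ordinary least squares with an intercept column
    A = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    print("intercept and coefficients:", np.round(coef, 3))

    # Predict the payout for a new exposure (leading 1.0 is the intercept term)
    new_building = np.array([1.0, 3.0, 2.1, 1988, 47.0])
    print("predicted payout (k$):", round(float(new_building @ coef), 1))
    ```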

  18. High resolution climate projection of storm surge at the Venetian coast

    NASA Astrophysics Data System (ADS)

    Mel, R.; Sterl, A.; Lionello, P.

    2013-04-01

    Climate change impact on storm surge regime is of great importance for the safety and maintenance of Venice. In this study a future storm surge scenario is evaluated using new high resolution sea level pressure and wind data recently produced by EC-Earth, an Earth System Model based on the operational seasonal forecast system of the European Centre for Medium-Range Weather Forecasts (ECMWF). The study considers an ensemble of six 5-yr-long simulations of the rcp45 scenario at 0.25° resolution and compares the 2094-2098 period to the 2004-2008 period. EC-Earth sea level pressure and surface wind fields are used as input for a shallow water hydrodynamic model (HYPSE) which computes sea level and barotropic currents in the Adriatic Sea. Results show that a high resolution climate model is needed for producing realistic values of storm surge statistics and confirm previous studies in that they show little sensitivity of storm surge levels to climate change. However, some climate change signals are detected, such as increased persistence of high pressure conditions, an increased frequency of windless hours, and a decreased number of moderate windstorms.
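
    For orientation only, the back-of-envelope sketch below combines the two dominant barotropic surge contributions (inverse-barometer effect and steady wind set-up along a shallow basin); it is not the HYPSE hydrodynamic model used in the study, and the drag coefficient, effective depth and fetch are rough illustrative assumptions.

    ```python
    # Two dominant barotropic surge components; rough constants, not the HYPSE model
    RHO_W, RHO_A, G, CD = 1025.0, 1.25, 9.81, 1.5e-3   # water/air density, gravity, drag coefficient

    def inverse_barometer(p_hpa, p_ref_hpa=1013.25):
        """Sea-level rise (m) due to low atmospheric pressure: roughly 1 cm per hPa."""
        return (p_ref_hpa - p_hpa) * 100.0 / (RHO_W * G)

    def wind_setup(u10, fetch_m, depth_m):
        """Steady-state set-up (m) from wind stress acting along a basin of given fetch and depth."""
        tau = RHO_A * CD * u10 ** 2
        return tau * fetch_m / (RHO_W * G * depth_m)

    # Sirocco-type event: 990 hPa low, 20 m/s wind along the ~750 km Adriatic (effective depth ~250 m)
    surge = inverse_barometer(990.0) + wind_setup(20.0, 750e3, 250.0)
    print(f"estimated surge contribution ~{surge:.2f} m")
    ```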

  19. Predictability of windstorm Klaus; sensitivity to PV perturbations

    NASA Astrophysics Data System (ADS)

    Arbogast, P.; Maynard, K.

    2010-09-01

    It appears that some short-range weather forecast failures may be attributed to initial-condition errors. In some cases it is possible to anticipate the behavior of the model by comparing observations with model analyses. In the case of extratropical cyclone development, the representation of the upper-level precursors described in terms of PV in the initial conditions can be assessed by comparison with either satellite ozone or water-vapour data. A step forward has been made by developing a tool based upon manual modifications of the dynamical tropopause (i.e. the height of the 1.5 PV unit surface) and PV inversion. After five years of experimentation it turns out that forecasters can indeed improve the forecast of some strong cyclone developments. However, the present approach is subjective per se. To measure the subjectivity of the procedure, a set of 15 experiments provided by 7 different people (senior forecasters and scientists involved in dynamical meteorology) has been performed, each aiming to improve an initial state of the global model ARPEGE that led to a poor forecast of the windstorm Klaus (24 January 2009). This experiment reveals that the manually defined corrections share common features but also show a large spread.

  20. Proximity sounding analysis for derechos and supercells: an assessment of similarities and differences

    NASA Astrophysics Data System (ADS)

    Doswell, Charles A.; Evans, Jeffry S.

    Proximity soundings (within 2 h and 167 km) of derechos (long-lived, widespread damaging convective windstorms) and supercells have been obtained. More than 65 derechos, accompanied by 115 proximity soundings, are identified during the years 1983 to 1993. The derechos have been divided into categories according to the synoptic situation: strong forcing (SF), weak forcing (WF), and "hybrid" cases (which are neither weakly nor strongly forced). Nearly 100 supercell proximity soundings have been found for the period 1998 to 2001, subdivided into nontornadic and tornadic supercells; tornadic supercells were further subdivided into those producing significant (>F1 rating) tornadoes and weak tornadoes (F0-F1 rating). WF derecho situations typically are characterized by warm, moist soundings with large convective available potential instability (CAPE) and relatively weak vertical wind shear. SF derechos usually have stronger wind shears, and cooler and less moist soundings with lower CAPE than the weakly forced cases. Most derechos exhibit strong storm-relative inflow at low levels. In WF derechos, this is usually the result of rapid convective system movement, whereas in SF derechos, storm-relative inflow at low levels is heavily influenced by relatively strong low-level windspeeds. "Hybrid" cases collectively are similar to an average of the SF and WF cases. Supercells occur in environments that are not all that dissimilar from those that produce SF derechos. It appears that some parameter combining instability and deep layer shear, such as the Energy-Helicity Index (EHI), can help discriminate between tornadic and nontornadic supercell situations. Soundings with significant tornadoes (F2 and greater) typically show high 0-1 km relative humidities, and strong 0-1 km shear. Results suggest it may not be easy to forecast the mode of severe thunderstorm activity (i.e., derecho versus supercell) on any particular day, given conditions that favor severe thunderstorm activity in general. It is possible that the convective initiation mechanism is an important factor, with linear initiation favoring derechos, whereas nonlinear forcing might favor supercells. Upper-level storm-relative flow in supercells tends to be rear-to-front, whereas for derechos, storm-relative flow tends to be front-to-rear through a deep surface-based layer. However, knowing the storm-relative hodograph requires knowledge of storm motion, which can be a challenge to predict. These results generally imply that probabilistic forecasts of convective mode could be a successful strategy.
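
    The Energy-Helicity Index mentioned above combines CAPE and storm-relative helicity; a common formulation is EHI = (CAPE × SRH) / 160,000. The sketch below evaluates it for a few invented sounding values (not the study's dataset) to show how the index separates high-CAPE/low-shear from high-helicity environments.

    ```python
    def energy_helicity_index(cape, srh):
        """EHI = (CAPE * SRH) / 160000, CAPE in J/kg, storm-relative helicity in m^2/s^2."""
        return cape * srh / 1.6e5

    # Invented sounding values, only to show how the index behaves
    soundings = {
        "weak-forcing derecho":   {"cape": 3500.0, "srh": 80.0},
        "strong-forcing derecho": {"cape": 1200.0, "srh": 250.0},
        "tornadic supercell":     {"cape": 2500.0, "srh": 350.0},
    }
    for name, s in soundings.items():
        print(f"{name:24s} EHI = {energy_helicity_index(s['cape'], s['srh']):.2f}")
    ```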

  1. Investigation of polar mesocyclones in Arctic Ocean using COSMO-CLM and WRF numerical models and remote sensing data

    NASA Astrophysics Data System (ADS)

    Varentsov, Mikhail; Verezemskaya, Polina; Baranyuk, Anastasia; Zabolotskikh, Elizaveta; Repina, Irina

    2015-04-01

    Polar lows (PL), high-latitude marine mesoscale cyclones, are an enigmatic atmospheric phenomenon that can cause windstorm damage to shipping and infrastructure at high latitudes. Because of their small spatial scales, short lifetimes and their tendency to develop in remote, data-sparse regions (Zahn and von Storch, 2008), our knowledge of their behavior and climatology lags behind that of synoptic-scale cyclones. With continuing global warming (IPCC, 2013) and the prospect of intensified economic activity and marine traffic in the Arctic region, adequate simulation of this phenomenon by numerical models of the atmosphere, which could be used for weather and climate prediction, is especially important. The focus of this paper is the ability of two modern nonhydrostatic mesoscale numerical models, driven by realistic lateral boundary conditions from the ERA-Interim reanalysis, to simulate polar lows: the regional climate model COSMO-CLM (Böhm et al., 2006) and the Weather Research and Forecasting (WRF) model. Fields of wind, pressure and cloudiness simulated by the models were compared with remote sensing data and surface meteorological observations for several cases in which polar lows were observed over the Norwegian, Kara and Laptev seas. Several types of satellite data were used: atmospheric water vapor, cloud liquid water content and surface wind fields were retrieved from AMSR-E and AMSR2 microwave radiometer data (Aqua, GCOM-W1), and wind fields were additionally extracted from the QuikSCAT scatterometer. Infrared and visible images of cloud cover were obtained from MODIS (Aqua). The comparison showed that the COSMO-CLM and WRF models can successfully reproduce the evolution of polar lows and their most important characteristics, such as size and wind speed, in short experiments with WRF and in longer (up to half-year) experiments with COSMO-CLM. The improvement in the representation of polar lows by these models relative to the source reanalysis fields was also investigated. References: 1. Böhm, U., et al. (2006): CLM - the climate version of LM: brief description and long-term applications. COSMO Newsletter, 6. 2. IPCC (2013): Fifth Assessment Report: Climate Change 2013 (AR5). Cambridge University Press, Cambridge, United Kingdom and New York, NY, USA. 3. Zahn, M., and H. von Storch (2008): A long-term climatology of North Atlantic polar lows. Geophys. Res. Lett., 35, L22702.

  2. The Vulnerability of People to Damaging Hydrogeological Events in the Calabria Region (Southern Italy)

    PubMed Central

    Salvati, Paola; Aceto, Luigi; Bianchi, Cinzia; Pasqua, Angela Aurora; Guzzetti, Fausto

    2017-01-01

    Background: Damaging Hydrogeological Events (DHEs) are severe weather periods during which floods, landslides, lightning, windstorms, hail or storm surges can harm people. Climate change is expected to increase the frequency/intensity of DHEs and, consequently, the potential harm to people. Method: We investigated the impacts of DHEs on people in Calabria (Italy) over 37 years (1980–2016). Data on 7288 people physically affected by DHEs were gathered from the systematic analysis of regional newspapers and collected in the database named PEOPLE. The damage was codified in three severity levels as follows: fatalities (people who were killed), injured (people who suffered physical harm) and involved (people who were present at the place where an accident occurred but survived and were not harmed). During the study period, we recorded 68 fatalities, 566 injured and 6654 people involved in the events. Results: Males were more frequently killed, injured and involved than females, and females who suffered fatalities were older than males who suffered fatalities, perhaps indicating that younger females tended to be more cautious than same-aged males, while older females showed an intrinsic greater vulnerability. Involved people were younger than injured people and fatalities, suggesting that younger people show greater promptness in reacting to dangerous situations. Floods caused the majority of the fatalities, injured and involved people, followed by landslides. Lightning was the most dangerous phenomenon, and it affected a relatively low number of people, killing 11.63% of them and causing injuries to 37.2%. Fatalities and injuries mainly occurred outdoors, largely along roads. In contrast, people indoors, essentially in public or private buildings, were more frequently involved without suffering harm. Being “dragged by water/mud” and “surrounded by water/mud”, respectively, represented the two extremes of dynamic dangerousness. The dragging effect of rapid-flowing water totally or partially obstructed the attempts of people to save their lives. In contrast, people surrounded by steady water/mud encountered difficulties but ultimately could survive. Conclusions: The study outcomes can be used in informational campaigns to increase risk awareness among both administrators and citizens and to improve community resilience, particularly in promoting self-protective behaviors and avoiding the underestimation of hazardous situations. PMID:29286338

  3. A Mediterranean derecho: Catalonia (Spain), 17th August 2003

    NASA Astrophysics Data System (ADS)

    López, J. Manuel

    2007-02-01

    At approximately 6:10 UTC on the morning of 17th August 2003, a squall line developed over southern Catalonia (the northeast region of Spain). During the next 9 h, the squall moved rapidly northeast and crossed Catalonia and the French regions of Languedoc-Roussillon and Provence, damaging and uprooting hundreds of trees and blocking trains in the region. Wind gusts of up to 52 m/s were recorded, with evidence of F2-intensity damage. This case study shows the characteristics of a derecho (a widespread convectively induced windstorm). Radar observations of the evolving squall line show signatures often correlated with damaging surface winds, including: bow echoes, rear inflow notches, rear inflow jets, mid-altitude radial convergence, a narrow gradient of very marked reflectivity, the development of isolated cells ahead of the convective line, and a band of convection off the northern end of the line known as a "warm advection wing". When the different surface observations, satellite and radar imagery and cloud-to-ground lightning data are examined, this case shows many similarities to those investigated in the United States. The derecho is a hybrid case, but has many characteristics of warm-season derechos. It emanated from a mesoscale convective complex (MCC) moving along a quasi-stationary, low-level thermal boundary in an environment characterized by high potential instability and relatively strong mid-tropospheric winds.

  4. A synoptic climatology of derecho producing mesoscale convective systems in the North-Central Plains

    NASA Astrophysics Data System (ADS)

    Bentley, Mace L.; Mote, Thomas L.; Byrd, Stephen F.

    2000-09-01

    Synoptic-scale environments favourable for producing derechos, or widespread convectively induced windstorms, in the North-Central Plains are examined with the goal of providing pattern-recognition/diagnosis techniques. Fifteen derechos were identified across the North-Central Plains region during 1986-1995. The synoptic environment at the initiation, mid-point and decay of each derecho was then evaluated using surface, upper-air and National Center for Atmospheric Research (NCAR)/National Centers for Environmental Prediction (NCEP) reanalysis datasets. Results suggest that the synoptic environment is critical in maintaining derecho producing mesoscale convective systems (DMCSs). The synoptic environment in place downstream of the MCS initiation region determines the movement and potential strength of the system. Circulation around surface low pressure increased the instability gradient and maximized leading edge convergence in the initiation region of nearly all events regardless of DMCS location or movement. Other commonalities in the environments of these events include the presence of a weak thermal boundary, high convective instability and a layer of dry low-to-mid-tropospheric air. Of the two corridors sampled, northeastward moving derechos tend to initiate east of synoptic-scale troughs, while southeastward moving derechos form on the northeast periphery of a synoptic-scale ridge. Other differences between these two types of DMCS events are also discussed.

  5. RiskScape Volcano: Development of a risk assessment tool for volcanic hazards

    NASA Astrophysics Data System (ADS)

    Deligne, Natalia; King, Andrew; Jolly, Gill; Wilson, Grant; Wilson, Tom; Lindsay, Jan

    2013-04-01

    RiskScape is a multi-hazard risk assessment tool developed by GNS Science and the National Institute of Water and Atmospheric Research Ltd. (NIWA) in New Zealand that models the risk and impact of various natural hazards on a given built environment. RiskScape has a modular structure: the hazard module models hazard exposure (e.g., ash thickness at a given location), the asset module catalogues assets (built environment, infrastructure, and people) and their attributes exposed to the hazard, and the vulnerability module models the consequences of asset exposure to the hazard. Hazards presently included in RiskScape are earthquakes, river floods, tsunamis, windstorms, and ash from volcanic eruptions (specifically from Ruapehu). Here we present our framework for incorporating other volcanic hazards (e.g., pyroclastic density currents, lava flows, lahars, ground deformation) into RiskScape along with our approach for assessing asset vulnerability. We will also discuss the challenges of evaluating risk for 'point source' (e.g., stratovolcanoes) vs 'diffuse' (e.g., volcanic fields) volcanism using Ruapehu and the Auckland volcanic field as examples. Once operational, RiskScape Volcano will be a valuable resource both in New Zealand and internationally as a practical tool for evaluating risk and as an example of how to predict the consequences of volcanic eruptions on both rural and urban environments.
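
    The modular structure described above (hazard, asset and vulnerability modules combined into a loss estimate) can be sketched as below; the module functions, ash thicknesses and fragility numbers are invented placeholders, not RiskScape's actual implementation or data.

    ```python
    from dataclasses import dataclass
    from typing import Callable, Dict, List

    @dataclass
    class Asset:
        name: str
        value: float       # replacement value
        location: str

    def ash_hazard(location: str) -> float:
        """Hazard module: ash thickness (mm) at a location for one eruption scenario (invented values)."""
        scenario = {"Ohakune": 50.0, "Taupo": 10.0, "Auckland": 0.0}
        return scenario.get(location, 0.0)

    def roof_damage_ratio(ash_mm: float) -> float:
        """Vulnerability module: fraction of asset value lost as a function of ash load (invented fragility)."""
        if ash_mm <= 0.0:
            return 0.0
        if ash_mm < 20.0:
            return 0.05
        if ash_mm < 100.0:
            return 0.3
        return 0.8

    def run_risk(assets: List[Asset],
                 hazard: Callable[[str], float],
                 vulnerability: Callable[[float], float]) -> Dict[str, float]:
        """Combine hazard, asset and vulnerability modules into per-asset loss estimates."""
        return {a.name: a.value * vulnerability(hazard(a.location)) for a in assets}

    portfolio = [Asset("school", 4.0e6, "Ohakune"), Asset("warehouse", 2.0e6, "Taupo")]
    print(run_risk(portfolio, ash_hazard, roof_damage_ratio))
    ```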

  6. From cyclone tracks to the costs of European winter storms: A probabilistic loss assessment model

    NASA Astrophysics Data System (ADS)

    Renggli, Dominik; Corti, Thierry; Reese, Stefan; Wueest, Marc; Viktor, Elisabeth; Zimmerli, Peter

    2014-05-01

    The quantitative assessment of the potential losses of European winter storms is essential for the economic viability of a global reinsurance company. For this purpose, reinsurance companies generally use probabilistic loss assessment models. This work presents an innovative approach to develop physically meaningful probabilistic events for Swiss Re's new European winter storm loss model. The meteorological hazard component of the new model is based on cyclone and windstorm tracks identified in the 20th Century Reanalysis data. The knowledge of the evolution of winter storms both in time and space allows the physically meaningful perturbation of properties of historical events (e.g. track, intensity). The perturbation includes a random element but also takes the local climatology and the evolution of the historical event into account. The low-resolution wind footprints taken from 20th Century Reanalysis are processed by a statistical-dynamical downscaling to generate high-resolution footprints of the historical and probabilistic winter storm events. Downscaling transfer functions are generated using ENSEMBLES regional climate model data. The result is a set of reliable probabilistic events representing thousands of years. The event set is then combined with country- and risk-specific vulnerability functions and detailed market- or client-specific exposure information to compute (re-)insurance risk premiums.
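
    Conceptually, the final step chains a probabilistic event set through vulnerability and exposure to annual losses and a premium, as in the sketch below; the gust distribution, damage function, loading factor and all numbers are illustrative assumptions rather than elements of Swiss Re's model.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Probabilistic event set: each simulated windstorm carries a peak-gust value at one site
    # (the real model uses gridded, downscaled footprints for a whole portfolio).
    n_years, events_per_year = 10_000, 2
    gusts = rng.weibull(2.0, size=(n_years, events_per_year)) * 18.0   # m/s, illustrative

    def damage_ratio(gust):
        """Mean damage ratio as a function of peak gust; shape and 25 m/s threshold are invented."""
        return np.where(gust > 25.0, np.clip(((gust - 25.0) / 45.0) ** 2, 0.0, 1.0), 0.0)

    exposure_value = 50e6                                # insured value at the site (EUR)
    annual_loss = (damage_ratio(gusts) * exposure_value).sum(axis=1)

    aal = annual_loss.mean()                             # average annual loss
    pml_200 = np.quantile(annual_loss, 1.0 - 1.0 / 200)  # 1-in-200-year annual loss
    premium = 1.2 * aal                                  # 20% loading, purely illustrative
    print(f"AAL {aal:,.0f} EUR, 200-yr loss {pml_200:,.0f} EUR, indicative premium {premium:,.0f} EUR")
    ```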

  7. Airflow analyses using thermal imaging in Arizona's Meteor Crater as part of METCRAX II

    NASA Astrophysics Data System (ADS)

    Grudzielanek, A. Martina; Vogt, Roland; Cermak, Jan; Maric, Mateja; Feigenwinter, Iris; Whiteman, C. David; Lehner, Manuela; Hoch, Sebastian W.; Krauß, Matthias G.; Bernhofer, Christian; Pitacco, Andrea

    2016-04-01

    In October 2013 the second Meteor Crater Experiment (METCRAX II) took place at the Barringer Meteorite Crater (aka Meteor Crater) in north central Arizona, USA. Downslope-windstorm-type flows (DWF), the main research objective of METCRAX II, were measured by a comprehensive set of meteorological sensors deployed in and around the crater. During two weeks of METCRAX II five infrared (IR) time lapse cameras (VarioCAM® hr research & VarioCAM® High Definition, InfraTec) were installed at various locations on the crater rim to record high-resolution images of the surface temperatures within the crater from different viewpoints. Changes of surface temperature are indicative of air temperature changes induced by flow dynamics inside the crater, including the DWF. By correlating thermal IR surface temperature data with meteorological sensor data during intensive observational periods the applicability of the IR method of representing flow dynamics can be assessed. We present evaluation results and draw conclusions relative to the application of this method for observing air flow dynamics in the crater. In addition we show the potential of the IR method for METCRAX II in 1) visualizing airflow processes to improve understanding of these flows, and 2) analyzing cold-air flows and cold-air pooling.

  8. Objective Tracking of Tropical Cyclones in the North-West Pacific Basin Based on Wind Field Information only

    NASA Astrophysics Data System (ADS)

    Leckebusch, G. C.; Befort, D. J.; Kruschke, T.

    2016-12-01

    Although only ca. 12% of global insured losses from natural disasters occurred in Asia, there are two major reasons to be concerned about risks in Asia: a) the fraction of loss events was substantially higher, at 39%, of which 94% were due to atmospheric processes; b) Asia, and especially China, is undergoing rapid transitions, and the insurance market in particular is growing quickly. In order to allow for the estimation of potential future (loss) impacts in East Asia, in this study we further developed a feature tracking system based on extreme wind speed occurrences, originally developed for extra-tropical cyclones (Leckebusch et al., 2008), and applied it to tropical cyclones. In principle, wind fields are identified and tracked once a coherent exceedance of local percentile thresholds is found. The focus on severe wind impact allows an objective link between the strength of a cyclone and its potential damages over land. The wind tracking is designed in such a way as to be applicable also to coarse-gridded AOGCM simulations. In the presented configuration the wind tracking algorithm is applied to the Japanese reanalysis (JRA-55) and TC identification is based on 850 hPa wind speeds (6-h resolution) from 1979 to 2014 over the Western North Pacific region. For validation the IBTrACS Best Track archive version v03r8 is used. Out of all 904 observed tracks, about 62% can be matched to at least one windstorm event identified in JRA-55. It is found that the relative amount of matched best tracks increases with maximum intensity. Thus, a hit rate of above 98% for Violent Typhoons (VTY), above 90% for Very Strong Typhoons (VSTY), about 75% for Typhoons (TY), and still some 50% for less intense TCs (TD, TS, STS) is found. This result is highly encouraging for applying this technique to AOGCM outputs and deriving information about affected regions and intensity-frequency distributions that may change under future climate conditions.
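
    The validation step reported above amounts to matching best-track positions against identified windstorm events within space and time tolerances and computing a hit rate. The sketch below does this for toy data; the 300 km / 6 h tolerances and the per-fix (rather than per-track) matching are simplifying assumptions, not the study's exact criteria.

    ```python
    import numpy as np

    def hit_rate(best_tracks, wind_events, max_dist_km=300.0, max_dt_h=6.0):
        """Fraction of best-track fixes matched by at least one identified wind event.

        best_tracks, wind_events: iterables of (time_h, lat, lon); tolerances are illustrative.
        """
        def dist_km(lat1, lon1, lat2, lon2):
            # Equirectangular approximation; adequate for a coarse matching radius
            x = np.radians(lon2 - lon1) * np.cos(np.radians(0.5 * (lat1 + lat2)))
            y = np.radians(lat2 - lat1)
            return 6371.0 * np.hypot(x, y)

        hits = 0
        for t, la, lo in best_tracks:
            close = [e for e in wind_events
                     if abs(e[0] - t) <= max_dt_h and dist_km(la, lo, e[1], e[2]) <= max_dist_km]
            hits += bool(close)
        return hits / len(best_tracks)

    # Toy example: three IBTrACS-like fixes and two identified windstorm centres
    bt = [(0.0, 20.0, 130.0), (6.0, 21.0, 128.5), (12.0, 22.0, 127.0)]
    ev = [(0.0, 20.2, 129.8), (12.0, 22.3, 126.8)]
    print(f"hit rate: {hit_rate(bt, ev):.2f}")
    ```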

  9. Is tropospheric weather influenced by solar wind through atmospheric vertical coupling downward control?

    NASA Astrophysics Data System (ADS)

    Prikryl, Paul; Tsukijihara, Takumi; Iwao, Koki; Muldrew, Donald B.; Bruntz, Robert; Rušin, Vojto; Rybanský, Milan; Turňa, Maroš; Šťastný, Pavel; Pastirčák, Vladimír

    2017-04-01

    More than four decades have passed since a link between solar wind magnetic sector boundary structure and mid-latitude upper tropospheric vorticity was discovered (Wilcox et al., Science, 180, 185-186, 1973). The link has been later confirmed and various physical mechanisms proposed but apart from controversy, little attention has been drawn to these results. To further emphasize their importance we investigate the occurrence of mid-latitude severe weather in the context of solar wind coupling to the magnetosphere-ionosphere-atmosphere (MIA) system. It is observed that significant snowstorms, windstorms and heavy rain, particularly if caused by low pressure systems in winter, tend to follow arrivals of high-speed solar wind. Previously published statistical evidence that explosive extratropical cyclones in the northern hemisphere tend to occur after arrivals of high-speed solar wind streams from coronal holes (Prikryl et al., Ann. Geophys., 27, 1-30, 2009; Prikryl et al., J. Atmos. Sol.-Terr. Phys., 149, 219-231, 2016) is corroborated for the southern hemisphere. A physical mechanism to explain these observations is proposed. The leading edge of high-speed solar wind streams is a locus of large-amplitude magneto-hydrodynamic waves that modulate Joule heating and/or Lorentz forcing of the high-latitude lower thermosphere generating medium-scale atmospheric gravity waves that propagate upward and downward through the atmosphere. Simulations of gravity wave propagation in a model atmosphere using the Transfer Function Model (Mayr et al., Space Sci. Rev., 54, 297-375, 1990) show that propagating waves originating in the thermosphere can excite a spectrum of gravity waves in the lower atmosphere. In spite of significantly reduced amplitudes but subject to amplification upon reflection in the upper troposphere, these gravity waves can provide a lift of unstable air to release instabilities in the troposphere thus initiating convection to form cloud/precipitation bands (Prikryl et al., Ann. Geophys., 27, 31-57, 2009). It is primarily the energy provided by release of latent heat that leads to intensification of storms. These results indicate that vertical coupling in the atmosphere exerts downward control from solar wind to the lower atmospheric levels influencing tropospheric weather development.

  10. Response of Urban Systems to Climate Change in Europe: Heat Stress Exposure and the Effect on Human Health

    NASA Astrophysics Data System (ADS)

    Stevens, Catherine; Thomas, Bart; Grommen, Mart

    2015-04-01

    Climate change is driven by global processes such as the global ocean circulation and its variability over time, leading to changing weather patterns on regional scales as well as changes in the severity and occurrence of extreme events such as heavy rainstorms and windstorms, floods, drought, heat waves, etc. The 2003 European heat wave was the hottest summer on record in Europe for centuries; it led to health crises in several countries, such as France, and caused up to 70,000 excess deaths over four months in Central and Western Europe. The main risks induced by global climate change in urbanised areas are considered to be overheating and the resulting health effects, increased exposure to flood events, increased damage losses from extreme weather conditions, but also shortages in the provision of life-sustaining services. Moreover, the cities themselves create specific or inherent risks, and urban adaptation is often very demanding. As most of Europe's inhabitants live in cities, it is of particular relevance to examine the impact of climate variability on urban areas and their populations. The present study focuses on the identification of heat stress variables related to human health and the extraction of this information by processing daily temperature statistics of local urban climate simulations over multiple timeframes of 20 years and three different European cities, based on recent, near-future and far-future global climate predictions. The analyses have been conducted in the framework of the NACLIM FP7 project funded by the European Commission, involving local stakeholders such as the cities of Antwerp (Belgium), Berlin (Germany) and Almada (Portugal), which represent different climate and urban characteristics. Apart from the urban-rural temperature increment (urban heat island effect), additional heat stress parameters such as the average number of heat wave days together with their duration and intensities have been covered during this research. In a subsequent step, the heat stress variables are superposed on relevant socio-economic datasets targeting total population and its distribution per age class as well as vulnerable institutions such as hospitals, schools, rest homes and child/day care facilities in order to generate heat stress exposure maps for each use case city and various climate, urban planning and mitigation scenarios. The specifications and requirements for the various scenarios have been consolidated in close collaboration with the local stakeholders during dedicated end-user workshops. The results of this study will help urban planners and policy makers face the challenges of climate change and develop sound strategies for evolving towards sustainable and climate-resilient cities.

  11. Weather Support for the 2002 Winter Olympic and Paralympic Games.

    NASA Astrophysics Data System (ADS)

    Horel, J.; Potter, T.; Dunn, L.; Steenburgh, W. J.; Eubank, M.; Splitt, M.; Onton, D. J.

    2002-02-01

    The 2002 Winter Olympic and Paralympic Games will be hosted by Salt Lake City, Utah, during February-March 2002. Adverse weather during this period may delay sporting events, while snow and ice-covered streets and highways may impede access by the athletes and spectators to the venues. While winter snowstorms and other large-scale weather systems typically have widespread impacts throughout northern Utah, hazardous winter weather is often related to local terrain features (the Wasatch Mountains and Great Salt Lake are the most prominent ones). Examples of such hazardous weather include lake-effect snowstorms, ice fog, gap winds, downslope windstorms, and low visibility over mountain passes. A weather support system has been developed to provide weather information to the athletes, games officials, spectators, and the interested public around the world. This system is managed by the Salt Lake Olympic Committee and relies upon meteorologists from the public, private, and academic sectors of the atmospheric science community. Weather forecasting duties will be led by National Weather Service forecasters and a team of private weather forecasters organized by KSL, the Salt Lake City NBC television affiliate. Other government agencies, commercial firms, and the University of Utah are providing specialized forecasts and support services for the Olympics. The weather support system developed for the 2002 Winter Olympics is expected to provide long-term benefits to the public through improved understanding, monitoring, and prediction of winter weather in the Intermountain West.

  12. Increasing impacts of climate extremes on critical infrastructures in Europe

    NASA Astrophysics Data System (ADS)

    Forzieri, Giovanni; Bianchi, Alessandra; Feyen, Luc; Silva, Filipe Batista e.; Marin, Mario; Lavalle, Carlo; Leblois, Antoine

    2016-04-01

    The projected increases in exposure to multiple climate hazards in many regions of Europe emphasize the relevance of a multi-hazard risk assessment to comprehensively quantify potential impacts of climate change and develop suitable adaptation strategies. In this context, quantifying the future impacts of climatic extremes on critical infrastructures is crucial due to their key role for human wellbeing and their effects on the overall economy. Critical infrastructures comprise the assets and systems that are essential for the maintenance of vital societal functions, health, safety, security, and the economic or social well-being of people, and whose disruption or destruction would have a significant impact as a result of the failure to maintain those functions. We assess the direct damages of heat and cold waves, river and coastal flooding, droughts, wildfires and windstorms to energy, transport, industry and social infrastructures in Europe over the 21st century. The methodology integrates climate hazard, exposure and vulnerability components in a coherent framework. Overall damage is expected to rise to 38 billion €/yr, a ten-fold increase over current climate damage, with drastic variations across risk scenarios. For example, drought- and heat-related damages could represent 70% of the overall climate damage in the 2080s, versus the current 12%. Many regions, most prominently Southern Europe, will likely suffer multiple stresses and systematic infrastructure failures due to climate extremes if no suitable adaptation measures are taken.

  13. From Cyclone Tracks to the Costs of European Winter Storms: A Probabilistic Loss Assessment Model

    NASA Astrophysics Data System (ADS)

    Orwig, K.; Renggli, D.; Corti, T.; Reese, S.; Wueest, M.; Viktor, E.; Zimmerli, P.

    2014-12-01

    European winter storms cause billions of dollars of insured losses every year. Therefore, it is essential to understand potential impacts of future events, and the role reinsurance can play to mitigate the losses. The authors will present an overview of natural catastrophe risk assessment modeling in the reinsurance industry, and the development of a new innovative approach for modeling the risk associated with European winter storms. The new approach includes the development of physically meaningful probabilistic (i.e. simulated) events for European winter storm loss assessment. The meteorological hazard component of the new model is based on cyclone and windstorm tracks identified in the 20th Century Reanalysis data. The knowledge of the evolution of winter storms both in time and space allows the physically meaningful perturbation of historical event properties (e.g. track, intensity, etc.). The perturbation includes a random element but also takes the local climatology and the evolution of the historical event into account. The low-resolution wind footprints taken from the 20th Century Reanalysis are processed by a statistical-dynamical downscaling to generate high-resolution footprints for both the simulated and historical events. Downscaling transfer functions are generated using ENSEMBLES regional climate model data. The result is a set of reliable probabilistic events representing thousands of years. The event set is then combined with country- and site-specific vulnerability functions and detailed market- or client-specific information to compute annual expected losses.

  14. Stand structure and dynamics of sand pine differ between the Florida panhandle and peninsula

    USGS Publications Warehouse

    Drewa, P.B.; Platt, W.J.; Kwit, C.; Doyle, T.W.

    2008-01-01

    Size and age structures of stand populations of numerous tree species exhibit uneven or reverse J-distributions that can persist after non-catastrophic disturbance, especially windstorms. Among disjunct populations of conspecific trees, alternative distributions are also possible and may be attributed to more localized variation in disturbance. Regional differences in structure and demography among disjunct populations of sand pine (Pinus clausa (Chapm. ex Engelm.) Vasey ex Sarg.) in the Florida panhandle and peninsula may result from variation in hurricane regimes associated with each of these populations. We measured size, age, and growth rates of trees from panhandle and peninsula populations and then compiled size and age class distributions. We also characterized hurricanes in both regions over the past century. Size and age structures of panhandle populations were unevenly distributed and exhibited continuous recruitment; peninsula populations were evenly sized and aged and exhibited only periodic recruitment. Since hurricane regimes were similar between regions, historical fire regimes may have been responsible for regional differences in structure of sand pine populations. We hypothesize that fires were locally nonexistent in coastal panhandle populations, while periodic high intensity fires occurred in peninsula populations over the past century. Such differences in local fire regimes could have resulted in the absence of hurricane effects in the peninsula. Increased intensity of hurricanes in the panhandle and current fire suppression patterns in the peninsula may shift characteristics of sand pine stands in both regions. © 2007 Springer Science+Business Media B.V.

  15. Development and validation of hydroacoustic monitoring concepts for the coastal German Bight (SE North Sea)

    NASA Astrophysics Data System (ADS)

    Mielck, Finn; Hass, H. Christian; Holler, Peter; Bartholomä, Alexander; Neumann, Andreas; Kröncke, Ingrid; Reimers, Hans-Christian; Capperucci, Ruggero

    2016-04-01

    The joint research project WIMO (Wissenschaftliche Monitoringkonzepte für die Deutsche Bucht/Scientific Monitoring Concepts for the German Bight, SE North Sea) aims at providing methods for detection and analysis of seabed habitats using modern remote sensing techniques. Our subproject focuses on hydroacoustic techniques in order to gain information about seafloor environments and sediment dynamics. In a timeframe of four years, several key areas in the German Bight were repeatedly observed using different hydroacoustic gear (i.e. sidescan sonars, single/multibeam echo sounders and sub-bottom profilers). In order to ground-truth the acoustic data, hundreds of grab samples and underwater videos were taken. With these techniques it is possible to distinguish between different seafloor habitats, which range from muddy to sandy seafloors (esp. near the barrier islands) to rugged or vegetated/populated reefs around Helgoland. The conducted monitoring program revealed seasonal changes regarding the abundance of the sand mason worm (Lanice conchilega) and the brittle star (Amphiura filiformis) as well as ongoing sedimentary processes driven by tidal currents, wind and storms. It was also possible to determine relationships between sediment characteristics and benthos in some key areas. An essential part of our project included a comparison between the datasets obtained with different hydroacoustic devices, configurations, and evaluation methods in the same study areas. The investigation reveals that there could be distinct differences in interpreting the data and hence in the determination of prevailing seafloor habitats, especially in very heterogeneous areas and at transition zones between the habitats. Therefore, it is recommended to employ more than one hydroacoustic system (preferably a singlebeam device combined with a wide-swath sonar system) synchronously during a survey in order to gain more reliable and detailed information about the seafloor environments. The results of this project study form an important contribution to ongoing and future projects, in particular with regard to the technical configuration of the sonar systems, the workflows concerning post-processing and validation of the hydroacoustic data as well as the monitoring concepts that were worked out. However, a full automation of these workflows is not feasible. For the time being, measurements, post-processing and data evaluation still need supervision and expert knowledge.

  16. Using norm-referenced tests to determine severity of language impairment in children: disconnect between U.S. policy makers and test developers.

    PubMed

    Spaulding, Tammie J; Swartwout Szulga, Margaret; Figueroa, Cecilia

    2012-04-01

    The purpose of this study was to identify various U.S. state education departments' criteria for determining the severity of language impairment in children, with particular focus on the use of norm-referenced tests. A secondary objective was to determine if norm-referenced tests of child language were developed for the purpose of identifying the severity of children's language impairment. Published procedures for severity determinations were obtained from U.S. state education departments. In addition, manuals for 45 norm-referenced tests of child language were reviewed to determine if each test was designed to identify the degree of a child's language impairment. Consistency was evaluated among state criteria, test developers' intentions, and test characteristics. At the time of this study, 8 states published guidelines for determining the severity of language impairment, and each specified the use of norm-referenced tests for this purpose. The degree of use and cutoff-point criteria for severity determination varied across states. No cutoff-point criteria aligned with the severity cutoff points described within the test manuals. Furthermore, tests that included severity information lacked empirical data on how the severity categories were derived. Researchers and clinicians should be cautious in determining the severity of children's language impairment using norm-referenced test performance given the inconsistency in guidelines and lack of empirical data within test manuals to support this use.

  17. Effects of sudden air pressure changes on hospital admissions for cardiovascular diseases in Prague, 1994-2009

    NASA Astrophysics Data System (ADS)

    Plavcová, Eva; Kyselý, Jan

    2014-08-01

    Sudden weather changes have long been thought to be associated with negative impacts on human health, but relatively few studies have attempted to quantify these relationships. We use large 6-h changes in atmospheric pressure as a proxy for sudden weather changes and evaluate their association with hospital admissions for cardiovascular diseases (CVD). Winter and summer seasons and positive and negative pressure changes are analysed separately, using data for the city of Prague (population 1.2 million) over a 16-year period (1994-2009). We found that sudden pressure drops in winter are associated with significant rise in hospital admissions. Increased CVD morbidity was observed neither for pressure drops in summer nor pressure increases in any season. Analysis of synoptic weather maps shows that large pressure drops in winter are associated with strong zonal flow and rapidly moving low-pressure systems with centres over northern Europe and atmospheric fronts affecting western and central Europe. Analysis of links between passages of strong atmospheric fronts and hospital admissions, however, shows that the links disappear if weather changes are characterised by frontal passages. Sudden pressure drops in winter are associated also with significant excess CVD mortality. As climate models project strengthening of zonal circulation in winter and increased frequency of windstorms, the negative effects of such weather phenomena and their possible changes in a warmer climate of the twenty-first century need to be better understood, particularly as their importance in inducing excess morbidity and mortality in winter may increase compared to cold spells.
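
    A minimal sketch of the exposure definition used above (large 6-h station-pressure changes as a proxy for sudden weather changes) is given below; the synthetic pressure series and the ±8 hPa flag threshold are illustrative assumptions, not the study's criteria.

    ```python
    import numpy as np
    import pandas as pd

    # Toy hourly station-pressure series; in practice this would be the Prague record
    rng = np.random.default_rng(1)
    idx = pd.date_range("1994-01-01", periods=24 * 30, freq="h")
    pressure = pd.Series(1013 + np.cumsum(rng.normal(0, 0.6, len(idx))), index=idx)

    dp6 = pressure.diff(6)                 # change over the preceding 6 hours (hPa)
    sudden_drop = dp6 < -8.0               # illustrative threshold for a "sudden" drop
    sudden_rise = dp6 > 8.0

    # Daily flags that could be joined to daily hospital-admission counts
    daily_drop_days = sudden_drop.resample("D").any()
    print(f"days with a large 6-h pressure drop: {int(daily_drop_days.sum())}")
    ```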

  18. Correlating regional natural hazards for global reinsurance risk assessment

    NASA Astrophysics Data System (ADS)

    Steptoe, Hamish; Maynard, Trevor; Economou, Theo; Fox, Helen; Wallace, Emily; Maisey, Paul

    2016-04-01

    Concurrent natural hazards represent an uncertainty in assessing exposure for the insurance industry. The recently implemented Solvency II Directive requires EU insurance companies to fully understand and justify their capital reserving and portfolio decisions. Lloyd's, the London insurance and reinsurance market, commissioned the Met Office to investigate the dependencies between different global extreme weather events (known to the industry as perils), and the mechanisms for these dependencies, with the aim of helping them assess their compound risk to the exposure of multiple simultaneous hazards. In this work, we base the analysis of hazard-to-hazard dependency on the interaction of different modes of global and regional climate variability. Lloyd's defined 16 key hazard regions, including Australian wildfires, flooding in China and EU windstorms, and we investigate the impact of 10 key climate modes on these areas. We develop a statistical model that facilitates rapid risk assessment whilst allowing for both temporal auto-correlation and, crucially, interdependencies between drivers. The simulator itself is built conditionally using autoregressive regression models for each driver conditional on the others. Whilst the baseline assumption within the (re)insurance industry is that different natural hazards are independent of each other, the assumption of independence of meteorological risks requires greater justification. Although our results suggest that most of the 120 hazard-hazard connections considered are likely to be independent of each other, 13 have significant dependence arising from one or more global modes of climate variability. This allows us to create a matrix of linkages describing the hazard dependency structure that Lloyd's can use to inform their understanding of risk.
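
    A toy version of such a conditional simulator is sketched below: two climate-mode drivers evolve as cross-dependent AR(1) processes and a regional hazard count is drawn conditional on them. The coefficients, the Poisson link and the driver labels are invented for illustration and do not represent the Lloyd's/Met Office model.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Two climate-mode drivers, each regressed on both lagged drivers (cross-dependent AR(1))
    phi = np.array([[0.5, 0.1],
                    [0.05, 0.7]])
    noise_sd = np.array([0.8, 0.6])

    def simulate_drivers(n_years):
        x = np.zeros((n_years, 2))
        for t in range(1, n_years):
            x[t] = phi @ x[t - 1] + rng.normal(0.0, noise_sd)
        return x

    def eu_windstorm_counts(drivers):
        """Poisson hazard model: windstorm frequency increases with a positive first mode."""
        rate = np.exp(0.3 + 0.4 * drivers[:, 0])
        return rng.poisson(rate)

    x = simulate_drivers(10_000)
    counts = eu_windstorm_counts(x)
    print(f"mean annual count {counts.mean():.2f}, correlation with mode 1 "
          f"{np.corrcoef(x[:, 0], counts)[0, 1]:.2f}")
    ```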

  19. Escalating impacts of climate extremes on critical infrastructures in Europe.

    PubMed

    Forzieri, Giovanni; Bianchi, Alessandra; Silva, Filipe Batista E; Marin Herrera, Mario A; Leblois, Antoine; Lavalle, Carlo; Aerts, Jeroen C J H; Feyen, Luc

    2018-01-01

    Extreme climatic events are likely to become more frequent owing to global warming. This may put additional stress on critical infrastructures with typically long life spans. However, little is known about the risks posed by multiple climate extremes to critical infrastructures at regional to continental scales. Here we show how single- and multi-hazard damage to energy, transport, industrial, and social critical infrastructures in Europe is likely to develop until the year 2100 under the influence of climate change. We combine a set of high-resolution climate hazard projections, a detailed representation of physical assets in various sectors and their sensitivity to the hazards, and more than 1100 records of losses from climate extremes in a prognostic modelling framework. We find that damages could triple by the 2020s, multiply six-fold by mid-century, and amount to more than 10 times the present damage of €3.4 billion per year by the end of the century due only to climate change. Damage from heatwaves, droughts in southern Europe, and coastal floods shows the most dramatic rise, but the risks of inland flooding, windstorms, and forest fires will also increase in Europe, with varying degrees of change across regions. Economic losses are highest for the industry, transport, and energy sectors. Future losses will not be incurred equally across Europe. Southern and south-eastern European countries will be most affected and, as a result, will probably require higher costs of adaptation. The findings of this study could aid in prioritizing regional investments to address the unequal burden of impacts and differences in adaptation capacities across Europe.

  20. Dirty Snowballs and Magic Carpets: an Ontology of Geophysical Disturbance

    NASA Astrophysics Data System (ADS)

    Grant, G. E.; Lancaster, S.; O'Connor, J.; Lewis, S.

    2002-12-01

    Geologists tend to think about landscape-transforming events as "processes" while ecologists tend to view them as "disturbances". In either case, understanding the dynamics of such events is key to interpreting their effects on landforms and ecosystems. Although volcanic eruptions, meteorological and dam break floods, fires, windstorms, and other high-energy events have different origins, internal driving mechanisms, frequencies, and durations, and operate in different types of landscape settings, they share common attributes. Perhaps most importantly, they all represent transformations of energy from one form to another. In some cases the energy of an event generally increases as it propagates through a landscape, primarily through the addition of mass and momentum; examples of these "dirty snowballs" include the initiation and runout phases of volcanic lahars, avalanches, and debris flows. Explosive forest fires can also be viewed as snowballs, in the sense that the heat they generate results in convection that increases their temperatures and rates of movement. In other cases, abstraction of both mass and momentum from a moving body or fluid causes the energy of an event to dissipate with distance, similar to the unwinding of a rug; examples of these "magic carpets" include dam-break floods from a variety of origins, and the depositional phases of lahars and debris flows. Both snowballs and carpets leave distinctive imprints or tracks on the landscape and ecosystems in the form of scour and depositional features, patterns of vegetation disturbance, and rates of subsequent geomorphic or ecosystem recovery. Understanding which processes will snowball and which will unravel is key to determining both their ecosystem impacts and potential risks to humans.

  1. Flooding and subsidence in the Thames Gateway: impact on insurance loss potential

    NASA Astrophysics Data System (ADS)

    Royse, Katherine; Horn, Diane; Eldridge, Jillian; Barker, Karen

    2010-05-01

    In the UK, household buildings insurance generally covers loss and damage to the insured property from a range of natural and human perils, including windstorm, flood, subsidence, theft, accidental fire and winter freeze. Consequently, insurers require a reasoned view on the likely scale of losses that they may face to assist in strategic planning, reinsurance structuring, regulatory returns and general risk management. The UK summer 2007 flood events not only provided a clear indication of the scale of potential losses that the industry could face from an individual event, with £3 billion in claims, but also identified a need for insurers and reinsurers to better understand how events may correlate in time and space, and how to most effectively use the computational models of extreme events that are commonly applied to reflect these correlations. In addition to the potential for temporal clustering of events such as windstorms and floods, there is a possibility that seemingly uncorrelated natural perils, such as floods and subsidence, may impact an insurer's portfolio. Where aggregations of large numbers of new properties are planned, such as in the Thames Gateway, consideration of the potential future risk of aggregate losses due to the combination of perils such as subsidence and flood is increasingly important within the insurance company's strategic risk management process. Whilst perils such as subsidence and flooding are generally considered independent within risk modelling, the potential for one event to influence the magnitude and likelihood of the other should be taken into account when determining risk level. In addition, the impact of correlated, but distinct, loss-causing events on particular property types may be significant, particularly if a specific property is designed to protect against one peril but is potentially susceptible to another. We suggest that flood events can lead to increased subsidence risk due to the weight of additional water and sediment, or rehydration of sediment under flood water. The latter mechanism may be particularly critical on sites where Holocene sediments are currently protected from flooding and are no longer subsiding. Holocene deposits tend to compress, either under their own weight or under a superimposed load such as made ground, built structures or flood water. If protected dry sediments become flooded in the future, subsidence would be expected to resume. This research project aims to investigate the correlation between flood hazards and subsidence hazards and the effect that these two sources of risk will have on insurance losses in the Thames Gateway. In particular, the research will explore the potential hydrological and geophysical drivers and links between flood and subsidence events within the Thames Gateway, assessing the potential for significant event occurrence within the timescales relevant to insurers. In the first part of the project we have identified areas within the Thames Gateway development zone which have a high risk of flooding and may be affected by renewed or increased subsidence. This has been achieved through the use of national and local-scale 2D and 3D geo-environmental information such as the Geosure dataset (e.g. swell-shrink, collapsible and compressible deposits data layers), PSI data, thickness of superficial and artificial land deposits, and flood potential data.
In the second stage of the project we will investigate the hydrological and geophysical links between flooding and subsidence events on developed sites; quantify the insurance loss potential in the Thames Gateway from correlated flooding and subsidence events; consider how climate change will affect risk to developments in the Thames Gateway in the context of subsidence and flooding; and develop new ways of communicating and visualising correlated flood and subsidence risk to a range of stakeholders, including the insurance industry, planners, policy makers and the general public.

  2. MiKlip-PRODEF: Probabilistic Decadal Forecast for Central and Western Europe

    NASA Astrophysics Data System (ADS)

    Reyers, Mark; Haas, Rabea; Ludwig, Patrick; Pinto, Joaquim

    2013-04-01

    The demand for skilful climate predictions on time scales of several years to decades has increased in recent years, in particular in economic, societal and political terms. Within the BMBF MiKlip consortium, a decadal prediction system on the global to local scale is currently being developed. The subproject PRODEF is part of MiKlip Module C, which aims at the regionalisation of decadal predictability for Central and Western Europe. In PRODEF, a combined statistical-dynamical downscaling (SDD) and a probabilistic forecast tool are developed and applied to the new Earth system model of the Max Planck Institute for Meteorology in Hamburg (MPI-ESM), which is part of the CMIP5 experiment. Focus is given to the decadal predictability of windstorms, related wind gusts and wind energy potentials. SDD combines the benefits of both high-resolution dynamical downscaling and purely statistical downscaling of GCM output. Hence, the SDD approach is used to obtain a very large ensemble of highly resolved decadal forecasts. With respect to the focal points of PRODEF, a clustering of temporally evolving atmospheric fields, a circulation weather type (CWT) analysis, and an analysis of storm damage indices are applied to the full ensemble of the decadal hindcast experiments of the MPI-ESM in its lower resolution (MPI-ESM-LR). The ensemble consists of up to ten realisations per yearly initialised decadal hindcast experiment for the period 1960-2010 (altogether 287 realisations). Representatives of CWTs / clusters and single storm episodes are dynamically downscaled with the regional climate model COSMO-CLM at a horizontal resolution of 0.22°. For each model grid point, the distributions of the local climate parameters (e.g. surface wind gusts) are determined for different periods (e.g. each decade) by recombining the dynamically downscaled episodes, weighted with the respective weather type frequencies. The applicability of the SDD approach is illustrated with examples of decadal forecasts of the MPI-ESM. We are able to perform a bias correction of the frequencies of large-scale weather types and to quantify the uncertainties of decadal predictability on large and local scales arising from different initial conditions. Further, probability density functions of local parameters such as wind gusts for different periods and decades derived from the SDD approach are compared to observations and reanalysis data. Skill scores are used to quantify the decadal predictability for different lead times and to analyse whether the SDD approach shows systematic errors for some regions.
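
    To make the recombination step of the SDD approach concrete, the sketch below rebuilds a local wind-gust distribution for one period by resampling gust samples from downscaled weather-type representatives in proportion to that period's weather-type frequencies. The type names, gust samples and frequencies are invented placeholders, not PRODEF output.

    ```python
    # Sketch of the SDD recombination idea: a per-gridpoint gust distribution is
    # rebuilt by weighting downscaled weather-type representatives with the
    # weather-type frequencies of a given decade.  All numbers are placeholders.
    import numpy as np

    rng = np.random.default_rng(1)

    # Downscaled gust samples (m/s) for three circulation weather types at one grid point.
    gusts_per_cwt = {
        "W":  rng.weibull(2.0, 500) * 18.0,   # westerly: stronger gusts
        "NW": rng.weibull(2.0, 500) * 14.0,
        "AC": rng.weibull(2.0, 500) * 8.0,    # anticyclonic: weak gusts
    }

    # Hypothetical weather-type frequencies for one hindcast decade.
    freq = {"W": 0.45, "NW": 0.30, "AC": 0.25}

    # Recombine: resample each weather type proportionally to its frequency.
    n_total = 10_000
    recombined = np.concatenate([
        rng.choice(gusts_per_cwt[cwt], size=int(round(f * n_total)), replace=True)
        for cwt, f in freq.items()
    ])

    print("98th percentile gust (m/s):", round(np.percentile(recombined, 98), 1))
    ```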

  3. The new Met Office strategy for seasonal forecasts

    NASA Astrophysics Data System (ADS)

    Hewson, T. D.

    2012-04-01

    In October 2011 the Met Office began issuing a new-format UK seasonal forecast, called "The 3-month Outlook". Government interest in a UK-relevant product had been heightened by infrastructure issues arising during the severe cold of previous winters. At the same time there was evidence that the Met Office's "GLOSEA4" long-range forecasting system exhibited some hindcast skill for the UK that was comparable to its hindcast skill for the larger (and therefore less useful) 'northern Europe' region. Also, the NAO and AO signals prevailing in the previous two winters had been highlighted by the GLOSEA4 model well in advance. This presentation will initially give a brief overview of GLOSEA4, describing key features such as evolving sea-ice, a well-resolved stratosphere, and the perturbation strategy. Skill measures will be shown, along with forecasts for the last 3 winters. The new-format 3-month outlook will then be described and presented. Previously, our seasonal forecasts had been based on a tercile approach. The new-format outlook aims to substantially improve upon this by illustrating graphically, and with text, the full range of possible outcomes, and by placing those outcomes in the context of climatology. In one key component the forecast pdfs (probability density functions) are displayed alongside climatological pdfs. To generate the forecast pdf we take the bias-corrected GLOSEA4 output (42 members), and then incorporate, via an expert team, all other relevant information. Firstly, model forecasts from other centres are examined. Then external 'forcing factors', such as solar activity and the state of the land-ocean-ice system, are referenced, assessing how well the models represent their influence, and bringing in statistical relationships where appropriate. The expert team thereby decides upon any changes to the GLOSEA4 data, employing an interactive tool to shift, expand or contract the forecast pdfs accordingly. The full modification process will be illustrated during the presentation. Another key component of the 3-month outlook is the focus it places on potential hazards and impacts. To date specific references have been made to snow and ice disruption, to replenishment expectations for regions suffering water-supply shortages, and to windstorm frequency. This aspect will be discussed, also showing some subjective verification. In the future we hope to extend the 3-month outlook framework to other parts of the world, notably Africa, a region where the Met Office, with DfID support, is working collaboratively to improve real-time long-range forecasts. Brief reference will also be made to such activities.
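
    A minimal sketch of the pdf-adjustment step described above: starting from a bias-corrected ensemble, the forecast pdf is shifted and widened (or narrowed) before being compared with the climatological pdf. The Gaussian assumption, the adjustment factors and the example numbers are illustrative, not the Met Office's actual tool.

    ```python
    # Sketch: shift/expand a forecast pdf and compare it with climatology.
    # The Gaussian form and the adjustment factors are illustrative assumptions.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    ensemble = rng.normal(loc=4.2, scale=1.1, size=42)   # e.g. 42 bias-corrected members (deg C)

    clim_mean, clim_sd = 3.5, 1.5                        # climatological pdf parameters

    # Hypothetical expert adjustment: shift the forecast mean and expand its spread.
    shift, spread_factor = -0.3, 1.2
    fc_mean = ensemble.mean() + shift
    fc_sd = ensemble.std(ddof=1) * spread_factor

    # Probability of exceeding the climatological upper tercile under the forecast pdf.
    upper_tercile = stats.norm.ppf(2 / 3, clim_mean, clim_sd)
    p_above = 1 - stats.norm.cdf(upper_tercile, fc_mean, fc_sd)
    print("P(above climatological upper tercile):", round(p_above, 2))
    ```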

  4. Integrating 3D LiDAR data into the dike surveillance protocol: the French case

    NASA Astrophysics Data System (ADS)

    Bretar, F.; Mériaux, P.; Fauchard, C.

    2012-04-01

    The recent and dramatic floods of the last years in Europe (e.g. the major Rhône river flood of December 2003 and Windstorm Xynthia of February 2010, both in France) and in the United States (Hurricane Katrina, August 2005) showed the vulnerability of flood and coastal defence systems. The first key to avoiding such dramatic damage and the high cost of a failure and its consequences lies in the appropriate design and construction of the dikes, but above all in the relevance of the surveillance protocol. Many factors introduce weaknesses in fluvial or maritime dikes. Most of them are old embankment structures; for instance, some of the French Loire River dikes were built several centuries ago. They may have been rebuilt, modified or heightened several times, with materials that do not necessarily match the original conception of the structure. Moreover, tree roots or animal burrows can modify the structure of the dike and reduce its watertightness or mechanical strength. Since 1999, the French government has been building a national database, "BarDigues", to inventory and characterize dikes. Today, there are approx. 9000 km of dikes protecting 1.5 to 2 million people. In the meantime, a GIS application called "Dike SIRS" [Maurel P., 2004] provides an operational and accurate tool for several major stakeholders in charge of managing more than 100 km of dikes. Today, the dike surveillance and diagnosis protocol consists of identifying the weaknesses of the structure and assessing its degree of safety through a preliminary study (historical research, geological and morphodynamic studies, topography), a geophysical study (e.g. electromagnetic methods and electrical resistivity tomography) and, when necessary, a geotechnical study (e.g. drilling and stability modelling) at the very local scale [Mériaux P. & Royet P., 2007]. Given dike stretches of hundreds of kilometres, rapid, cost-effective and reliable survey techniques are required. A LiDAR system is able to acquire data on a dike structure of up to 80 km per day, which also makes the technique valuable in emergency situations. It provides additional valuable products, such as information on dike slopes and crests or their near environment (river banks, etc.). Moreover, where vegetation is present, LiDAR data make it possible to study structures or defects hidden from imagery, such as the erosion of riverbanks under forest cover. The possibility of studying the vegetation itself is also of high importance: the development of woody vegetation near or on the dike is a major risk factor. Surface singularities are often signs of disorder, or suspected disorder, in the dike itself: for example, subsidence or a sinkhole on the crest may result from collapse caused by internal erosion. Finally, high-resolution topographic data contribute to building specific geomechanical models of the dike which, after incorporating data provided by geophysical and geotechnical surveys, are integrated into calculations of structural stability. Integrating the regular use of LiDAR data into the dike surveillance protocol is not yet operational in France. However, the high number of French stakeholders at the national level (on average, there is one stakeholder for only 8-9 km of dike) and the real added value of LiDAR data make a spatial data infrastructure valuable (web services for processing the data, and for consulting and updating the database in the field when performing the local diagnosis).

  5. Prediction of pelvic pathology in subfertile women with combined Chlamydia antibody and CA-125 tests.

    PubMed

    Penninx, Josien; Brandes, Monique; de Bruin, Jan Peter; Schneeberger, Peter M; Hamilton, Carl J C M

    2009-12-01

    The Chlamydia antibody test (CAT) has been proposed to predict tubal disease. A correlation between CA-125 and the extent of endometriosis has been found by others. In this study we explored whether a combination of the two tests adds to the predictive value of the individual tests for predicting tubal disease or endometriosis. We also used the combination of tests as a new index test to screen for severe pelvic pathology. This retrospective study compares the findings of 240 laparoscopies with the serological test results. Findings were classified according to the existing ASRM scoring systems for adnexal adhesions, distal tubal occlusion and endometriosis. Severe pelvic pathology was defined as the presence of ASRM classes III and IV tubal disease or ASRM classes III and IV endometriosis. The predictive value was calculated for both tests separately and for the combined test. The combined test was positive if at least one test result was abnormal (CAT positive and/or CA-125 ≥ 35 IU/ml). 67/240 women had tubal disease and 81/240 had some degree of endometriosis. The odds ratios (ORs) of the CAT and the combined test to diagnose severe tubal disease were 6.6 (2.6-17.0) and 7.3 (2.9-19.3), respectively. The ORs of the CA-125 and the combined test to diagnose severe endometriosis were 15.6 (6.2-40.2) and 3.0 (1.2-8.0), respectively. Severe pelvic pathology was present in 65/240 women (27%). The ORs for severe pelvic pathology of the CAT, CA-125 and the combined test were 2.5 (1.4-5.3), 4.9 (1.9-9.6) and 6.6 (3.3-13.4), respectively. If the combined test was normal, 15 out of 131 women (11%) were shown to have severe pelvic pathology. The combined test adds hardly anything to the predictive value of CAT alone to diagnose severe tubal disease. The combined test is better than the CAT at predicting severe pelvic pathology, but is not significantly better than CA-125. If both the CAT and CA-125 are normal, one could consider not performing a laparoscopy.
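
    The combined-test rule above (positive if CAT positive and/or CA-125 ≥ 35 IU/ml) and the reported odds ratio for severe pelvic pathology can be illustrated with a short sketch. The 2×2 counts below are reconstructed from the proportions quoted in the abstract (65/240 with severe pathology, 15/131 with a normal combined test), so the odds ratio comes out near the reported 6.6; treat them as an illustration, not the published table.

    ```python
    # Sketch: combined CAT / CA-125 test rule and an odds-ratio calculation.
    # Counts are reconstructed from the abstract's proportions for illustration only.

    def combined_positive(cat_positive: bool, ca125_iu_ml: float) -> bool:
        """Combined test is positive if at least one component is abnormal."""
        return cat_positive or ca125_iu_ml >= 35.0

    def odds_ratio(tp: int, fp: int, fn: int, tn: int) -> float:
        """Odds ratio from a 2x2 table of test result vs. severe pelvic pathology."""
        return (tp * tn) / (fp * fn)

    # Rows: combined test positive / negative; columns: severe pathology yes / no.
    tp, fp, fn, tn = 50, 59, 15, 116   # 240 women in total
    print("combined test positive:", combined_positive(False, 42.0))
    print("odds ratio:", round(odds_ratio(tp, fp, fn, tn), 1))   # about 6.6
    ```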

  6. 78 FR 70329 - Modification to the Scopes of Recognition of Several NRTLs; Final Determination

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-25

    ... determination to delete specific test standards from the scopes of recognition of several Nationally Recognized Testing Laboratories (NRTLs), and to incorporate replacement test standards into the scopes of recognition... proposed to delete specific test standards from the scopes of recognition of several NRTLs, and incorporate...

  7. Human occupants in low-speed frontal sled tests: effects of pre-impact bracing on chest compression, reaction forces, and subject acceleration.

    PubMed

    Kemper, Andrew R; Beeman, Stephanie M; Madigan, Michael L; Duma, Stefan M

    2014-01-01

    The purpose of this study was to investigate the effects of pre-impact bracing on the chest compression, reaction forces, and accelerations experienced by human occupants during low-speed frontal sled tests. A total of twenty low-speed frontal sled tests, ten low severity (∼2.5g, Δv=5 kph) and ten medium severity (∼5g, Δv=10 kph), were performed on five 50th-percentile male human volunteers. Each volunteer was exposed to two impulses at each severity, one relaxed and the other braced prior to the impulse. A 59-channel chestband, aligned at the nipple line, was used to quantify the chest contour and anterior-posterior sternum deflection. Three-axis accelerometer cubes were attached to the sternum, 7th cervical vertebra, and sacrum of each subject. In addition, three linear accelerometers and a three-axis angular rate sensor were mounted to a metal mouthpiece worn by each subject. Seatbelt tension load cells were attached to the retractor, shoulder, and lap portions of the standard three-point driver-side seatbelt. In addition, multi-axis load cells were mounted to each interface between the subject and the test buck to quantify reaction forces. For relaxed tests, the higher test severity resulted in significantly larger peak values for all resultant accelerations, all belt forces, and three resultant reaction forces (right foot, seatpan, and seatback). For braced tests, the higher test severity resulted in significantly larger peak values for all resultant accelerations, and two resultant reaction forces (right foot and seatpan). Bracing did not have a significant effect on the occupant accelerations during the low severity tests, but did result in a significant decrease in peak resultant sacrum linear acceleration during the medium severity tests. Bracing was also found to significantly reduce peak shoulder and retractor belt forces for both test severities, and peak lap belt force for the medium test severity. In contrast, bracing resulted in a significant increase in the peak resultant reaction force for the right foot and steering column at both test severities. Chest compression due to belt loading was observed for all relaxed subjects at both test severities, and was found to increase significantly with increasing severity. Conversely, chest compression due to belt loading was essentially eliminated during the braced tests for all but one subject, who sustained minor chest compression due to belt loading during the medium severity braced test. Overall, the data from this study illustrate that muscle activation has a significant effect on the biomechanical response of human occupants in low-speed frontal impacts.

  8. Air-Surface-Ground Water Cycling in an Agricultural Desert Valley of Southern Colorado

    NASA Astrophysics Data System (ADS)

    Lanzoni, M.

    2017-12-01

    In dryland areas around the world, vegetation plays an important role in stabilizing soil and encouraging recharge. In the Colorado high desert of the San Luis Valley, windstorms strip away topsoil and deposit dust on the surrounding mountain snowpack. Dust-on-snow lowers albedo and hastens melting, which in turn lowers infiltration and aquifer recharge. Since the 1990s, the San Luis Valley has experienced a sharp decline in aquifer levels due to over-development of its water resources. Where agricultural abstraction is significant, the unconfined aquifer has experienced a 9 m (30 ft) drop. Over the course of three years, this dryland hydrology study analyzed rain, snow, surface and ground water across a 20,000 km2 high desert area to establish a baseline of water inputs. δ18O and δ2H were analyzed to develop an LMWL specific to this region of the southern Rockies and isotopic differences were examined in relation to chemistry to understand environmental influences on meteoric waters. This work identifies a repeating pattern of acid rainfall with trace element contaminants, including actinides. To better understand how the area's dominant vegetation responds to a lowered water table, 76 stem water samples were collected from the facultative phreatophyte shrubs E. nauseosa and S. vermiculatus over the summer, fall, spring, and summer of 2015 and 2016 from study plots chosen for increasing depths to groundwater. This research shows distinct patterns of water capture strategy and seasonal shifts among the E. nauseosa and S. vermiculatus shrubs. These differences are most apparent where groundwater is most accessible. However, where the water table has dropped 6 m (20 feet) over the last decade, both E. nauseosa and S. vermiculatus survive only on near-surface snowmelt and rain.
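
    The LMWL mentioned above is, in practice, a least-squares fit of δ2H against δ18O in precipitation samples. The sketch below fits such a line to invented isotope values (not the San Luis Valley data) and compares the slope with the familiar global meteoric water line value of about 8.

    ```python
    # Sketch: fit a Local Meteoric Water Line (LMWL), d2H = a * d18O + b,
    # to precipitation isotope samples.  The sample values are invented.
    import numpy as np

    d18O = np.array([-18.2, -15.6, -12.9, -10.4, -8.1, -6.3, -14.7, -9.8])   # per mil
    d2H  = np.array([-135.0, -114.0, -94.0, -75.0, -57.0, -44.0, -108.0, -70.0])

    slope, intercept = np.polyfit(d18O, d2H, 1)
    print(f"LMWL: d2H = {slope:.2f} * d18O + {intercept:.1f}")
    # For reference, the Global Meteoric Water Line is roughly d2H = 8 * d18O + 10.
    ```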

  9. Short-term effects of forest disturbances on soil nematode communities in European mountain spruce forests.

    PubMed

    Čerevková, A; Renčo, M; Cagáň, L

    2013-09-01

    The nematode communities in spruce forests were compared with the short-term effects of forest damage, caused by windstorm, wildfire and management practices of forest soils. Soil samples were collected in June and October from 2006 to 2008 in four different sites: (1) forest unaffected by the wind (REF); (2) storm-felled forest with salvaged timber (EXT); (3) modified forest affected by timber salvage (wood removal) and forest fire (FIR); and (4) storm-felled forest where timber had been left unsalvaged (NEX). Nematode analysis showed that the dominant species in all four investigated sites were Acrobeloides nanus and Eudorylaimus silvaticus. An increase of A. nanus (35% of the total nematode abundance) in the first year in the FIR site led to the highest total abundance of nematodes compared with other sites, where nematode abundance reached the same level in the third year. In the FIR site bacterial feeders appeared to be the most representative trophic group, although in the second and third year, after disturbance, the abundance of this trophic group gradually decreased. In the NEX site, the number of nematode species, population densities and Maturity Index were similar to that recorded for the FIR site. In EXT and NEX sites, the other dominant species was the plant parasitic nematode Paratylenchus microdorus. Analyses of nematodes extracted from different forest soil samples showed that the highest number of species and diversity index for species (H'spp) were in the REF site. Differences between the nematode fauna in REF and other localities were clearly depicted by cluster analysis. The greatest Structure Index and Enrichment Index values were also in REF. In the EXT site, the number of nematode species, their abundance, H'spp and Maturity Index were not significantly different from those recorded in the reference site.
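
    The species diversity index H'spp reported above is the Shannon index, H' = -Σ p_i ln p_i over species proportions p_i. The sketch below computes it from abundance counts; the counts are invented for illustration, with species names borrowed from the abstract.

    ```python
    # Sketch: Shannon diversity index H' = -sum(p_i * ln p_i) from abundance counts.
    # The counts are invented for illustration.
    import math

    abundances = {"Acrobeloides nanus": 120, "Eudorylaimus silvaticus": 45,
                  "Paratylenchus microdorus": 30, "other spp.": 25}

    total = sum(abundances.values())
    h_prime = -sum((n / total) * math.log(n / total) for n in abundances.values())
    print(f"H'spp = {h_prime:.2f}")
    ```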

  10. IPPP GPS for tracking loading deformations induced by the storm Xynthia

    NASA Astrophysics Data System (ADS)

    Ferenc, Marcell; Nicolas, Joëlle; Durand, Frédéric; Li, Zhao; Boy, Jean-Paul; Perosanz, Félix; van Dam, Tonie

    2015-04-01

    Xynthia was a violent windstorm that progressed over Western Europe between the 27th of February and the 1st of March 2010. The huge low-pressure system (pressure drop of 40 mbar and storm surge of 1.5 m at La Rochelle tide gauge) crossed France from the southwest to the northeast over the course of about 20 hours. In this study, we first investigate the detailed spatial and temporal characteristics of the Xynthia storm. Then we analyse the effect of this storm on sub-daily 3D GPS (Global Positioning System) position time series computed with the GINS-PC software using the iPPP (integer-fixed-ambiguity Precise Point Positioning) method and the REPRO 2 products for about 100 stations of the French GNSS permanent network (RGP). We compare the GPS observations with the predicted time series derived from different geodynamical models for non-tidal atmospheric, oceanic and hydrological loading effects. These predicted time series are computed using different environmental data sets. For atmospheric pressure we used the ECMWF (European Centre for Medium-Range Weather Forecasts) or MERRA (Modern-Era Retrospective Analysis for Research and Applications) pressure fields. Concerning the ocean's response, we use different hypotheses such as inverse barometer (IB), non-IB, or a dynamic ocean response to wind and pressure forcing using the two-dimensional gravity wave model MOG2D. We perform a spatial analysis to study the different behaviour of the coastal and inland sites. This study allows us to identify the ocean dynamics on the continental shelf during the passage of this fast-moving low-pressure system. For comparison, these analyses are also performed for calm periods.

  11. A new approach for the assessment of temporal clustering of extratropical wind storms

    NASA Astrophysics Data System (ADS)

    Schuster, Mareike; Eddounia, Fadoua; Kuhnel, Ivan; Ulbrich, Uwe

    2017-04-01

    A widely used methodology to assess the clustering of storms in a region is based on the dispersion statistics of a simple homogeneous Poisson process. This clustering measure is determined by the ratio of the variance and the mean of the local storm counts per grid point. Resulting values larger than 1, i.e. when the variance is larger than the mean, indicate clustering, while values lower than 1 indicate a sequencing of storms that is more regular than a random process. However, a disadvantage of this methodology is that the characteristics are only valid for a pre-defined climatological time period, and it is not possible to identify temporal variability in clustering. Also, the absolute value of the dispersion statistic is not particularly intuitive. We have developed an approach to describe the temporal clustering of storms which is more intuitive to interpret and, at the same time, allows temporal variations to be assessed. The approach is based on the local distribution of waiting times between two consecutive storm events, the events being identified by post-processing individual windstorm tracks which in turn are obtained with an objective tracking algorithm. Based on this distribution, a threshold can be set, either from the waiting time expected for a random process or from a quantile of the observed distribution, which determines whether two consecutive windstorm events count as part of a (temporal) cluster. We analyse extratropical windstorms in a reanalysis dataset and compare the results of the traditional clustering measure with our new methodology. We assess what range of clustering events (in terms of duration and frequency) is covered and identify whether the historically known clustered seasons are detectable by the new clustering measure in the reanalysis.
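
    Both measures discussed above reduce to short calculations: the traditional index is the variance-to-mean ratio of per-period storm counts, and the new approach compares waiting times between consecutive events with a threshold. The sketch below illustrates both on synthetic events; the season length, event counts and threshold choice are assumptions for illustration.

    ```python
    # Sketch of the two clustering measures discussed above, on synthetic data.
    # Event dates, season definition and the waiting-time threshold are assumptions.
    import numpy as np

    rng = np.random.default_rng(3)

    # (1) Traditional measure: dispersion of per-season storm counts at one grid point.
    counts = rng.poisson(lam=4, size=40)             # 40 winter seasons of storm counts
    dispersion = counts.var(ddof=1) / counts.mean()  # > 1 indicates clustering
    print("dispersion statistic:", round(dispersion, 2))

    # (2) Waiting-time measure: consecutive events closer together than a threshold
    #     (here the mean waiting time of a random process) count as clustered.
    event_days = np.sort(rng.uniform(0, 180, size=25))   # event times within one season
    waiting = np.diff(event_days)
    expected_wait = 180 / event_days.size                # mean wait of a Poisson process
    clustered_pairs = int((waiting < expected_wait).sum())
    print(f"{clustered_pairs} of {waiting.size} consecutive pairs fall below "
          f"the random-process waiting time of {expected_wait:.1f} days")
    ```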

  12. Hydrometeorological extremes derived from taxation records for south-eastern Moravia, Czech Republic, 1751-1900 AD

    NASA Astrophysics Data System (ADS)

    Brázdil, R.; Chromá, K.; Valášek, H.; Dolák, L.

    2012-03-01

    Historical written records associated with tax relief at ten estates located in south-eastern Moravia (Czech Republic) are used for the study of hydrometeorological extremes and their impacts during the period 1751-1900 AD. At the time, the taxation system in Moravia allowed farmers to request tax relief if their crop yields had been negatively affected by hydrological and meteorological extremes. The documentation involved contains information about the type of extreme event and the date of its occurrence, while the impact on crops may often be derived. A total of 175 extreme events resulting in some kind of damage are documented for 1751-1900, with the highest concentration between 1811 and 1860 (74.9% of all events analysed). The events leading to damage (of a possible 272 types) include hailstorms (25.7%), torrential rain (21.7%) and floods (21.0%), followed by thunderstorms, flash floods, late frosts and windstorms. The four most outstanding events, affecting the highest number of settlements, were thunderstorms with hailstorms (25 June 1825, 20 May 1847 and 29 June 1890) and flooding of the River Morava (mid-June 1847). Hydrometeorological extremes in the 1816-1855 period are compared with those occurring during the recent 1961-2000 period. The results obtained are inevitably influenced by uncertainties related to taxation records, such as their temporal and spatial incompleteness, the limits of the period of outdoor agricultural work (i.e. mainly May-August) and the purpose for which they were originally collected (primarily tax alleviation, i.e. information about hydrometeorological extremes was of secondary importance). Taxation records constitute an important source of data for historical climatology and historical hydrology and have great potential for use in many European countries.

  13. Hydrometeorological extremes and their impacts, as derived from taxation records for south-eastern Moravia, Czech Republic, AD 1751-1900

    NASA Astrophysics Data System (ADS)

    Brázdil, R.; Chromá, K.; Valášek, H.; Dolák, L.

    2011-12-01

    Historical written records associated with tax relief at ten estates located in south-eastern Moravia (Czech Republic) are used for the study of hydrometeorological extremes and their impacts during the period AD 1751-1900. At the time, the taxation system in Moravia allowed farmers to request tax relief if their crop yields had been negatively affected by hydrological and meteorological extremes. The documentation involved contains information about the type of extreme event and the date of its occurrence, while the impact on crops may often be derived. A total of 175 extreme events resulting in some kind of damage is documented for 1751-1900, with the highest concentration between 1811 and 1860 (74.9% of all events analysed). The events leading to damage (of a possible 272 types) include hailstorms (25.7%), torrential rain (21.7%), and floods (21.0%), followed by thunderstorms, flash floods, late frosts and windstorms. The four most outstanding events, affecting the highest number of settlements, were thunderstorms with hailstorms (25 June 1825, 20 May 1847 and 29 June 1890) and flooding of the River Morava (mid-June 1847). Hydrometeorological extremes in the 1816-1855 period are compared with those occurring during the recent 1961-2000 period. The results obtained are inevitably influenced by uncertainties related to taxation records, such as their temporal and spatial incompleteness, the limits of the period of outdoor agricultural work (i.e. mainly May-August) and the purpose for which they were originally collected (primarily tax alleviation, i.e. information about hydrometeorological extremes was of secondary importance). Taxation records constitute an important source of data for historical climatology and historical hydrology and have great potential for use in many European countries.

  14. Human impacts of droughts, floods and other extremes in South Moravia

    NASA Astrophysics Data System (ADS)

    Dolák, Lukáš; Brázdil, Rudolf; Řezníčková, Ladislava; Valášek, Hubert; Chromá, Kateřina

    2015-04-01

    Chronicles and taxation records related to tax relief for farmers whose livelihoods were affected by droughts, floods and other hydrometeorological extremes (HMEs) in South Moravia (the Czech Republic) in the 17th-20th centuries are used to study the impacts of HMEs on the socio-economic situation of the farmers. The first flood event was reported in 1652 on the River Morava, and extraordinarily dry years have been documented since 1718, a year in which the River Dyje dried up completely. Moreover, downpours, hailstorms, windstorms, late frosts and blizzards also caused great damage during the period studied and in many cases had a negative effect on human society. The impacts of HMEs are here classified into three categories: agricultural production, material property and the socio-economic situation of individual farmers. Direct impacts took the form of losses to property, supplies and farming equipment, and further of bad field and fruit yields, depletion of livestock, damage to fields and meadows, lack of water for daily use, watermills and transport and increased threat of wildfires. Simple lack of income, debt, impoverishment, reduction in livestock and deterioration in field fertility were among the longer-term effects. Impacts are discussed with respect to approaches to mitigation of the negative effects of HMEs and to problems associated with obtaining support, and in terms of a hierarchy of consequent impacts. A great number of records related to HMEs, preserved in the Moravian Land Archives in Brno and other district archives in South Moravia, represents a rich source of data allowing the rediscovery of historical natural disasters. The paper presents a methodological approach intended for the analysis of HME impacts in South Moravia from the 17th to the 20th centuries.

  15. Spatiotemporal variability of hydrometeorological extremes and their impacts in the Jihlava region in the 1650-1880 period

    NASA Astrophysics Data System (ADS)

    Dolak, Lukas; Brazdil, Rudolf; Chroma, Katerina; Valasek, Hubert; Reznickova, Ladislava

    2017-04-01

    Different documentary evidence (taxation records, chronicles, insurance reports etc.) and secondary sources (peer-reviewed papers, historical literature, newspapers) are used to reconstruct hydrometeorological extremes (HMEs) in the former Jihlava region in the 1651-1880 period. The study describes the system of tax alleviation in Moravia, assesses the impacts of HMEs with regard to the physical-geographical characteristics of the area studied, presents previously unutilised documentary evidence (early fire and hail damage insurance claims) and applies new methodological approaches to the analysis of HME impacts. During the period studied, more than 500 HMEs were analysed for the 19 estates (the basic economic units of the time) in the region. A thunderstorm in 1651 at Rančířov (the Jihlava estate), which caused damage to fields and meadows, is the first recorded extreme event. Downpours causing flash floods and hailstorms are the most frequently recorded natural disasters. Floods, droughts, windstorms, blizzards, late frosts and lightning strikes starting fires caused enormous damage as well. The impacts of HMEs are classified into three categories: impacts on agricultural production, on material property and on the socio-economic situation. Natural disasters led to losses of human lives, property, supplies and farming equipment. HMEs caused damage to fields and meadows and depletion of livestock, and triggered secondary consequences such as a lack of seed and finance, high prices, indebtedness, poverty and deterioration in field fertility. The results are discussed with respect to the uncertainties associated with documentary evidence and its spatiotemporal distribution. The paper shows that archival records in particular, preserved in the Moravian Land Archives in Brno and other district archives, represent a unique source of data contributing to a better understanding of extreme events and their impacts in the past.

  16. Hydrometeorological extremes reconstructed from documentary evidence for the Jihlava region in the 17th-19th centuries

    NASA Astrophysics Data System (ADS)

    Dolak, Lukas; Brazdil, Rudolf; Chroma, Katerina; Valasek, Hubert; Belinova, Monika; Reznickova, Ladislava

    2016-04-01

    Different documentary evidence (taxation records, chronicles, insurance reports etc.) is used to reconstruct hydrometeorological extremes (HMEs) in the Jihlava region (the central part of the present-day Czech Republic) in the 17th-19th centuries. The aims of the study are to describe the system of tax alleviation in Moravia, to present the use of early fire and hail damage insurance claims and to apply new methodological approaches to the analysis of HME impacts. During the period studied, more than 400 HMEs were analysed for the 16 estates (the basic economic units of the time). A late frost on 16 May 1662 on the Nove Mesto na Morave estate, which destroyed entire cereal crops and caused damage in the forests, is the first recorded extreme event. Downpours causing flash floods and hailstorms are the most frequently recorded natural disasters. Moreover, floods, droughts, windstorms, blizzards, late frosts and lightning strikes starting fires caused enormous damage as well. The impacts of HMEs are classified into three categories: impacts on agricultural production, on material property and on the socio-economic situation. Natural disasters led to losses of human lives, property, supplies and farming equipment. HMEs caused damage to fields and meadows and depletion of livestock, and triggered secondary consequences such as a lack of seed and finance, high prices, indebtedness, poverty and deterioration in field fertility. The results are discussed with respect to the uncertainties associated with documentary evidence and its spatiotemporal distribution. Archival records, preserved in the Moravian Land Archives in Brno and other district archives, constitute a unique source of data contributing to a better understanding of extreme events and their impacts.

  17. Atmospheric radiation measurement program facilities newsletter, March 2002.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holdridge, D. J.

    2002-04-18

    International H2O Project (IHOP-2002)--The International H2O Project (IHOP-2002) will take place in west-central Oklahoma over 44 days, May 13-June 25, 2002. The main focus will be water vapor and its role in storm development and rainfall production, information needed to improve rainfall forecasting. Forecasting the amount and location of rainfall is difficult, particularly in the warm months, and improvements are urgently needed. Accurate prediction of floods would be very beneficial to society, because flooding is costly in terms of loss of life and property damage. Deaths resulting from flash flooding outnumber those from hurricanes, tornadoes, windstorms, or lightning, and damage due to flooding exceeds $5 billion annually. One measure of weather forecasting success is the accuracy of the Quantitative Precipitation Forecast (QPF), which predicts the amount of precipitation to be received at a certain location. One of the research goals of IHOP-2002 is to determine whether more accurate, detailed measurement of humidity will improve a computer model's ability to forecast rainfall amounts accurately. Current water vapor measurements are inadequate. The weather balloons (radiosondes) that gather most of the water vapor data used in today's weather and global climate models have three problems. First, the radiosonde stations are located too far apart, generating a grid of data that is too coarse to show the needed details in water vapor variability. Second, the radiosonde launches occur only every 12 hours, again providing too few data points for a highly variable parameter. Third, the radiosonde instrument has biases and inaccuracies in its measurements. Questionable data quality and data sets too coarse in both time and space make accurate forecasting difficult. The key to better, more accurate, higher-resolution water vapor measurements is dependable, ground-based sensors that operate continually and accurately. Such sensors will decrease dependence on sparsely spaced, costly weather balloon releases. IHOP-2002 will give researchers an active platform for testing and evaluating the capabilities and limitations of several water vapor measurement instruments. For example, the National Oceanic and Atmospheric Administration (NOAA) Environmental Technology Laboratory will be bringing a mini-DIAL (differential absorption lidar) to the SGP central facility for comparison with the SGP Raman lidar. Lidars send beams of laser light skyward and measure scattered light not absorbed by water molecules. The collection of IHOP-2002 instruments includes 2 fixed radars, 6 mobile radars, 2 airborne radars, 8 lidars (6 of which can sample water vapor), 1 advanced wind profiler, 2 sodars, 3 interferometers, 18 special surface stations, 800 radiosondes, 400 dropsondes, 1 tethersonde system, 52 global positioning system receivers, 3 profiling radiometers, 1 mobile profiling radiometer and wind profiler, and 5 water vapor radiometers. Six research aircraft will be deployed during the course of the field campaign. The aircraft will occasionally fly low-level tracks and will deploy dropsondes. A dropsonde resembles a radiosonde, an instrument package attached to a helium-filled balloon that rises into the atmosphere, but the dropsonde is released from an airplane and collects data on its way down to the ground. Finders of dropsondes are asked to follow the instructions on the package for returning the device to the researcher.
Funding for IHOP-2002 is from many sources, including NOAA, the National Science Foundation, the National Center for Atmospheric Research, and the U.S. Department of Energy. Participation is worldwide, including researchers from Australia, Canada, France, Germany, the Netherlands, the United Kingdom, and the United States.

  18. Feasibility and Reliability of Two Different Walking Tests in People with Severe Intellectual and Sensory Disabilities

    ERIC Educational Resources Information Center

    Waninge, A.; Evenhuis, I. J.; van Wijck, R.; van der Schans, C. P.

    2011-01-01

    Background: The purpose of this study is to describe feasibility and test-retest reliability of the six-minute walking distance test (6MWD) and an adapted shuttle run test (aSRT) in persons with severe intellectual and sensory (multiple) disabilities. Materials and Methods: Forty-seven persons with severe multiple disabilities, with Gross Motor…

  19. Severity of Organized Item Theft in Computerized Adaptive Testing: An Empirical Study. Research Report. ETS RR-06-22

    ERIC Educational Resources Information Center

    Yi, Qing; Zhang, Jinming; Chang, Hua-Hua

    2006-01-01

    Chang and Zhang (2002, 2003) proposed several baseline criteria for assessing the severity of possible test security violations for computerized tests with high-stakes outcomes. However, these criteria were obtained from theoretical derivations that assumed uniformly randomized item selection. The current study investigated potential damage caused…

  20. The "1-3-5 cough test": comparing the severity of urodynamic stress incontinence with severity measures of subjective perception of stress urinary incontinence.

    PubMed

    Grigoriadis, Themos; Giannoulis, George; Zacharakis, Dimitris; Protopapas, Athanasios; Cardozo, Linda; Athanasiou, Stavros

    2016-03-01

    The purpose of the study was to examine whether a test performed during urodynamics, the "1-3-5 cough test", could determine the severity of urodynamic stress incontinence (USI). We included women referred for urodynamics who were diagnosed with USI. The "1-3-5 cough test" was performed to grade the severity of USI at the completion of filling cystometry. A diagnosis of "severe", "moderate" or "mild" USI was given if urine leakage was observed after one, three or five consecutive coughs respectively. We examined the associations between grades of USI severity and measures of subjective perception of stress urinary incontinence (SUI): International Consultation of Incontinence Modular Questionnaire-Female Lower Urinary Tract Symptom (ICIQ-FLUTS), King's Health Questionnaire (KHQ), Urinary Distress Inventory-6 (UDI-6), Urinary Impact Questionnaire-7 (UIQ-7). A total of 1,181 patients completed the ICIQ-FLUTS and KHQ and 612 completed the UDI-6 and UIQ-7 questionnaires. There was a statistically significant association of higher grades of USI severity with higher scores of the incontinence domain of the ICIQ-FLUTS. The scores of the UDI-6, UIQ-7 and of all KHQ domains (with the exception of general health perception and personal relationships) had statistically significant larger mean values for higher USI severity grade. Groups of higher USI severity had statistically significant associations with higher scores of most of the subjective measures of SUI. Severity of USI, as defined by the "1-3-5 cough test", was associated with the severity of subjective measures of SUI. This test may be a useful tool for the objective interpretation of patients with SUI who undergo urodynamics.
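
    The grading rule of the "1-3-5 cough test" as described above maps the number of consecutive coughs at which leakage first appears to a severity grade. A minimal sketch of that rule is below; the function name and the handling of the "no leakage" case are assumptions for illustration.

    ```python
    # Sketch of the "1-3-5 cough test" grading rule as described in the abstract:
    # leakage after 1, 3 or 5 consecutive coughs -> severe, moderate or mild USI.
    from typing import Optional

    def grade_usi(coughs_until_leakage: Optional[int]) -> str:
        """coughs_until_leakage: number of consecutive coughs after which leakage
        was first observed (None if no leakage after five coughs)."""
        if coughs_until_leakage is None:
            return "no USI demonstrated"
        if coughs_until_leakage <= 1:
            return "severe USI"
        if coughs_until_leakage <= 3:
            return "moderate USI"
        if coughs_until_leakage <= 5:
            return "mild USI"
        return "no USI demonstrated"

    print(grade_usi(1), "|", grade_usi(3), "|", grade_usi(5), "|", grade_usi(None))
    ```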

  1. A Procedure To Detect Test Bias Present Simultaneously in Several Items.

    ERIC Educational Resources Information Center

    Shealy, Robin; Stout, William

    A statistical procedure is presented that is designed to test for unidirectional test bias existing simultaneously in several items of an ability test, based on the assumption that test bias is incipient within the two groups' ability differences. The proposed procedure--Simultaneous Item Bias (SIB)--is based on a multidimensional item response…

  2. Corneal ulcers and infections

    MedlinePlus

    ... eye Scratches (abrasions) on the eye surface Severely dry eyes Severe allergic eye disease Various inflammatory disorders Wearing ... response Refraction test Slit-lamp examination Tests for dry eye Visual acuity Blood tests to check for inflammatory ...

  3. Development and psychometric testing of an instrument designed to measure chronic pain in dogs with osteoarthritis

    PubMed Central

    Boston, Raymond C.; Coyne, James C.; Farrar, John T.

    2010-01-01

    Objective To develop and psychometrically test an owner self-administered questionnaire designed to assess severity and impact of chronic pain in dogs with osteoarthritis. Sample Population 70 owners of dogs with osteoarthritis and 50 owners of clinically normal dogs. Procedures Standard methods for the stepwise development and testing of instruments designed to assess subjective states were used. Items were generated through focus groups and an expert panel. Items were tested for readability and ambiguity, and poorly performing items were removed. The reduced set of items was subjected to factor analysis, reliability testing, and validity testing. Results Severity of pain and interference with function were 2 factors identified and named on the basis of the items contained in them. Cronbach’s α was 0.93 and 0.89, respectively, suggesting that the items in each factor could be assessed as a group to compute factor scores (ie, severity score and interference score). The test-retest analysis revealed κ values of 0.75 for the severity score and 0.81 for the interference score. Scores correlated moderately well (r = 0.51 and 0.50, respectively) with the overall quality-of-life (QOL) question, such that as severity and interference scores increased, QOL decreased. Clinically normal dogs had significantly lower severity and interference scores than dogs with osteoarthritis. Conclusions and Clinical Relevance A psychometrically sound instrument was developed. Responsiveness testing must be conducted to determine whether the questionnaire will be useful in reliably obtaining quantifiable assessments from owners regarding the severity and impact of chronic pain and its treatment on dogs with osteoarthritis. PMID:17542696
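
    Cronbach's α, quoted above for the two factors, can be computed directly from the item-score matrix as k/(k-1)·(1 − Σ item variances / variance of the summed score). The sketch below uses invented scores for 70 respondents and four items; it illustrates the formula, not the study's data.

    ```python
    # Sketch: Cronbach's alpha from an (n_subjects x n_items) score matrix.
    # alpha = k/(k-1) * (1 - sum(item variances) / variance(total score)).
    # The score matrix below is invented for illustration.
    import numpy as np

    rng = np.random.default_rng(5)
    latent = rng.normal(size=(70, 1))                       # common severity factor
    items = latent + rng.normal(scale=0.6, size=(70, 4))    # four correlated items

    def cronbach_alpha(scores: np.ndarray) -> float:
        k = scores.shape[1]
        item_vars = scores.var(axis=0, ddof=1).sum()
        total_var = scores.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_vars / total_var)

    print("alpha =", round(cronbach_alpha(items), 2))
    ```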

  4. Rapid antigen detection tests for malaria diagnosis in severely ill Papua New Guinean children: a comparative study using Bayesian latent class models.

    PubMed

    Manning, Laurens; Laman, Moses; Rosanas-Urgell, Anna; Turlach, Berwin; Aipit, Susan; Bona, Cathy; Warrell, Jonathan; Siba, Peter; Mueller, Ivo; Davis, Timothy M E

    2012-01-01

    Although rapid diagnostic tests (RDTs) have practical advantages over light microscopy (LM) and good sensitivity in severe falciparum malaria in Africa, their utility where severe non-falciparum malaria occurs is unknown. LM, RDTs and polymerase chain reaction (PCR)-based methods have limitations, and thus conventional comparative malaria diagnostic studies employ imperfect gold standards. We assessed whether, using Bayesian latent class models (LCMs) which do not require a reference method, RDTs could safely direct initial anti-infective therapy in severely ill children from an area of hyperendemic transmission of both Plasmodium falciparum and P. vivax. We studied 797 Papua New Guinean children hospitalized with well-characterized severe illness for whom LM, RDT and nested PCR (nPCR) results were available. For any severe malaria, the estimated prevalence was 47.5%, with RDTs exhibiting similar sensitivity and negative predictive value (NPV) to nPCR (≥96.0%). LM was the least sensitive test (87.4%) and had the lowest NPV (89.7%), but had the highest specificity (99.1%) and positive predictive value (98.9%). For severe falciparum malaria (prevalence 42.9%), the findings were similar. For non-falciparum severe malaria (prevalence 6.9%), no test had the WHO-recommended sensitivity and specificity of >95% and >90%, respectively. RDTs were the least sensitive (69.6%) and had the lowest NPV (96.7%). RDTs appear to be a valuable point-of-care test that is at least equivalent to LM in diagnosing severe falciparum malaria in this epidemiologic situation. None of the tests had the required sensitivity/specificity for severe non-falciparum malaria but the number of false-negative RDTs in this group was small.

  5. System Testing of Desktop and Web Applications

    ERIC Educational Resources Information Center

    Slack, James M.

    2011-01-01

    We want our students to experience system testing of both desktop and web applications, but the cost of professional system-testing tools is far too high. We evaluate several free tools and find that AutoIt makes an ideal educational system-testing tool. We show several examples of desktop and web testing with AutoIt, starting with simple…

  6. Comparison of effluent toxicity results using Ceriodaphnia dubia cultured on several diets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Norberg-King, T.J.; Schmidt, S.

    1993-10-01

    Several diets have been proposed for Ceriodaphnia dubia, but no single diet has been universally accepted as optimal for toxicity testing. Although several diets for Ceriodaphnia dubia culturing and testing are commonly used, little or no data exist on whether toxicity varies with the diet. This study evaluated several combinations of yeast-Cerophyl-trout chow (YCT), Selenastrum capricornutum, and Selenastrum capricornutum-Cerophyl foods for routine culture performance and the sensitivity of the offspring in subsequent acute toxicity tests with effluents. Variations in the diets included use of a vitamin-fortified yeast added to the YCT, algae (Selenastrum capricornutum) grown in two different algal media, and different feeding rates of the algae-Cerophyl diets. Eleven diets were evaluated in a multigeneration feeding study, but only seven were used in subsequent toxicity tests. The young produced from each of the seven diets were tested in 48-h acute tests with three different effluents across the generations. Toxicity tests with the effluents gave LC50s that were within a factor of two of one another, regardless of the food used for culturing. These results indicate that several diets are satisfactory for culturing Ceriodaphnia dubia and that the results of the toxicity tests are comparable.

  7. Utility of Language Comprehension Tests for Unintelligible or Non-Speaking Children with Cerebral Palsy: A Systematic Review

    ERIC Educational Resources Information Center

    Geytenbeek, Joke; Harlaar, Laurike; Stam, Marloes; Ket, Hans; Becher, Jules G.; Oostrom, Kim; Vermeulen, Jeroen

    2010-01-01

    Aim: To identify the use and utility of language comprehension tests for unintelligible or non-speaking children with severe cerebral palsy (CP). Method: Severe CP was defined as severe dysarthria (unintelligible speech) or anarthria (absence of speech) combined with severe limited mobility, corresponding to Gross Motor Function Classification…

  8. Impairment of executive function in Kenyan children exposed to severe falciparum malaria with neurological involvement.

    PubMed

    Kariuki, Symon M; Abubakar, Amina; Newton, Charles R J C; Kihara, Michael

    2014-09-16

    Persistent neurocognitive impairments occur in a fifth of children hospitalized with severe falciparum malaria. There are few data on the association between different neurological phenotypes of severe malaria (seizures, impaired consciousness and prostration) and impairments in executive function. Executive functioning was examined in children exposed to severe malaria with different neurological phenotypes (N = 58) and in unexposed children (N = 56), using neuropsychological tests such as a vigilance test, the Test of Everyday Attention for Children (TEA-Ch), the contingency naming test (CNT) and the self-ordered pointing test (SOPT). Linear regression was used to determine the association between neurological phenotypes of severe malaria and executive function performance scores, accounting for potential confounders. Children with complex seizures in severe malaria performed more poorly than unexposed controls in the vigilance (median efficiency scores (interquartile range) = 4.84 (1.28-5.68) vs. 5.84 (4.71-6.42), P = 0.030) and SOPT (mean errors (standard deviation) = 29.50 (8.82) vs. 24.80 (6.50), P = 0.029) tests, but no differences were observed in the TEA-Ch and CNT tests. Performance scores for other neurological phenotypes of severe malaria were similar to those of unexposed controls. After accounting for potential confounders, such as child's age, sex, schooling; maternal age, schooling and economic activity; perinatal factors and history of seizures, complex seizures remained associated with efficiency scores in the vigilance test (beta coefficient (β) (95% confidence interval (CI)) = -0.40 (-0.67, -0.13), P = 0.006) and everyday attention scores of the TEA-Ch test (β (95% CI) = -0.57 (-1.04, -0.10), P = 0.019); the association with SOPT error scores was weak (β (95% CI) = 4.57 (-0.73-9.89), P = 0.089). Combined neurological phenotypes were not significantly associated with executive function performance scores. Executive function impairment in children with severe malaria is associated with specific neurological phenotypes, particularly complex seizures. Effective prophylaxis and management of malaria-associated acute seizures may improve executive functioning performance scores of children.
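
    The adjusted associations quoted above come from ordinary linear regression of a test score on the exposure indicator plus confounders. The generic sketch below, with invented data and hypothetical variable names, shows the pattern of such an adjusted model; it is not the authors' analysis code.

    ```python
    # Generic sketch: linear regression of a cognitive score on an exposure
    # indicator, adjusting for confounders.  Data and variable names are invented.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(11)
    n = 114
    df = pd.DataFrame({
        "complex_seizures": rng.integers(0, 2, n),
        "age_years": rng.uniform(6, 12, n),
        "sex": rng.integers(0, 2, n),
        "schooling_years": rng.integers(0, 7, n),
    })
    df["vigilance_score"] = (5.5 - 0.4 * df["complex_seizures"]
                             + 0.05 * df["age_years"] + rng.normal(0, 1, n))

    model = smf.ols("vigilance_score ~ complex_seizures + age_years + sex + schooling_years",
                    data=df).fit()
    est = model.params["complex_seizures"]
    ci_low, ci_high = model.conf_int().loc["complex_seizures"]
    print(f"adjusted effect of complex seizures: {est:.2f} (95% CI {ci_low:.2f} to {ci_high:.2f})")
    ```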

  9. Assessment Issues in the Testing of Children at School Entry

    ERIC Educational Resources Information Center

    Rock, Donald A.; Stenner, A. Jackson

    2005-01-01

    The authors introduce readers to the research documenting racial and ethnic gaps in school readiness. They describe the key tests, including the Peabody Picture Vocabulary Test (PPVT), the Early Childhood Longitudinal Study (ECLS), and several intelligence tests, and describe how they have been administered to several important national samples of…

  10. [Working memory abilities and the severity of phonological disorders].

    PubMed

    Linassi, Lisiane Zorzella; Keske-Soares, Marcia; Mota, Helena Bolli

    2005-01-01

    To verify the performance of working memory abilities and their relation with the severity of phonological disorders, 45 children aged between 5.0 and 7.11 years with evolutional phonological disorders (EFD), 17 female and 18 male, were evaluated. All subjects were assessed using the Child Phonological Evaluation proposed by Yavas et al. (1991). The severity of the disorder was determined by the Percentage of Correct Consonants (PCC) proposed by Shriberg and Kwiatkowski (1982), classifying the phonological disorder as severe, moderate-severe, mild-moderate or mild. After that, subtest 5 of the Psycholinguistic Abilities Test (ITPA; Bogossian & Santos, 1977) and the non-word repetition test (Kessler, 1997) were applied. After analyzing the data with the Kruskal-Wallis and Duncan statistical tests, it was verified that the performance of moderate-severe and severe individuals in the non-word repetition test was inferior to that of mild-moderate and mild individuals. However, performance in the digit repetition test did not present a positive correlation with severity. Phonological memory performance is related to the severity of phonological disorders, which supports the idea that phonological memory is related to speech production. Regarding the central executive, the results indicate that performance in digit repetition, used to assess it, did not correlate with the severity of the disorder. This can be explained by the fact that the central executive is more directly related to vocabulary acquisition and is responsible for processing and storing information.

  11. Dizziness: relating the severity of vertigo to the degree of handicap by measuring vestibular impairment.

    PubMed

    Pérez, Nicolás; Martin, Eduardo; Garcia-Tapia, Rafael

    2003-03-01

    We sought to correlate the severity of vertigo and handicap in patients with vestibular pathology according to measures of impairment. We conducted a prospective assessment of patients with dizziness by means of caloric testing, rotatory testing, and computerized dynamic posturography to estimate impairment. Handicap and severity of vertigo were determined with specific questionnaires (Dizziness Handicap Inventory and UCLA-DQ). A fair relationship was found between severity of dizziness and vestibular handicap. When impairment was taken into consideration, values were still fair and only moderate for a group of patients with an abnormal caloric test as the only pathologic finding. The composite score from the sensory organization test portion of the computerized dynamic posturography is fairly correlated with severity of vertigo and handicap in the whole population of patients, but no correlation was found when they were assigned to groups of vestibular impairment. To assess vestibular impairment, the results from several tests must be taken into account. However, vestibular handicap is not solely explained by measurements of impairment and/or severity.

  12. Does acetaminophen/hydrocodone affect cold pulpal testing in patients with symptomatic irreversible pulpitis? A prospective, randomized, double-blind, placebo-controlled study.

    PubMed

    Fowler, Sara; Fullmer, Spencer; Drum, Melissa; Reader, Al

    2014-12-01

    The purpose of this prospective, randomized, double-blind, placebo-controlled study was to determine the effects of a combination dose of 1000 mg acetaminophen/10 mg hydrocodone on cold pulpal testing in patients experiencing symptomatic irreversible pulpitis. One hundred emergency patients in moderate to severe pain diagnosed with symptomatic irreversible pulpitis of a mandibular posterior tooth randomly received, in a double-blind manner, identical capsules of either a combination of 1000 mg acetaminophen/10 mg hydrocodone or placebo. Cold testing with Endo-Ice (1,1,1,2 tetrafluoroethane; Hygenic Corp, Akron, OH) was performed at baseline and every 10 minutes for 60 minutes. Pain to cold testing was recorded by the patient using a Heft-Parker visual analog scale. Patients' reaction to the cold application was also rated. Cold testing at baseline and at 10 minutes resulted in severe pain for both the acetaminophen/hydrocodone and placebo groups. Although pain ratings decreased from 20-60 minutes, the ratings still resulted in moderate pain. Patient reaction to cold testing showed that 56%-62% had a severe reaction. Although the reactions decreased in severity over the 60 minutes, 20%-34% still had severe reactions at 60 minutes. Regarding pain and patients' reactions to cold testing, there were no significant differences between the combination acetaminophen/hydrocodone and placebo groups at any time period. A combination dose of 1000 mg of acetaminophen/10 mg of hydrocodone did not statistically affect cold pulpal testing in patients presenting with symptomatic irreversible pulpitis. Patients experienced moderate to severe pain and reactions to cold testing. Copyright © 2014 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  13. AcrySof Natural SN60AT versus AcrySof SA60AT intraocular lens in patients with color vision defects.

    PubMed

    Raj, Shetal M; Vasavada, Abhay R; Nanavaty, Mayank A

    2005-12-01

    To determine whether implantation of the AcrySof Natural intraocular lens (IOL) worsened the severity of existing color deficit in congenital partial red-green color deficient individuals (CPRG). In a prospective, controlled, randomized, double-masked study, 30 consecutive patients with CPRG defect and bilateral cataracts received a Natural IOL (test group) in 1 eye and a single-piece AcrySof IOL (control group) in the other eye. Patients were tested unilaterally to detect CPRG defect using Ishihara pseudoisochromatic plates and the Farnsworth D-15 test. Plates 1 to 21 measured the Ishihara error score; plates 22 to 25 indicated the severity of the defect based on the clarity of the numerals: partial mild/moderate defect (both visible) or partial severe defect (only 1 visible). The D-15 test is based on the number of diametrical crossings on the circular diagram; severity is graded as mild (1 crossing), moderate (2 crossings), or severe (>2 crossings). Tests were performed before and after IOL implantation at 1, 3, and 6 months. At a mean follow-up of 6.13 +/- 1.2 (SD) months, an analysis of variance test assessed the difference in error scores, and cross-tabulation represented the change in the number of diametrical crossings. The mean age was 62.3 +/- 8.5 years. All patients were men. Before IOL implantation, all patients had moderate CPRG defect on both tests. The Ishihara error score in the test and control groups did not reveal statistically significant differences (P = .505 and P = .119, respectively). With D-15, none of the patients in the test or control group showed >2 crossings. The implantation of the AcrySof Natural IOL did not worsen the preexisting severity of color defect in CPRG individuals.

  14. Fitness testing as a discriminative tool for the diagnosis and monitoring of fibromyalgia.

    PubMed

    Aparicio, V A; Carbonell-Baeza, A; Ruiz, J R; Aranda, P; Tercedor, P; Delgado-Fernández, M; Ortega, F B

    2013-08-01

    We aimed to determine the ability of a set of physical fitness tests to discriminate between presence/absence of fibromyalgia (FM) and moderate/severe FM. The sample comprised 94 female FM patients (52 ± 8 years) and 66 healthy women (54 ± 6 years). We assessed physical fitness by means of the 30-s chair stand, handgrip strength, chair sit and reach, back scratch, blind flamingo, 8-feet up and go, and 6-min walking tests. Patients were classified as having moderate FM if the score in the Fibromyalgia Impact Questionnaire (FIQ) was <70 and as having severe FM if the FIQ was ≥70. FM patients and patients with severe FM performed worse in most of the fitness tests studied (P < 0.001). Except for the back scratch test, all the tests were able to discriminate between the presence and absence of FM [area under the curve (AUC) = 0.66 to 0.92; P ≤ 0.001], and four tests also discriminated FM severity (AUC = 0.62 to 0.66; P ≤ 0.05). The 30-s chair stand test showed the highest ability to discriminate FM presence and severity (AUC = 0.92, P < 0.001; and AUC = 0.66, P = 0.008, respectively), with corresponding discriminating cutoffs of 9 and 6 repetitions, respectively. Physical fitness in general, and particularly the 30-s chair stand test, is able to discriminate women with FM from those without FM, as well as those with moderate FM from their peers with severe FM. © 2011 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
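
    The discrimination analysis summarised above (an AUC plus a repetition cutoff for the 30-s chair stand test) can be reproduced in outline with scikit-learn. The values below are hypothetical, chosen only to make the sketch run; they are not the study's data, and the Youden-index cutoff rule is one common choice rather than necessarily the one used in the paper.

      import numpy as np
      from sklearn.metrics import roc_auc_score, roc_curve

      # Hypothetical 30-s chair stand repetitions and FM status (1 = FM, 0 = control).
      reps = np.array([5, 7, 8, 6, 9, 12, 14, 13, 11, 15])
      has_fm = np.array([1, 1, 1, 1, 1, 0, 0, 0, 0, 0])

      # Fewer repetitions indicate FM, so score = -reps makes higher scores "more FM".
      auc = roc_auc_score(has_fm, -reps)
      fpr, tpr, thresholds = roc_curve(has_fm, -reps)

      # Youden's J picks the cutoff maximising sensitivity + specificity - 1.
      best = np.argmax(tpr - fpr)
      print(f"AUC = {auc:.2f}, discriminating cutoff = {-thresholds[best]:.0f} repetitions")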

  15. Perinatal outcomes associated with intrahepatic cholestasis of pregnancy.

    PubMed

    Herrera, Christina Annette; Manuck, Tracy A; Stoddard, Gregory J; Varner, Michael W; Esplin, Sean; Clark, Erin A S; Silver, Robert M; Eller, Alexandra G

    2018-07-01

    The objective of this study is to examine perinatal outcomes associated with cholestasis of pregnancy according to bile acid level and antenatal testing practice. Retrospective cohort study of women with symptoms and bile acid testing from 2005 to 2014. Women were stratified by bile acid level: no cholestasis (<10 μmol/L), mild (10-39 μmol/L), moderate (40-99 μmol/L), and severe (≥100 μmol/L). The primary outcome was composite neonatal morbidity (hypoxic ischemic encephalopathy, severe intraventricular hemorrhage, bronchopulmonary dysplasia, necrotizing enterocolitis, or death). 785 women were included; 487 had cholestasis (347 mild, 108 moderate, 32 severe) and 298 did not. After controlling for gestational age (GA), severe cholestasis was associated with the composite neonatal outcome (aRR 5.6, 95% CI 1.3-23.5) and meconium-stained fluid (aRR 4.82, 95%CI 1.6-14.2). Bile acid levels were not correlated with the frequency of testing (p = .50). Women who underwent twice weekly testing were delivered earlier (p = .016) than women tested less frequently, but the difference in GA was ≤4 d. Abnormal testing prompting delivery was uncommon. Among women with cholestasis, there were three stillbirths. One of these women was undergoing antenatal testing, which was normal 1 d prior to the fetal demise. Severe cholestasis is associated with neonatal morbidity which antenatal testing may not predict.

  16. Big data in Parkinson's disease: using smartphones to remotely detect longitudinal disease phenotypes.

    PubMed

    Prince, John; Arora, Siddharth; de Vos, Maarten

    2018-04-26

    To better understand the longitudinal characteristics of Parkinson's disease (PD) through the analysis of finger tapping and memory tests collected remotely using smartphones. Using a large cohort (312 PD subjects and 236 controls) of participants in the mPower study, we extract clinically validated features from a finger tapping and memory test to monitor the longitudinal behaviour of study participants. We investigate any discrepancy in learning rates associated with motor and non-motor tasks between PD subjects and healthy controls. The ability of these features to predict self-assigned severity measures is assessed whilst simultaneously inspecting the severity scoring system for floor-ceiling effects. Finally, we study the relationship between motor and non-motor longitudinal behaviour to determine if separate aspects of the disease are dependent on one another. We find that the test performances of the most severe subjects show significant correlations with self-assigned severity measures. Interestingly, less severe subjects do not show significant correlations, which is shown to be a consequence of floor-ceiling effects within the mPower self-reporting severity system. We find that motor performance after practise is a better predictor of severity than baseline performance suggesting that starting performance at a new motor task is less representative of disease severity than the performance after the test has been learnt. We find PD subjects show significant impairments in motor ability as assessed through the alternating finger tapping (AFT) test in both the short- and long-term analyses. In the AFT and memory tests we demonstrate that PD subjects show a larger degree of longitudinal performance variability in addition to requiring more instances of a test to reach a steady state performance than healthy subjects. Our findings pave the way forward for objective assessment and quantification of longitudinal learning rates in PD. This can be particularly useful for symptom monitoring and assessing medication response. This study tries to tackle some of the major challenges associated with self-assessed severity labels by designing and validating features extracted from big datasets in PD, which could help identify digital biomarkers capable of providing measures of disease severity outside of a clinical environment.

  17. Real time test bed development for power system operation, control and cyber security

    NASA Astrophysics Data System (ADS)

    Reddi, Ram Mohan

    Efficient operation and control of the power system is important in order to keep the system secure, reliable and economical. With advancements in the smart grid, several new algorithms have been developed for improved operation and control. These algorithms need to be extensively tested and validated in real time before being applied to the real electric power grid. This work focuses on the development of a real-time test bed for testing and validating power system control algorithms, hardware devices and cyber security vulnerabilities. The test bed utilizes several hardware components, including relays, phasor measurement units, a phasor data concentrator and programmable logic controllers, as well as several software tools. The current work also integrates a historian for power system monitoring and data archiving. Finally, two different power system test cases are simulated to demonstrate the applications of the developed test bed. The developed test bed can also be used for power system education.

  18. A simple behavioral test for locomotor function after brain injury in mice.

    PubMed

    Tabuse, Masanao; Yaguchi, Masae; Ohta, Shigeki; Kawase, Takeshi; Toda, Masahiro

    2010-11-01

    To establish a simple and reliable test for assessing locomotor function in mice with brain injury, we developed a new method, the rotarod slip test, in which the number of slips of the paralytic hind limb from a rotarod is counted. Brain injuries of different severity were created in adult C57BL/6 mice, by inflicting 1-point, 2-point and 4-point cryo-injuries. These mice were subjected to the rotarod slip test, the accelerating rotarod test and the elevated body swing test (EBST). Histological analyses were performed to assess the severity of the brain damage. Significant and consistent correlations between test scores and severity were observed for the rotarod slip test and the EBST. Only the rotarod slip test detected the mild hindlimb paresis in the acute and sub-acute phase after injury. Our results suggest that the rotarod slip test is the most sensitive and reliable method for assessing locomotor function after brain damage in mice. Copyright © 2010 Elsevier Ltd. All rights reserved.

  19. Standardization of Laboratory Methods for the PERCH Study

    PubMed Central

    Karron, Ruth A.; Morpeth, Susan C.; Bhat, Niranjan; Levine, Orin S.; Baggett, Henry C.; Brooks, W. Abdullah; Feikin, Daniel R.; Hammitt, Laura L.; Howie, Stephen R. C.; Knoll, Maria Deloria; Kotloff, Karen L.; Madhi, Shabir A.; Scott, J. Anthony G.; Thea, Donald M.; Adrian, Peter V.; Ahmed, Dilruba; Alam, Muntasir; Anderson, Trevor P.; Antonio, Martin; Baillie, Vicky L.; Dione, Michel; Endtz, Hubert P.; Gitahi, Caroline; Karani, Angela; Kwenda, Geoffrey; Maiga, Abdoul Aziz; McClellan, Jessica; Mitchell, Joanne L.; Morailane, Palesa; Mugo, Daisy; Mwaba, John; Mwansa, James; Mwarumba, Salim; Nyongesa, Sammy; Panchalingam, Sandra; Rahman, Mustafizur; Sawatwong, Pongpun; Tamboura, Boubou; Toure, Aliou; Whistler, Toni; O’Brien, Katherine L.; Murdoch, David R.

    2017-01-01

    The Pneumonia Etiology Research for Child Health study was conducted across 7 diverse research sites and relied on standardized clinical and laboratory methods for the accurate and meaningful interpretation of pneumonia etiology data. Blood, respiratory specimens, and urine were collected from children aged 1–59 months hospitalized with severe or very severe pneumonia and community controls of the same age without severe pneumonia and were tested with an extensive array of laboratory diagnostic tests. A standardized testing algorithm and standard operating procedures were applied across all study sites. Site laboratories received uniform training, equipment, and reagents for core testing methods. Standardization was further assured by routine teleconferences, in-person meetings, site monitoring visits, and internal and external quality assurance testing. Targeted confirmatory testing and testing by specialized assays were done at a central reference laboratory. PMID:28575358

  20. A Nonparametric Test for Homogeneity of Variances: Application to GPAs of Students across Academic Majors

    ERIC Educational Resources Information Center

    Bakir, Saad T.

    2010-01-01

    We propose a nonparametric (or distribution-free) procedure for testing the equality of several population variances (or scale parameters). The proposed test is a modification of Bakir's (1989, Commun. Statist., Simul-Comp., 18, 757-775) analysis of means by ranks (ANOMR) procedure for testing the equality of several population means. A proof is…
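
    Bakir's ANOMR-based procedure itself is not available in standard libraries, so the sketch below instead applies a common rank-based test of the same hypothesis (equality of scale across several groups), the Fligner-Killeen test in scipy, to hypothetical GPA-like samples. It illustrates the question being asked, not the proposed method.

      from scipy.stats import fligner

      # Hypothetical GPA samples for three academic majors (illustrative only;
      # this is the standard Fligner-Killeen test, not the ANOMR modification).
      major_a = [3.1, 3.4, 2.9, 3.8, 3.2, 3.5]
      major_b = [2.5, 3.9, 2.2, 3.7, 3.0, 2.8]
      major_c = [3.3, 3.2, 3.4, 3.1, 3.3, 3.2]

      stat, p = fligner(major_a, major_b, major_c)
      print(f"Fligner-Killeen statistic = {stat:.2f}, p = {p:.3f}")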

  1. European climate reconstructed for the past 500 years based on documentary and instrumental evidence

    NASA Astrophysics Data System (ADS)

    Wheeler, Dennis; Brazdil, Rudolf; Pfister, Christian

    2010-05-01

    The paper summarises the results of historical-climatological research conducted by the Millennium project SG1 team as part of the EU-funded 6th FP project MILLENNIUM, the principal focus of which was the investigation of European climate during the past one thousand years (http://www.millenniumproject.net/). This project represents a major advance in bringing together, for the first time on such a scale, historical climatologists with other palaeoclimatological communities and climate modellers from many European countries. As part of MILLENNIUM, a sub-group (SG1) of historical climatologists from ten countries had the responsibility of collating and comprehensively analysing evidence from instrumental and documentary archives. This paper presents the main results of this undertaking but confines its attention to the study of the climate of the past 500 years and represents a summary of 10 themed papers submitted for a special issue of Climatic Change. They range across a variety of topics including newly-studied documentary data sources (e.g. early instrumental records, opening of the Stockholm harbour, ship log book data), temperature reconstructions for Central Europe, the Stockholm area and the Mediterranean based on different types of documentary evidence, the application of standard paleoclimatological approaches to reconstructions based on index series derived from the documentary data, the influence of circulation dynamics on January-April climate, a comparison of reconstructions based on documentary data with the model runs (ECHO-G), a study of the quality of instrumental data in climate reconstructions, a 500-year flood chronology in Europe, and selected disastrous European windstorms and their reflection in documentary evidence and human memory. Finally, perspectives of historical-climatological research and future challenges and directions in this rapidly-developing and important field are presented together with an overview of the potential of documentary sources for climatic reconstructions.

  2. New Insights into the Consequences of Post-Windthrow Salvage Logging Revealed by Functional Structure of Saproxylic Beetles Assemblages

    PubMed Central

    Thorn, Simon; Bässler, Claus; Gottschalk, Thomas; Hothorn, Torsten; Bussler, Heinz; Raffa, Kenneth; Müller, Jörg

    2014-01-01

    Windstorms, bark beetle outbreaks and fires are important natural disturbances in coniferous forests worldwide. Wind-thrown trees promote biodiversity and restoration within production forests, but also cause large economic losses due to bark beetle infestation and accelerated fungal decomposition. Such damaged trees are often removed by salvage logging, which leads to decreased biodiversity and thus increasingly evokes discussions between economists and ecologists about appropriate strategies. To reveal the reasons behind species loss after salvage logging, we used a functional approach based on four habitat-related ecological traits and focused on saproxylic beetles. We predicted that salvage logging would decrease functional diversity (measured as effect sizes of mean pairwise distances using null models) as well as mean values of beetle body size, wood diameter niche and canopy cover niche, but would increase decay stage niche. As expected, salvage logging caused a decrease in species richness, but led to an increase in functional diversity by altering the species composition from habitat-filtered assemblages toward random assemblages. Even though salvage logging removes tree trunks, the most negative effects were found for small and heliophilous species and for species specialized on wood of small diameter. Our results suggested that salvage logging disrupts the natural assembly process on windthrown trees and that negative ecological impacts are caused more by microclimate alteration of the dead-wood objects than by loss of resource amount. These insights underline the power of functional approaches to detect ecosystem responses to anthropogenic disturbance and form a basis for management decisions in conservation. To mitigate negative effects on saproxylic beetle diversity after windthrows, we recommend preserving single windthrown trees or at least their tops with exposed branches during salvage logging. Such an extension of the green-tree retention approach to windthrown trees will preserve natural succession and associated communities of disturbed spruce forests. PMID:25050914

  3. New insights into the consequences of post-windthrow salvage logging revealed by functional structure of saproxylic beetles assemblages.

    PubMed

    Thorn, Simon; Bässler, Claus; Gottschalk, Thomas; Hothorn, Torsten; Bussler, Heinz; Raffa, Kenneth; Müller, Jörg

    2014-01-01

    Windstorms, bark beetle outbreaks and fires are important natural disturbances in coniferous forests worldwide. Wind-thrown trees promote biodiversity and restoration within production forests, but also cause large economic losses due to bark beetle infestation and accelerated fungal decomposition. Such damaged trees are often removed by salvage logging, which leads to decreased biodiversity and thus increasingly evokes discussions between economists and ecologists about appropriate strategies. To reveal the reasons behind species loss after salvage logging, we used a functional approach based on four habitat-related ecological traits and focused on saproxylic beetles. We predicted that salvage logging would decrease functional diversity (measured as effect sizes of mean pairwise distances using null models) as well as mean values of beetle body size, wood diameter niche and canopy cover niche, but would increase decay stage niche. As expected, salvage logging caused a decrease in species richness, but led to an increase in functional diversity by altering the species composition from habitat-filtered assemblages toward random assemblages. Even though salvage logging removes tree trunks, the most negative effects were found for small and heliophilous species and for species specialized on wood of small diameter. Our results suggested that salvage logging disrupts the natural assembly process on windthrown trees and that negative ecological impacts are caused more by microclimate alteration of the dead-wood objects than by loss of resource amount. These insights underline the power of functional approaches to detect ecosystem responses to anthropogenic disturbance and form a basis for management decisions in conservation. To mitigate negative effects on saproxylic beetle diversity after windthrows, we recommend preserving single windthrown trees or at least their tops with exposed branches during salvage logging. Such an extension of the green-tree retention approach to windthrown trees will preserve natural succession and associated communities of disturbed spruce forests.

  4. Doppler Lidar Vector Retrievals and Atmospheric Data Visualization in Mixed/Augmented Reality

    NASA Astrophysics Data System (ADS)

    Cherukuru, Nihanth Wagmi

    Environmental remote sensing has seen rapid growth in recent years, and Doppler wind lidars have gained popularity primarily due to their non-intrusive, high spatial and temporal measurement capabilities. While early lidar applications relied on radial velocity measurements alone, most practical applications in wind farm control and short-term wind prediction require knowledge of the vector wind field. Over the past couple of years, multiple works on lidars have explored three primary methods of retrieving wind vectors: the homogeneous wind-field assumption, computationally intensive variational methods, and the use of multiple Doppler lidars. Building on prior research, the current three-part study first demonstrates the capabilities of single and dual Doppler lidar retrievals in capturing downslope windstorm-type flows occurring at Arizona's Barringer Meteor Crater as a part of the METCRAX II field experiment. Next, to address the need for a reliable and computationally efficient vector retrieval for adaptive wind farm control applications, a novel 2D vector retrieval based on a variational formulation was developed, applied to lidar scans from an offshore wind farm and validated with data from a cup and vane anemometer installed on a nearby research platform. Finally, a novel data visualization technique using Mixed Reality (MR)/Augmented Reality (AR) technology is presented to visualize data from atmospheric sensors. MR is an environment in which the user's visual perception of the real world is enhanced with live, interactive, computer-generated sensory input (in this case, data from atmospheric sensors such as Doppler lidars). A methodology using modern game development platforms is presented and demonstrated with lidar-retrieved wind fields, as well as a few earth science datasets for education and outreach activities.

  5. Meteorological and hydrological extremes derived from taxation records: case study for south-western Moravia (Czech Republic)

    NASA Astrophysics Data System (ADS)

    Chromá, Kateřina; Brázdil, Rudolf; Valášek, Hubert; Zahradníček, Pavel

    2013-04-01

    Meteorological and hydrological extremes (MHEs) cause great material damage or even loss of human lives in the present, just as they did in the past. In the Czech Lands (recently the Czech Republic), systematic meteorological and hydrological observations generally started in the latter half of the 19th century. Therefore, in order to create long-term series of such extremes, it is necessary to search for other sources of information. Different types of documentary evidence are used in historical climatology and hydrology to find such information. Some of them are related to records connected with the taxation system. The taxation system in Moravia allowed farmers to request tax relief if their crops had been damaged by MHEs. The corresponding documents contain information about the type of extreme event and the date of its occurrence; often, impacts on crops or land may also be derived. The events leading to damage particularly include hailstorms, torrential rain, flash floods, floods (in regions along larger rivers), and less frequently windstorms and late frosts; in some cases, information about droughts or extreme snow depths is also included. However, the results obtained are influenced by uncertainties related to taxation records - their temporal and spatial incompleteness, the limitation of MHE occurrence to the period of main agricultural work (May-August) and the purpose for which they were originally collected (primarily tax alleviation, i.e. information about MHEs was of secondary importance). All these aspects related to the study of MHEs from taxation records are demonstrated for five estates (Bítov, Budkov, Jemnice with Staré Hobzí, Nové Syrovice and Uherčice) in the south-western part of Moravia for the 18th-19th centuries. The analysis shows the importance of taxation records for the study of past MHEs as well as great potential for their use.

  6. Severity of Organized Item Theft in Computerized Adaptive Testing: A Simulation Study

    ERIC Educational Resources Information Center

    Yi, Qing; Zhang, Jinming; Chang, Hua-Hua

    2008-01-01

    Criteria had been proposed for assessing the severity of possible test security violations for computerized tests with high-stakes outcomes. However, these criteria resulted from theoretical derivations that assumed uniformly randomized item selection. This study investigated potential damage caused by organized item theft in computerized adaptive…

  7. 78 FR 38389 - Proposed Modification to the Scopes of Recognition of Several NRTLs

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-26

    ... the NRTL's written testing procedures. Such datasheets may be electronic records or hardcopies... particular test standard. OSHA reviews each NRTL's procedures to determine which approach the NRTL will use... standards from the scopes of recognition of several nationally recognized testing laboratories (NRTLs), and...

  8. Using Norm-Referenced Tests to Determine Severity of Language Impairment in Children: Disconnect between U.S. Policy Makers and Test Developers

    ERIC Educational Resources Information Center

    Spaulding, Tammie J.; Szulga, Margaret Swartwout; Figueroa, Cecilia

    2012-01-01

    Purpose: The purpose of this study was to identify various U.S. state education departments' criteria for determining the severity of language impairment in children, with particular focus on the use of norm-referenced tests. A secondary objective was to determine if norm-referenced tests of child language were developed for the purpose of…

  9. Feasibility and Reliability of Tests Measuring Health-Related Physical Fitness in Children with Moderate to Severe Levels of Intellectual Disability

    ERIC Educational Resources Information Center

    Wouters, Marieke; van der Zanden, Anna M.; Evenhuis, Heleen M.; Hilgenkamp, Thessa I. M.

    2017-01-01

    Physical fitness is an important marker for health. In this study we investigated the feasibility and reliability of health-related physical fitness tests in children with moderate to severe levels of intellectual disability. Thirty-nine children (2-18 yrs) performed tests for muscular strength and endurance, the modified 6-minute walk test (6mwt)…

  10. Effects of Test Level Discrimination and Difficulty on Answer-Copying Indices

    ERIC Educational Resources Information Center

    Sunbul, Onder; Yormaz, Seha

    2018-01-01

    In this study, Type I error and power rates of the omega (ω) and GBT (generalized binomial test) indices were investigated for several nominal alpha levels and for 40- and 80-item test lengths with a 10,000-examinee sample size under several test level restrictions. As a result, Type I error rates of both indices were found to be below the acceptable…
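
    The GBT index is built on the distribution of the number of identical responses between two examinees when items match independently with model-implied probabilities. The sketch below computes the upper tail of that generalized (Poisson) binomial distribution by direct convolution; the item count, match probabilities and observed match count are hypothetical, and a real application would derive the per-item probabilities from an IRT model rather than use a constant.

      import numpy as np

      def generalized_binomial_tail(match_probs, observed_matches):
          """P(X >= observed_matches), where X counts item-level matches and item i
          matches independently with probability match_probs[i] (a simplified sketch
          of the idea behind the generalized binomial test)."""
          pmf = np.array([1.0])                 # distribution over 0 items so far
          for p in match_probs:
              new = np.zeros(len(pmf) + 1)
              new[:-1] += pmf * (1 - p)         # item does not match
              new[1:] += pmf * p                # item matches
              pmf = new
          return pmf[observed_matches:].sum()

      # Hypothetical: 40 items, each with a model-implied match probability of 0.3,
      # and 25 observed identical responses between two examinees.
      probs = np.full(40, 0.3)
      print(generalized_binomial_tail(probs, 25))   # a small p-value flags unusual agreement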

  11. Effect of aquatic versus land based exercise programs on physical performance in severely burned patients: a randomized controlled trial.

    PubMed

    Zoheiry, Ibrahim M; Ashem, Haidy N; Ahmed, Hamada Ahmed Hamada; Abbas, Rami

    2017-12-01

    [Purpose] To compare the effect of an aquatic-based versus a land-based exercise regimen on the physical performance of severely burned patients. [Subjects and Methods] Forty patients suffering from severe burns (total body surface area more than 30%) were recruited from several outpatient clinics in Greater Cairo. Their ages ranged from 20 to 40 years, and they were randomly assigned into two equal groups: group (A), which received an aquatic-based exercise program, and group (B), which received a land-based exercise program. The exercise program, which took place over 12 consecutive weeks, consisted of flexibility, endurance, and lower and upper body training. Physical performance was assessed using the 30-second chair stand test, stair climb test, 30-meter fast-paced walk test, timed up and go test, 6-minute walk test and a VO2max evaluation. [Results] There was a significant increase in the 30-second chair stand, 6-minute walk, 30-meter fast-paced walk, stair climb, and VO2max tests and a significant decrease in the timed up and go test in group A (aquatic-based exercise) compared with group B (land-based exercise) post treatment. [Conclusion] A twelve-week aquatic exercise program yields improvement in both physical performance and VO2max in patients with severe burns.

  12. Evidence-based severity assessment: Impact of repeated versus single open-field testing on welfare in C57BL/6J mice.

    PubMed

    Bodden, Carina; Siestrup, Sophie; Palme, Rupert; Kaiser, Sylvia; Sachser, Norbert; Richter, S Helene

    2018-01-15

    According to current guidelines on animal experiments, a prospective assessment of the severity of each procedure is mandatory. However, so far, the classification of procedures into different severity categories mainly relies on theoretic considerations, since it is not entirely clear which of the various procedures compromise the welfare of animals, or, to what extent. Against this background, a systematic empirical investigation of the impact of each procedure, including behavioral testing, seems essential. Therefore, the present study was designed to elucidate the effects of repeated versus single testing on mouse welfare, using one of the most commonly used paradigms for behavioral phenotyping in behavioral neuroscience, the open-field test. In an independent groups design, laboratory mice (Mus musculus f. domestica) experienced either repeated, single, or no open-field testing - procedures that are assigned to different severity categories. Interestingly, testing experiences did not affect fecal corticosterone metabolites, body weights, elevated plus-maze or home cage behavior differentially. Thus, with respect to the assessed endocrinological, physical, and behavioral outcome measures, no signs of compromised welfare could be detected in mice that were tested in the open-field repeatedly, once, or, not at all. These findings challenge current classification guidelines and may, furthermore, stimulate systematic research on the severity of single procedures involving living animals. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. Effect of aquatic versus land based exercise programs on physical performance in severely burned patients: a randomized controlled trial

    PubMed Central

    Zoheiry, Ibrahim M.; Ashem, Haidy N.; Ahmed, Hamada Ahmed Hamada; Abbas, Rami

    2017-01-01

    [Purpose] To compare the effect of an aquatic-based versus a land-based exercise regimen on the physical performance of severely burned patients. [Subjects and Methods] Forty patients suffering from severe burns (total body surface area more than 30%) were recruited from several outpatient clinics in Greater Cairo. Their ages ranged from 20 to 40 years, and they were randomly assigned into two equal groups: group (A), which received an aquatic-based exercise program, and group (B), which received a land-based exercise program. The exercise program, which took place over 12 consecutive weeks, consisted of flexibility, endurance, and lower and upper body training. Physical performance was assessed using the 30-second chair stand test, stair climb test, 30-meter fast-paced walk test, timed up and go test, 6-minute walk test and a VO2max evaluation. [Results] There was a significant increase in the 30-second chair stand, 6-minute walk, 30-meter fast-paced walk, stair climb, and VO2max tests and a significant decrease in the timed up and go test in group A (aquatic-based exercise) compared with group B (land-based exercise) post treatment. [Conclusion] A twelve-week aquatic exercise program yields improvement in both physical performance and VO2max in patients with severe burns. PMID:29643605

  14. Line - organised convection putting fire to forest area of Halkidiki, Northern Greece

    NASA Astrophysics Data System (ADS)

    Vlachou, M.; Brikas, D.; Pytharoulis, I.

    2010-09-01

    The organisation of convection in a line often coincides with the end of heat waves in the Southern Balkans. This was indeed the case on the 21st of August 2006, when the tail of an eastward moving cold front put an end to the preceding heat wave and, at the same time, triggered thunderstorms and windstorms in Southern Bulgaria and Northern Greece. The associated electric activity initiated a fire in Kassandra, Halkidiki, Greece. Due to the prolonged drought and the strong winds, the fire spread quickly. It lasted for three days, costing two human lives, burning an extended forest area, as well as destroying hotels and resort facilities. Available data are: i) European Centre for Medium-range Weather Forecasts (ECMWF) analyses, ii) RADAR reflectivity data from the Weather Modification Dept. of the Hellenic Agricultural Insurance Organisation and iii) surface and upper air data from the airport ‘Makedonia’ of Thessaloniki, Greece. The heat wave that affected Greece during the 5-day period prior to the line convection was associated with the establishment of a hot, but very stable at low levels, boundary layer, probably a modified part of the Saharan air layer advected to the area of interest. Destabilisation occurred due to surface heating, as well as upper level cold air advection. From the synoptic point of view, upward motion prevails under the inflection point of the subtropical and polar jet streams, indicating once more how important the curvature changes along the flow are for upper level divergence. In the meso-α scale, the line convection formed along and just ahead of a shallow, frontogenetically active cold frontal zone. Hence, the line under study may be called a squall line. It is suggested that such zones play a key role in triggering severe weather in the same area, as well as cyclogenesis in the Mediterranean area. Previous studies have shown numerous severe weather events to occur along such zones. In the meso-β scale, the line under study fits the 2-D model of squall lines, as transverse vertical cross sections show. On the isentropic level, as the system moves eastward, warm low level air flows in from the east-southeast, whereas cold upper level air is ingested from the north. The hourly sea level pressure field exhibits pre-squall and wake lows and a meso-high, the classic features observed before, after and during the passage of a squall line, respectively. More interestingly, the succession of clouds and associated weather was typical of a squall line. Convective activity peaked suddenly to the cumulonimbus level, with no cumulus clouds observed prior to the squall line. The lack of a dense observing network, as far as upper air and even surface observations are concerned, limits the study of the meso-β features of the squall line (movement etc.). The lack of Doppler RADAR data precludes the representation of the transverse circulation across the line. However, it is hoped that the present study adds to the research on severe weather in the Mediterranean, as it highlights the crucial role of synoptic and meso-α scale features. In view of the failure of the operational global NWP models (ECMWF, NCEP GFS) to predict squall-line-associated precipitation, which fell even in the form of hail, forecasters' attention is drawn to convergence zones, especially low level frontogenetically active ones. It is suggested that summertime convection is rather dynamically than thermodynamically driven, this general rule applying even to non-frontal convection. Equivalently, convection is more likely to occur when the dynamics promote boundary layer convergence and upward motion, rather than when the thermodynamics favor air parcel ascent and saturation.

  15. The utility of home sleep apnea tests in patients with low versus high pre-test probability for moderate to severe OSA.

    PubMed

    Goldstein, Cathy A; Karnib, Hala; Williams, Katherine; Virk, Zunaira; Shamim-Uzzaman, Afifa

    2017-11-22

    Home sleep apnea tests (HSATs) are an alternative to attended polysomnograms (PSGs) when the pre-test probability for moderate to severe OSA is high. However, insurers often mandate their use any time OSA is suspected, regardless of the pre-test probability. Our objective was to determine the ability of HSATs to rule in OSA when the pre-test probability of an apnea hypopnea index (AHI) in the moderate to severe range is low. Patients who underwent HSATs were characterized as low or high pre-test probability based on the presence of two symptoms of the STOP instrument plus either BMI > 35 or male gender. The odds of a diagnostic HSAT, dependent on pre-test probability, were calculated. Stepwise selection determined predictors of non-diagnostic HSAT. As PSG is performed after HSATs that do not confirm OSA, false negative results were assessed. Among 196 individuals, pre-test probability was low in 74 (38%) and high in 122 (62%). A lower percentage of individuals with a low versus high pre-test probability for moderate to severe OSA had HSAT results that confirmed OSA (61 versus 84%, p = 0.0002), resulting in an odds ratio (OR) of 0.29 for confirmatory HSAT in the low pre-test probability group (95% CI [0.146, 0.563]). Multivariate logistic regression demonstrated that age ≤ 50 (OR 3.10 [1.24-7.73]), female gender (OR 3.58 [1.50-8.66]), non-enlarged neck circumference (OR 11.50 [2.50-52.93]), and the absence of loud snoring (OR 3.47 [1.30-9.25]) best predicted non-diagnostic HSAT. OSA was diagnosed by PSG in 54% of individuals with a negative HSAT, which was similar in both pre-test probability groups. HSATs should be reserved for individuals with high pre-test probability for moderate to severe disease as opposed to any individual with suspected OSA.
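
    The unadjusted odds ratio reported above can be computed directly from a 2×2 table with a log-scale Wald confidence interval. The counts below are reconstructed from the reported percentages (61% of 74 low-probability and 84% of 122 high-probability patients had a confirmatory HSAT), so they are approximations rather than the exact study data.

      import numpy as np
      from scipy.stats import norm

      # Approximate 2x2 table reconstructed from the reported percentages.
      a, b = 45, 29    # low pre-test probability: diagnostic, non-diagnostic HSAT
      c, d = 102, 20   # high pre-test probability: diagnostic, non-diagnostic HSAT

      log_or = np.log((a * d) / (b * c))
      se = np.sqrt(1/a + 1/b + 1/c + 1/d)      # standard error of the log odds ratio
      z = norm.ppf(0.975)
      ci = np.exp([log_or - z * se, log_or + z * se])

      print(f"OR = {np.exp(log_or):.2f}, 95% CI [{ci[0]:.3f}, {ci[1]:.3f}]")

    With these reconstructed counts the sketch returns roughly OR 0.30 with a CI near the published [0.146, 0.563]; small differences come from rounding the percentages back to counts.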

  16. Bioclinical Test to Predict Nephropathia Epidemica Severity at Hospital Admission.

    PubMed

    Hentzien, Maxime; Mestrallet, Stéphanie; Halin, Pascale; Pannet, Laure-Anne; Lebrun, Delphine; Dramé, Moustapha; Bani-Sadr, Firouzé; Galempoix, Jean-Marc; Strady, Christophe; Reynes, Jean-Marc; Penalba, Christian; Servettaz, Amélie

    2018-06-01

    We conducted a multicenter, retrospective cohort study of hospitalized patients with serologically proven nephropathia epidemica (NE) living in Ardennes Department, France, during 2000-2014 to develop a bioclinical test predictive of severe disease. Among 205 patients, 45 (22.0%) had severe NE. We found the following factors predictive of severe NE: nephrotoxic drug exposure (p = 0.005, point value 10); visual disorders (p = 0.02, point value 8); microscopic or macroscopic hematuria (p = 0.04, point value 7); leukocyte count >10 × 10⁹ cells/L (p = 0.01, point value 9); and thrombocytopenia <90 × 10⁹/L (p = 0.003, point value 11). When point values for each factor were summed, we found a score of <10 identified low-risk patients (3.3% had severe disease), and a score >20 identified high-risk patients (45.3% had severe disease). If validated in future studies, this test could be used to stratify patients by severity in research studies and in clinical practice.
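
    The scoring rule described above is a simple lookup-and-sum. The sketch below encodes the published point values and the <10 / >20 risk thresholds; the dictionary keys and function names are illustrative, not taken from the paper.

      # Sketch of the published NE severity score: sum the point values of the
      # predictors present at admission and classify by the reported thresholds.
      POINTS = {
          "nephrotoxic_drug_exposure": 10,
          "visual_disorders": 8,
          "hematuria": 7,                 # microscopic or macroscopic
          "leukocytes_gt_10e9_per_L": 9,  # leukocyte count >10 x 10^9 cells/L
          "platelets_lt_90e9_per_L": 11,  # thrombocytopenia <90 x 10^9/L
      }

      def ne_severity_score(findings):
          """findings: iterable of keys from POINTS present in the patient."""
          return sum(POINTS[f] for f in findings)

      def risk_group(score):
          if score < 10:
              return "low risk"       # 3.3% severe disease in the study cohort
          if score > 20:
              return "high risk"      # 45.3% severe disease in the study cohort
          return "intermediate"

      # Example: hematuria plus thrombocytopenia scores 18, i.e. intermediate risk.
      print(risk_group(ne_severity_score(["hematuria", "platelets_lt_90e9_per_L"])))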

  17. 21. VALVES, GAUGES, AND SEVERAL TYPES OF LIGHTING ALONG ROAD ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    21. VALVES, GAUGES, AND SEVERAL TYPES OF LIGHTING ALONG ROAD AT SOUTH REAR OF TEST STAND 1-A. RP1 TANK FARM IN MIDDLE DISTANCE. Looking northeast. - Edwards Air Force Base, Air Force Rocket Propulsion Laboratory, Test Stand 1-A, Test Area 1-120, north end of Jupiter Boulevard, Boron, Kern County, CA

  18. Remote control circuit breaker evaluation testing. [for space shuttles

    NASA Technical Reports Server (NTRS)

    Bemko, L. M.

    1974-01-01

    Engineering evaluation tests were performed on several models/types of remote control circuit breakers marketed in an attempt to gain some insight into their potential suitability for use on the space shuttle vehicle. Tests included the measurement of several electrical and operational performance parameters under laboratory ambient, space simulation, acceleration and vibration environmental conditions.

  19. Leptospirosis outbreak following severe flooding: a rapid assessment and mass prophylaxis campaign; Guyana, January-February 2005.

    PubMed

    Dechet, Amy M; Parsons, Michele; Rambaran, Madan; Mohamed-Rambaran, Pheona; Florendo-Cumbermack, Anita; Persaud, Shamdeo; Baboolal, Shirematee; Ari, Mary D; Shadomy, Sean V; Zaki, Sherif R; Paddock, Christopher D; Clark, Thomas A; Harris, Lazenia; Lyon, Douglas; Mintz, Eric D

    2012-01-01

    Leptospirosis is a zoonosis usually transmitted through contact with water or soil contaminated with urine from infected animals. Severe flooding can put individuals at greater risk for contracting leptospirosis in endemic areas. Rapid testing for the disease and large-scale interventions are necessary to identify and control infection. We describe a leptospirosis outbreak following severe flooding and a mass chemoprophylaxis campaign in Guyana. From January to March 2005, we collected data on suspected leptospirosis hospitalizations and deaths. Laboratory testing included anti-leptospiral dot enzyme immunoassay (DST), immunohistochemistry (IHC) staining, and microscopic agglutination testing (MAT). DST testing was conducted for 105 (44%) of 236 patients; 52 (50%) tested positive. Four (57%) of the paired serum samples tested by MAT were confirmed as leptospirosis. Of 34 total deaths attributed to leptospirosis, postmortem samples from 10 (83%) of 12 patients were positive by IHC. Of 201 patients interviewed, 89% reported direct contact with flood waters. A 3-week doxycycline chemoprophylaxis campaign reached over 280,000 people. A confirmed leptospirosis outbreak in Guyana occurred after severe flooding, resulting in a massive chemoprophylaxis campaign to try to limit morbidity and mortality.

  20. Image analysis software for following progression of peripheral neuropathy

    NASA Astrophysics Data System (ADS)

    Epplin-Zapf, Thomas; Miller, Clayton; Larkin, Sean; Hermesmeyer, Eduardo; Macy, Jenny; Pellegrini, Marco; Luccarelli, Saverio; Staurenghi, Giovanni; Holmes, Timothy

    2009-02-01

    A relationship has been reported by several research groups [1-4] between the density and shapes of nerve fibers in the cornea and the existence and severity of peripheral neuropathy. Peripheral neuropathy is a complication of several prevalent diseases or conditions, which include diabetes, HIV, prolonged alcohol overconsumption and aging. A common clinical technique for confirming the condition is intramuscular electromyography (EMG), which is invasive, so a noninvasive technique like the one proposed here carries important potential advantages for the physician and patient. A software program that automatically detects the nerve fibers, counts them and measures their shapes is being developed and tested. Tests were carried out with a database of subjects with levels of severity of diabetic neuropathy as determined by EMG testing. Results from this testing, which include a linear regression analysis, are shown.
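
    The reported analysis relates automatically measured nerve-fiber metrics to EMG-determined neuropathy severity with a linear regression. A minimal sketch is shown below; the fiber densities and severity grades are hypothetical placeholders, not the study data.

      import numpy as np
      from scipy.stats import linregress

      # Hypothetical corneal nerve fiber densities (fibers/mm^2) paired with
      # EMG-determined neuropathy severity grades (0 = none ... 3 = severe).
      severity = np.array([0, 0, 1, 1, 2, 2, 3, 3])
      density = np.array([28.5, 26.1, 22.4, 21.0, 17.8, 16.2, 12.9, 11.5])

      fit = linregress(severity, density)
      print(f"slope = {fit.slope:.2f} fibers/mm^2 per grade, "
            f"r = {fit.rvalue:.2f}, p = {fit.pvalue:.4f}")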

  1. A Comparison of Four Item-Selection Methods for Severely Constrained CATs

    ERIC Educational Resources Information Center

    He, Wei; Diao, Qi; Hauser, Carl

    2014-01-01

    This study compared four item-selection procedures developed for use with severely constrained computerized adaptive tests (CATs). Severely constrained CATs refer to those adaptive tests that seek to meet a complex set of constraints that are often not conclusive to each other (i.e., an item may contribute to the satisfaction of several…

  2. Combustion Stability Analyses of Coaxial Element Injectors with Liquid Oxygen/Liquid Methane Propellants

    NASA Technical Reports Server (NTRS)

    Hulka, J. R.

    2010-01-01

    Liquid rocket engines using oxygen and methane propellants are being considered by the National Aeronautics and Space Administration (NASA) for in-space vehicles. This propellant combination has not been previously used in a flight-qualified engine system, so limited test data and analysis results are available at this stage of early development. NASA has funded several hardware-oriented activities with oxygen and methane propellants over the past several years with the Propulsion and Cryogenic Advanced Development (PCAD) project, under the Exploration Technology Development Program. As part of this effort, the NASA Marshall Space Flight Center has conducted combustion stability analyses of several of the configurations. This paper presents test data and analyses of combustion stability from the recent PCAD-funded test programs at the NASA MSFC. These test programs used swirl coaxial element injectors with liquid oxygen and liquid methane propellants. Oxygen was injected conventionally in the center of the coaxial element, and swirl was provided by tangential entry slots. Injectors with 28-element and 40-element patterns were tested with several configurations of combustion chambers, including ablative and calorimeter spool sections, and several configurations of fuel injection design. Low frequency combustion instability (chug) occurred with both injectors, and high-frequency combustion instability occurred at the first tangential (1T) transverse mode with the 40-element injector. In most tests, a transition between high-amplitude chug with gaseous methane flow and low-amplitude chug with liquid methane flow was readily observed. Chug analyses of both conditions were conducted using techniques from Wenzel and Szuch and from the Rocket Combustor Interactive Design and Analysis (ROCCID) code. The 1T mode instability occurred in several tests and was apparent by high-frequency pressure measurements as well as dramatic increases in calorimeter-measured heat flux throughout the chamber. Analyses of the transverse mode were conducted with ROCCID and empirical methods such as Hewitt d/V. This paper describes the test hardware configurations, test data, analysis methods, and presents results of the various analyses.

  3. A Multiscale Material Testing System for In Situ Optical and Electron Microscopes and Its Application

    PubMed Central

    Ye, Xuan; Cui, Zhiguo; Fang, Huajun; Li, Xide

    2017-01-01

    We report a novel material testing system (MTS) that uses hierarchical designs for in-situ mechanical characterization of multiscale materials. This MTS is adaptable for use in optical microscopes (OMs) and scanning electron microscopes (SEMs). The system consists of a microscale material testing module (m-MTM) and a nanoscale material testing module (n-MTM). The MTS can measure mechanical properties of materials with characteristic lengths ranging from millimeters to tens of nanometers, while load capacity can vary from several hundred micronewtons to several nanonewtons. The m-MTM is integrated using piezoelectric motors and piezoelectric stacks/tubes to form coarse and fine testing modules, with specimen length from millimeters to several micrometers, and displacement distances of 12 mm with 0.2 µm resolution for coarse level and 8 µm with 1 nm resolution for fine level. The n-MTM is fabricated using microelectromechanical system technology to form active and passive components and realizes material testing for specimen lengths ranging from several hundred micrometers to tens of nanometers. The system’s capabilities are demonstrated by in-situ OM and SEM testing of the system’s performance and mechanical properties measurements of carbon fibers and metallic microwires. In-situ multiscale deformation tests of Bacillus subtilis filaments are also presented. PMID:28777341

  4. Oxygen-driving and atomized mucosolvan inhalation combined with holistic nursing in the treatment of children severe bronchial pneumonia.

    PubMed

    Yang, Fang

    2015-07-01

    This paper aimed to discuss the method, effect and safety of oxygen-driving and atomized Mucosolvan inhalation combined with holistic nursing in the treatment of severe bronchial pneumonia in children. A total of 90 children with severe bronchial pneumonia who were treated in our hospital from March 2013 to November 2013 were selected as the research subjects. Based on a randomized controlled principle, the children were divided into a control group, test group I and test group II according to their time of hospital admission, with 30 in each group. Patients in the control group were given conventional therapy; test group I was given holistic nursing combined with conventional therapy; test group II was given oxygen-driving and atomized Mucosolvan inhalation combined with holistic nursing on the basis of conventional therapy. After testing, the differences in main symptoms among the control group and test groups I and II were not statistically significant (P>0.05). Test group II showed the best curative effect, followed by test group I and then the control group. It can be concluded that oxygen-driving and atomized Mucosolvan inhalation combined with holistic nursing has a certain effect in the treatment of severe bronchial pneumonia in children and is better than holistic nursing alone.

  5. Formal thought disorder, neuropsychology and insight in schizophrenia.

    PubMed

    Barrera, Alvaro; McKenna, Peter J; Berrios, German E

    2009-01-01

    Information provided by patients with schizophrenia and their respective carers is used to study the descriptive psychopathology and neuropsychology of formal thought disorder (FTD). Relatively intellectually preserved schizophrenia patients (n = 31) exhibiting from no to severe positive FTD completed a self-report scale of FTD, a scale of insight as well as several tests of executive and semantic function. The patients' carers completed another scale of FTD to assess the patients' speech. FTD as self-reported by patients was significantly associated with the synonyms test performance and severity of the reality distortion dimension. FTD as assessed by a clinician and by the patients' carers was significantly associated with executive test performance and performance in a test of associative semantics. Overall insight was significantly associated with severity of the reality distortion dimension and graded naming test performance, but was not associated with self-reported FTD or severity of FTD as assessed by the clinician or carers. The self-reported experience of FTD has different clinical and neuropsychological correlates from those of FTD as assessed by clinicians and carers. The assessment of FTD by patients and carers used along with the clinician's assessment may further the study of this group of symptoms. 2009 S. Karger AG, Basel.

  6. Implementing Sympson-Hetter Item-Exposure Control in a Shadow-Test Approach to Constrained Adaptive Testing

    ERIC Educational Resources Information Center

    Veldkamp, Bernard P.; van der Linden, Wim J.

    2008-01-01

    In most operational computerized adaptive testing (CAT) programs, the Sympson-Hetter (SH) method is used to control the exposure of the items. Several modifications and improvements of the original method have been proposed. The Stocking and Lewis (1998) version of the method uses a multinomial experiment to select items. For severely constrained…
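
    For illustration, the basic Sympson-Hetter mechanism can be sketched as follows: each item i carries an exposure-control parameter k_i in (0, 1], and once the CAT's selection rule proposes an item it is actually administered only with probability k_i, otherwise the next-best item is tried. The Python sketch below uses invented item names and parameters and does not reproduce the Stocking and Lewis multinomial variant mentioned above.

      import random

      def sympson_hetter_select(ranked_items, k, rng=random.Random(0)):
          """Administer the first proposed item that survives its exposure-control
          probability k[i]; fall back to the last candidate if all are rejected.

          ranked_items -- item ids ordered by information (most informative first)
          k            -- dict mapping item id -> exposure-control parameter in (0, 1]
          """
          for item in ranked_items:
              if rng.random() <= k.get(item, 1.0):  # administer with probability k[i]
                  return item
          return ranked_items[-1]                   # safety fallback

      # toy usage: item "A" is most informative but heavily throttled
      k = {"A": 0.2, "B": 0.9, "C": 1.0}
      print(sympson_hetter_select(["A", "B", "C"], k))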

  7. Adapting Educational Measurement to the Demands of Test-Based Accountability

    ERIC Educational Resources Information Center

    Koretz, Daniel

    2015-01-01

    Accountability has become a primary function of large-scale testing in the United States. The pressure on educators to raise scores is vastly greater than it was several decades ago. Research has shown that high-stakes testing can generate behavioral responses that inflate scores, often severely. I argue that because of these responses, using…

  8. Goodness-of-fit tests for discrete data: a review and an application to a health impairment scale.

    PubMed

    Horn, S D

    1977-03-01

    We review the advantages and disadvantages of several goodness-of-fit tests which may be used with discrete data: the multinomial test, the likelihood ratio test, the χ² test, the two-stage χ² test and the discrete Kolmogorov-Smirnov test. Although the χ² test is the best known and most widely used of these tests, its use with small sample sizes is controversial. If one has data which fall into ordered categories, then the discrete Kolmogorov-Smirnov test is an exact test which uses the information from the ordering and can be used for small sample sizes. We illustrate these points with an example of several analyses of health impairment data.
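
    Two of the tests reviewed here are directly available in SciPy: Pearson's χ² test and the likelihood ratio (G) test, the latter via power_divergence with lambda_="log-likelihood". A minimal sketch with invented counts follows; the discrete Kolmogorov-Smirnov test favoured in the paper needs a purpose-built exact routine and is not shown.

      from scipy.stats import chisquare, power_divergence

      # hypothetical observed counts over four ordered impairment categories
      observed = [18, 32, 27, 23]
      expected = [25, 25, 25, 25]  # uniform null hypothesis

      chi2_stat, chi2_p = chisquare(observed, f_exp=expected)
      g_stat, g_p = power_divergence(observed, f_exp=expected,
                                     lambda_="log-likelihood")  # likelihood ratio (G) test

      print(f"Pearson chi-square: stat={chi2_stat:.2f}, p={chi2_p:.3f}")
      print(f"Likelihood ratio G: stat={g_stat:.2f}, p={g_p:.3f}")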

  9. Minor salivary glands and labial mucous membrane graft in the treatment of severe symblepharon and dry eye in patients with Stevens-Johnson syndrome.

    PubMed

    Sant' Anna, Ana Estela B P P; Hazarbassanov, Rossen M; de Freitas, Denise; Gomes, José Álvaro P

    2012-02-01

    To evaluate minor salivary glands and labial mucous membrane graft in patients with severe symblepharon and dry eye secondary to Stevens-Johnson syndrome (SJS). A prospective, non-comparative, interventional case series of 19 patients with severe symblepharon and dry eye secondary to SJS who underwent labial mucous membrane and minor salivary glands transplantation. A complete ophthalmic examination including the Schirmer I test was performed prior to and following surgery. All patients had a preoperative Schirmer I test value of zero. Nineteen patients with severe symblepharon and dry eye secondary to SJS were included in the study. There was a statistically significant improvement in the best spectacle-corrected visual acuity in eight patients (t test; p=0.0070). Values obtained in the Schirmer I test improved significantly in 14 eyes (73.7%) 6 months following surgery (χ² test; p=0.0094). A statistically significant increase in tear production (Schirmer I test) was found in eyes that received more than 10 glands per graft compared with eyes that received fewer glands (χ² test; p=0.0096). Corneal transparency improved significantly in 11 (72.2%) eyes and corneal neovascularisation improved significantly in five eyes (29.4%) (McNemar test; p=0.001 and p=0.0005). The symptoms questionnaire revealed improvement in foreign body sensation in 53.6% of the patients, in photophobia in 50.2% and in pain in 54.8% (Kruskal-Wallis test; p=0.0167). Labial mucous membrane and minor salivary glands transplantation were found to constitute a good option for the treatment of severe symblepharon and dry eye secondary to SJS. This may be considered as a step prior to limbal stem cell and corneal transplantation in these patients.

  10. Evaluation of the Waggoner Computerized Color Vision Test.

    PubMed

    Ng, Jason S; Self, Eriko; Vanston, John E; Nguyen, Andrew L; Crognale, Michael A

    2015-04-01

    Clinical color vision evaluation has been based primarily on the same set of tests for the past several decades. Recently, computer-based color vision tests have been devised, and these have several advantages but are still not widely used. In this study, we evaluated the Waggoner Computerized Color Vision Test (CCVT), which was developed for widespread use with common computer systems. A sample of subjects with (n = 59) and without (n = 361) color vision deficiency (CVD) were tested on the CCVT, the anomaloscope, the Richmond HRR (Hardy-Rand-Rittler) (4th edition), and the Ishihara test. The CCVT was administered in two ways: (1) on a computer monitor using its default settings and (2) on one standardized to a correlated color temperature (CCT) of 6500 K. Twenty-four subjects with CVD performed the CCVT both ways. Sensitivity, specificity, and correct classification rates were determined. The screening performance of the CCVT was good (95% sensitivity, 100% specificity). The CCVT classified subjects as deutan or protan in agreement with anomaloscopy 89% of the time. It generally classified subjects as having a more severe defect compared with other tests. Results from 18 of the 24 subjects with CVD tested under both default and calibrated CCT conditions were the same, whereas the results from 6 subjects had better agreement with other test results when the CCT was set. The Waggoner CCVT is an adequate color vision screening test with several advantages and appears to provide a fairly accurate diagnosis of deficiency type. Used in conjunction with other color vision tests, it may be a useful addition to a color vision test battery.

  11. Neurophysiological testing in anorectal disorders

    PubMed Central

    Remes-Troche, Jose M; Rao, Satish SC

    2013-01-01

    Neurophysiological tests of anorectal function can provide useful information regarding the integrity of neuronal innervation, as well as neuromuscular function. This information can give insights regarding the pathophysiological mechanisms that lead to several disorders of anorectal function, particularly fecal incontinence, pelvic floor disorders and dyssynergic defecation. Currently, several tests are available for the neurophysiological evaluation of anorectal function. These tests are mostly performed on patients referred to tertiary care centers, either following negative evaluations or when there is lack of response to conventional therapy. Judicious use of these tests can reveal significant and new understanding of the underlying mechanism(s) that could pave the way for better management of these disorders. In addition, these techniques are complementary to other modalities of investigation, such as pelvic floor imaging. The most commonly performed neurophysiological tests, along with their indications and clinical utility are discussed. Several novel techniques are evolving that may reveal new information on brain–gut interactions. PMID:19072383

  12. Standardized likelihood ratio test for comparing several log-normal means and confidence interval for the common mean.

    PubMed

    Krishnamoorthy, K; Oral, Evrim

    2017-12-01

    Standardized likelihood ratio test (SLRT) for testing the equality of means of several log-normal distributions is proposed. The properties of the SLRT and an available modified likelihood ratio test (MLRT) and a generalized variable (GV) test are evaluated by Monte Carlo simulation and compared. Evaluation studies indicate that the SLRT is accurate even for small samples, whereas the MLRT could be quite liberal for some parameter values, and the GV test is in general conservative and less powerful than the SLRT. Furthermore, a closed-form approximate confidence interval for the common mean of several log-normal distributions is developed using the method of variance estimate recovery, and compared with the generalized confidence interval with respect to coverage probabilities and precision. Simulation studies indicate that the proposed confidence interval is accurate and better than the generalized confidence interval in terms of coverage probabilities. The methods are illustrated using two examples.
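
    The standardization step that gives the SLRT its small-sample accuracy is not reproduced here. As a point of reference only, the ordinary (unstandardized) likelihood ratio test of equal log-normal means can be sketched as below, with the restricted fit (a common value of μ_i + σ_i²/2 across groups) obtained numerically and all data simulated for illustration.

      import numpy as np
      from scipy.optimize import minimize
      from scipy.stats import chi2

      def lognormal_loglik(logx, mu, sigma):
          # log-likelihood on the log scale; the Jacobian term is constant and cancels
          return -len(logx) * np.log(sigma) - np.sum((logx - mu) ** 2) / (2 * sigma ** 2)

      def lrt_equal_lognormal_means(samples):
          """Ordinary (unstandardized) LRT of H0: all log-normal means are equal."""
          logs = [np.log(np.asarray(s, float)) for s in samples]
          k = len(logs)

          # unrestricted MLEs: separate mu and sigma for each group
          l1 = sum(lognormal_loglik(lx, lx.mean(), lx.std()) for lx in logs)

          # restricted fit: common eta = mu_i + sigma_i**2 / 2 for every group
          def neg_loglik(params):
              eta, log_sigmas = params[0], params[1:]
              total = 0.0
              for lx, ls in zip(logs, log_sigmas):
                  sigma = np.exp(ls)
                  total += lognormal_loglik(lx, eta - sigma ** 2 / 2.0, sigma)
              return -total

          start = np.r_[np.mean([lx.mean() + lx.var() / 2 for lx in logs]),
                        [np.log(lx.std()) for lx in logs]]
          l0 = -minimize(neg_loglik, start, method="Nelder-Mead").fun

          stat = max(0.0, 2.0 * (l1 - l0))
          return stat, chi2.sf(stat, df=k - 1)   # approx chi-square with k-1 df

      rng = np.random.default_rng(1)
      groups = [rng.lognormal(0.0, 0.5, 30), rng.lognormal(0.1, 0.8, 25)]
      print(lrt_equal_lognormal_means(groups))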

  13. Association Between Severe Vitamin D Deficiency, Lung Function and Asthma Control.

    PubMed

    Beyhan-Sagmen, Seda; Baykan, Ozgur; Balcan, Baran; Ceyhan, Berrin

    2017-04-01

    To examine the relationship between severe vitamin D deficiency, asthma control, and pulmonary function in Turkish adults with asthma. One hundred six asthmatic patients underwent pulmonary function testing and skin prick testing; peripheral blood eosinophil counts, IgE, body mass index and vitamin D levels were determined. Patients were divided into 2 subgroups according to vitamin D levels (vitamin D level <10 ng/ml and vitamin D level ≥10 ng/ml). Asthma control tests were performed. The mean age of subgroup I (vitamin D level <10 ng/ml) was 37±10 years and the mean age of subgroup II (vitamin D level ≥10 ng/ml) was 34±8 years. Sixty-six percent of patients had severe vitamin D deficiency (vitamin D level <10 ng/ml). There was a significant trend towards lower absolute FEV1 (L) values in patients with lower vitamin D levels (P=.001). Asthma control test scores were significantly lower in the severe deficiency group than in the other group (P=.02). There were a greater number of patients with uncontrolled asthma (asthma control test scores <20) in the severe vitamin D deficiency group (P=.040). Patients with severe vitamin D deficiency had higher usage of inhaled corticosteroids than the group without severe vitamin D deficiency (P=.015). There was a significant correlation between lower absolute FEV1 (L) values and lower vitamin D levels (P=.005, r=.272). Vitamin D levels were inversely related to body mass index (P=.046). The incidence of severe vitamin D deficiency was high in adult Turkish asthmatics. In addition, lower vitamin D levels were associated with poor asthma control and decreased pulmonary function. Copyright © 2016 SEPAR. Published by Elsevier España, S.L.U. All rights reserved.

  14. Cognitive Decline in Down Syndrome: A Validity/Reliability Study of the Test for Severe Impairment.

    ERIC Educational Resources Information Center

    Cosgrave, Mary P.; McCarron, Mary; Anderson, Mary; Tyrrell, Janette; Gill, Michael; Lawlor, Brian A.

    1998-01-01

    The utility of the Test for Severe Impairment was studied with 60 older persons who had Down Syndrome. Construct validity, test-retest reliability, and interrater reliability were established for the full study group and for subgroups based on degree of mental retardation and dementia status. Some possible applications and limitations of the test…

  15. Considering the Use of General and Modified Assessment Items in Computerized Adaptive Testing

    ERIC Educational Resources Information Center

    Wyse, Adam E.; Albano, Anthony D.

    2015-01-01

    This article used several data sets from a large-scale state testing program to examine the feasibility of combining general and modified assessment items in computerized adaptive testing (CAT) for different groups of students. Results suggested that several of the assumptions made when employing this type of mixed-item CAT may not be met for…

  16. A Pilot Study of a Test for Visual Recognition Memory in Adults with Moderate to Severe Intellectual Disability

    ERIC Educational Resources Information Center

    Pyo, Geunyeong; Ala, Tom; Kyrouac, Gregory A.; Verhulst, Steven J.

    2010-01-01

    Objective assessment of memory functioning is an important part of evaluation for Dementia of Alzheimer Type (DAT). The revised Picture Recognition Memory Test (r-PRMT) is a test for visual recognition memory to assess memory functioning of persons with intellectual disabilities (ID), specifically targeting moderate to severe ID. A pilot study was…

  17. United States nuclear tests, July 1945 through September 1992

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1994-12-01

    This document lists chronologically and alphabetically by name all nuclear tests and simultaneous detonations conducted by the United States from July 1945 through September 1992. Several tests conducted during Operation Dominic involved missile launches from Johnston Atoll. Several of these missile launches were aborted, resulting in the destruction of the missile and nuclear device either on the pad or in the air.

  18. FORTRAN implementation of Friedman's test for several related samples

    NASA Technical Reports Server (NTRS)

    Davidson, S. A.

    1982-01-01

    The FRIEDMAN program is a FORTRAN-coded implementation of Friedman's nonparametric test for several related samples with one observation per treatment/block combination, or as it is sometimes called, the two-way analysis of variance by ranks. The FRIEDMAN program is described and a test data set and its results are presented to aid potential users of this program.
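
    A modern counterpart of the FRIEDMAN routine is available off the shelf: SciPy's friedmanchisquare implements the same two-way analysis of variance by ranks for several related samples with one observation per treatment/block combination. A toy example with invented data:

      from scipy.stats import friedmanchisquare

      # hypothetical data: three treatments measured on the same six blocks
      treatment_a = [8.2, 7.9, 9.1, 6.5, 7.0, 8.8]
      treatment_b = [7.5, 7.7, 8.4, 6.1, 6.8, 8.0]
      treatment_c = [6.9, 7.1, 8.0, 5.9, 6.2, 7.5]

      stat, p = friedmanchisquare(treatment_a, treatment_b, treatment_c)
      print(f"Friedman chi-square = {stat:.3f}, p = {p:.4f}")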

  19. A Validity Study of the Working Group's Orientation Test for Individuals with Moderate to Severe Intellectual Disability

    ERIC Educational Resources Information Center

    Pyo, G.; Curtis, K.; Curtis, R.; Markwell, S.

    2009-01-01

    Background: Decline in orientation skill has been reported as an early indicator of Dementia of Alzheimer's Type (DAT). The Orientation subtest of the Working Group's Test Battery was examined to determine whether it is useful for identifying DAT patients among adults with moderate to severe ID. Methods: Sixteen DAT patients and 35 non-demented normal controls…

  20. A risk prediction model for severe intraventricular hemorrhage in very low birth weight infants and the effect of prophylactic indomethacin.

    PubMed

    Luque, M J; Tapia, J L; Villarroel, L; Marshall, G; Musante, G; Carlo, W; Kattan, J

    2014-01-01

    To develop a risk prediction model for severe intraventricular hemorrhage (IVH) in very low birth weight infants (VLBWI). Prospectively collected data on infants with birth weight 500 to 1249 g born between 2001 and 2010 in centers of the Neocosur Network were used. A forward stepwise logistic regression model was employed. The model was tested in the 2011 cohort and then applied to the population of VLBWI that received prophylactic indomethacin to analyze its effect on the risk of severe IVH. Data from 6538 VLBWI were analyzed. The area under the ROC curve for the model was 0.79, and 0.76 when tested in the 2011 cohort. The prophylactic indomethacin group had a lower incidence of severe IVH, especially in the highest-risk groups. A model for early prediction of severe IVH was developed and tested in our population. Prophylactic indomethacin was associated with a lower risk-adjusted incidence of severe IVH.
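
    The general pipeline described here, a logistic regression model developed on one cohort and checked on a later cohort via the area under the ROC curve, can be sketched in a few lines of Python. The predictors, coefficients and data below are invented, and the authors' forward stepwise variable selection is not reproduced.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(0)

      def simulate(n):
          # hypothetical predictors: birth weight (g), gestational age (wk), 5-min Apgar
          X = rng.normal([900, 27, 7], [200, 2, 2], size=(n, 3))
          logit = 4.0 - 0.004 * X[:, 0] - 0.05 * X[:, 1] - 0.1 * X[:, 2]
          y = rng.binomial(1, 1 / (1 + np.exp(-logit)))  # 1 = severe IVH (simulated)
          return X, y

      X_dev, y_dev = simulate(500)   # development cohort
      X_val, y_val = simulate(200)   # later validation cohort

      model = LogisticRegression(max_iter=1000).fit(X_dev, y_dev)

      # discrimination in the development and validation cohorts
      auc_dev = roc_auc_score(y_dev, model.predict_proba(X_dev)[:, 1])
      auc_val = roc_auc_score(y_val, model.predict_proba(X_val)[:, 1])
      print(f"AUC (development) = {auc_dev:.2f}, AUC (validation) = {auc_val:.2f}")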

  1. Different Cognitive Profiles of Patients with Severe Aphasia.

    PubMed

    Marinelli, Chiara Valeria; Spaccavento, Simona; Craca, Angela; Marangolo, Paola; Angelelli, Paola

    2017-01-01

    Cognitive dysfunction frequently occurs in aphasic patients and primarily compromises linguistic skills. However, patients suffering from severe aphasia show heterogeneous performance in basic cognition. Our aim was to characterize the cognitive profiles of patients with severe aphasia and to determine whether they also differ as to residual linguistic abilities. We examined 189 patients with severe aphasia with standard language tests and with the CoBaGA (Cognitive Test Battery for Global Aphasia), a battery of nonverbal tests that assesses a wide range of cognitive domains such as attention, executive functions, intelligence, memory, visual-auditory recognition, and visual-spatial abilities. Twenty patients were also followed longitudinally in order to assess their improvement in cognitive skills after speech therapy. Three different subgroups of patients with different types and severity of cognitive impairment were evidenced. Subgroups differed as to residual linguistic skills, in particular comprehension and reading-writing abilities. Attention, reasoning, and executive functions improved after language rehabilitation. This study highlights the importance of an extensive evaluation of cognitive functions in patients with severe aphasia.

  2. Leptospirosis Outbreak following Severe Flooding: A Rapid Assessment and Mass Prophylaxis Campaign; Guyana, January–February 2005

    PubMed Central

    Dechet, Amy M.; Parsons, Michele; Rambaran, Madan; Mohamed-Rambaran, Pheona; Florendo-Cumbermack, Anita; Persaud, Shamdeo; Baboolal, Shirematee; Ari, Mary D.; Shadomy, Sean V.; Zaki, Sherif R.; Paddock, Christopher D.; Clark, Thomas A.; Harris, Lazenia; Lyon, Douglas; Mintz, Eric D.

    2012-01-01

    Background Leptospirosis is a zoonosis usually transmitted through contact with water or soil contaminated with urine from infected animals. Severe flooding can put individuals at greater risk for contracting leptospirosis in endemic areas. Rapid testing for the disease and large-scale interventions are necessary to identify and control infection. We describe a leptospirosis outbreak following severe flooding and a mass chemoprophylaxis campaign in Guyana. Methodology/Principal Findings From January–March 2005, we collected data on suspected leptospirosis hospitalizations and deaths. Laboratory testing included anti-leptospiral dot enzyme immunoassay (DST), immunohistochemistry (IHC) staining, and microscopic agglutination testing (MAT). DST testing was conducted for 105 (44%) of 236 patients; 52 (50%) tested positive. Four (57%) of the paired serum samples tested by MAT were confirmed as leptospirosis. Of 34 total deaths attributed to leptospirosis, postmortem samples from 10 (83%) of 12 patients were positive by IHC. Of 201 patients interviewed, 89% reported direct contact with flood waters. A 3-week doxycycline chemoprophylaxis campaign reached over 280,000 people. Conclusions A confirmed leptospirosis outbreak in Guyana occurred after severe flooding, resulting in a massive chemoprophylaxis campaign to try to limit morbidity and mortality. PMID:22808049

  3. A Validity Study of the Working Group's Autobiographical Memory Test for Individuals with Moderate to Severe Intellectual Disability

    ERIC Educational Resources Information Center

    Pyo, Geunyeong; Ala, Tom; Kyrouac, Gregory A.; Verhulst, Steven J.

    2011-01-01

    The purpose of the present study was to investigate the validity of the Working Group's Autobiographical Memory Test as a dementia screening tool for individuals with moderate to severe intellectual disabilities (ID). Twenty-one participants with Dementia of Alzheimer's Type (DAT) and moderate to severe ID and 42 controls with similar levels of ID…

  4. Extension of the Caucasus Seismic Information Network Study into Central Asia

    DTIC Science & Technology

    2008-09-01

    nuclear tests at the Semipalatinsk test site in Kazakhstan, Lop Nor in China, Pokharan in India, and Chagai in Pakistan, as well as for several peaceful nuclear explosion (PNE) events...truth in tomography studies. Figures 5 and 6 show waveforms for a nuclear explosion at the Semipalatinsk Test Site in northeast Kazakhstan and for a

  5. Double Cantilever Beam Fracture Toughness Testing of Several Composite Materials

    NASA Technical Reports Server (NTRS)

    Kessler, Jeff A.; Adams, Donald F.

    1992-01-01

    Double-cantilever beam fracture toughness tests were performed by the Composite Materials Research Group on several different unidirectional composite materials provided by NASA Langley Research Center. The composite materials consisted of Hercules IM-7 carbon fiber and various matrix resin formulations. Multiple formulations of four different families of matrix resins were tested: LaRC-ITPI, LaRC-IA, RPT46T, and RP67/RP55. The report presents the materials tested and pertinent details supplied by NASA. For each material, three replicate specimens were tested, and multiple crack extensions were performed on each replicate.

  6. Diagnosis of Helicobacter pylori infection: Current options and developments

    PubMed Central

    Wang, Yao-Kuang; Kuo, Fu-Chen; Liu, Chung-Jung; Wu, Meng-Chieh; Shih, Hsiang-Yao; Wang, Sophie SW; Wu, Jeng-Yih; Kuo, Chao-Hung; Huang, Yao-Kang; Wu, Deng-Chyang

    2015-01-01

    Accurate diagnosis of Helicobacter pylori (H. pylori) infection is a crucial part of the effective management of many gastroduodenal diseases. Several invasive and non-invasive diagnostic tests are available for the detection of H. pylori, and each test has its usefulness and limitations in different clinical situations. Although none can be considered a single gold standard in clinical practice, several techniques have been developed to give more reliable results. Invasive tests are performed on endoscopic biopsy specimens and include histology, culture, the rapid urease test and molecular methods. Developments in endoscopic equipment also contribute to the real-time diagnosis of H. pylori during endoscopy. The urea breath test and stool antigen test are the most widely used non-invasive tests, whereas serology is useful in screening and epidemiological studies. Molecular methods have been applied to various specimens other than gastric mucosa. Beyond the detection of H. pylori infection, several tests have been introduced for the evaluation of virulence factors and antibiotic sensitivity of H. pylori, as well as for screening precancerous lesions and gastric cancer. The aim of this article is to review the current options and novel developments in diagnostic tests and their applications in different clinical conditions or for specific purposes. PMID:26523098

  7. Physical fitness and levels of physical activity in people with severe mental illness: a cross-sectional study.

    PubMed

    Perez-Cruzado, David; Cuesta-Vargas, Antonio I; Vera-Garcia, Elisa; Mayoral-Cleries, Fermín

    2017-01-01

    Physical fitness is a crucial variable in people with severe mental illness, as these people could become more independent and improve their job opportunities. The present study compared the physical fitness of physically active and inactive people with severe mental illness. Physical fitness was evaluated in sixty-two people with severe mental illness using 11 physical tests covering strength, flexibility, balance and aerobic condition. Significant differences were found between the two groups in muscle strength (handgrip test) and balance (single leg balance test and functional reach), with better performance in the group of physically active people. The results of the present study suggest that physical fitness (strength and balance) is higher in people with severe mental illness who practise regular physical activity than in those who are inactive. Physically active people may have a reduced risk of falls and fractures due to their higher levels of physical fitness.

  8. The NASA broad-specification fuels combustion technology program: An assessment of phase 1 test results

    NASA Technical Reports Server (NTRS)

    Fear, J. S.

    1983-01-01

    An assessment is made of the results of Phase 1 screening testing of current and advanced combustion system concepts using several broadened-properties fuels. The severity of each of several fuels-properties effects on combustor performance or liner life is discussed, as well as design techniques with the potential to offset these adverse effects. The selection of concepts to be pursued in Phase 2 refinement testing is described. This selection takes into account the relative costs and complexities of the concepts, the current outlook on pollutant emissions control, and practical operational problems.

  9. Coccidioides precipitin test

    MedlinePlus

    Coccidioidomycosis antibody test; Coccidioides blood test; Valley fever blood test ... There is no special preparation for the test. ... The precipitin test is one of several tests that can be done to determine if you are infected with coccidioides, which ...

  10. Ovulation home test

    MedlinePlus

    ... test (home test); Ovulation prediction test; Ovulation predictor kit; Urinary LH immunoassays; At-home ovulation prediction test; ... Ovulation prediction test kits most often come with five to seven sticks. You may need to test for several days to detect a ...

  11. Case Report of a Hypobaric Chamber Fitness to Fly Test in a Child With Severe Cystic Lung Disease.

    PubMed

    Loo, Sarah; Campbell, Andrew; Vyas, Julian; Pillarisetti, Naveen

    2017-07-01

    Patients with severe cystic lung disease are considered to be at risk for cyst rupture during air travel because of the possibility of increase in cyst size and impaired equilibration of pressure between the cysts and other parts of the lung. This may have clinically devastating consequences for the patient but may also result in significant costs for emergency alteration of flight schedule. We report the use of a hypobaric chamber to simulate cabin pressure changes encountered on a commercial flight to assess the safety to fly of a child with severe cystic lung disease secondary to Langerhans cell histiocytosis. The test did not result in an air leak, and the child subsequently undertook air travel without mishap. This is the first reported use of a hypobaric chamber test in a child with severe cystic lung disease. This test has the potential to be used as a fitness to fly test in children at risk for air leak syndromes who are being considered for air travel. Copyright © 2017 by the American Academy of Pediatrics.

  12. Utility of behavioral versus cognitive measures in differentiating between subtypes of frontotemporal lobar degeneration and Alzheimer's disease.

    PubMed

    Heidler-Gary, Jennifer; Gottesman, Rebecca; Newhart, Melissa; Chang, Shannon; Ken, Lynda; Hillis, Argye E

    2007-01-01

    We hypothesized that a modified version of the Frontal Behavioral Inventory (FBI-mod), along with a few cognitive tests, would be clinically useful in distinguishing between clinically defined Alzheimer's disease (AD) and subtypes of frontotemporal lobar degeneration (FTLD): frontotemporal dementia (dysexecutive type), progressive nonfluent aphasia, and semantic dementia. We studied 80 patients who were diagnosed with AD (n = 30) or FTLD (n = 50), on the basis of a comprehensive neuropsychological battery, imaging, neurological examination, and history. We found significant between-group differences on the FBI-mod, two subtests of the Rey Auditory Verbal Learning Test (verbal learning and delayed recall), and the Trail Making Test Part B (one measure of 'executive functioning'). AD was characterized by relatively severe impairment in verbal learning, delayed recall, and executive functioning, with relatively normal scores on the FBI-mod. Frontotemporal dementia was characterized by relatively severe impairment on the FBI-mod and executive functioning in the absence of severe impairment in verbal learning and recall. Progressive nonfluent aphasia was characterized by severe impairment in executive functioning with relatively normal scores on verbal learning and recall and FBI-mod. Finally, semantic dementia was characterized by relatively severe deficits in delayed recall, but relatively normal performance on new learning, executive functioning, and on FBI-mod. Discriminant function analysis confirmed that the FBI-mod, in conjunction with the Rey Auditory Verbal Learning Test, and the Trail Making Test Part B categorized the majority of patients as subtypes of FTLD or AD in the same way as a full neuropsychological battery, neurological examination, complete history, and imaging. These tests may be useful for efficient clinical diagnosis, although progressive nonfluent aphasia and semantic dementia are likely to be best distinguished by language tests not included in standard neuropsychological test batteries.

  13. Effects of neural progenitor cells on post-stroke neurological impairment—a detailed and comprehensive analysis of behavioral tests

    PubMed Central

    Doeppner, Thorsten R.; Kaltwasser, Britta; Bähr, Mathias; Hermann, Dirk M.

    2014-01-01

    Systemic transplantation of neural progenitor cells (NPCs) in rodents reduces functional impairment after cerebral ischemia. In light of upcoming stroke trials regarding safety and feasibility of NPC transplantation, experimental studies have to successfully analyze the extent of NPC-induced neurorestoration on the functional level. However, appropriate behavioral tests for analysis of post-stroke motor coordination deficits and cognitive impairment after NPC grafting are not fully established. We therefore exposed male C57BL6 mice to either 45 min (mild) or 90 min (severe) of cerebral ischemia, using the thread occlusion model, followed by intravenous injection of PBS or NPCs 6 h post-stroke, with an observation period of three months. Post-stroke motor coordination was assessed by means of the rota rod, tight rope, corner turn, inclined plane, grip strength, foot fault, adhesive removal, pole test and balance beam test, whereas cognitive impairment was analyzed using the water maze, the open field and the passive avoidance test. Significant motor coordination differences after both mild and severe cerebral ischemia in favor of NPC-treated mice were observed for each motor coordination test except for the inclined plane and the grip strength test, which only showed significant differences after severe cerebral ischemia. Cognitive impairment after mild cerebral ischemia was successfully assessed using the water maze test, the open field and the passive avoidance test. On the contrary, the water maze test was not suitable in the severe cerebral ischemia paradigm, as it depends too much on the motor coordination capabilities of the test mice. In terms of both reliability and cost-effectiveness considerations, we thus recommend the corner turn, foot fault, balance beam, and open field test, which do not depend on durations of cerebral ischemia. PMID:25374509

  14. Evaluation of a PfHRP2 and a pLDH-based Rapid Diagnostic Test for the Diagnosis of Severe Malaria in 2 Populations of African Children

    PubMed Central

    Hendriksen, Ilse C. E.; Mtove, George; Pedro, Alínia José; Gomes, Ermelinda; Silamut, Kamolrat; Lee, Sue J.; Mwambuli, Abraham; Gesase, Samwel; Reyburn, Hugh; Day, Nicholas P. J.; White, Nicholas J.; von Seidlein, Lorenz

    2011-01-01

    Background. Rapid diagnostic tests (RDTs) now play an important role in the diagnosis of falciparum malaria in many countries where the disease is endemic. Although these tests have been extensively evaluated in uncomplicated falciparum malaria, reliable data on their performance for diagnosing potentially lethal severe malaria is lacking. Methods. We compared a Plasmodium falciparum histidine-rich-protein2 (PfHRP2)–based RDT and a Plasmodium lactate dehydrogenase (pLDH)–based RDT with routine microscopy of a peripheral blood slide and expert microscopy as a reference standard for the diagnosis of severe malaria in 1898 children who presented with severe febrile illness at 2 centers in Mozambique and Tanzania. Results. The overall sensitivity, specificity, positive predictive value, and negative predictive values of the PfHRP2-based test were 94.0%, 70.9%, 85.4%, and 86.8%, respectively, and for the pLDH-based test, the values were 88.0%, 88.3%, 93.2%, and 80.3%, respectively. At parasite counts <1000 parasites/μL (n = 173), sensitivity of the pLDH-based test was low (45.7%), compared with that of the PfHRP2-based test (69.9%). Both RDTs performed better than did the routine slide reading in a clinical laboratory as assessed in 1 of the centers. Conclusion. The evaluated PfHRP2-based RDT is an acceptable alternative to routine microscopy for diagnosing severe malaria in African children and performed better than did the evaluated pLDH-based RDT. PMID:21467015
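
    The four summary measures quoted above follow directly from a 2 × 2 table of RDT result against the reference standard. A short sketch with invented counts (not the study's actual data):

      def diagnostic_metrics(tp, fp, fn, tn):
          """Sensitivity, specificity, PPV and NPV from a 2x2 table
          (test result vs. reference standard)."""
          return {
              "sensitivity": tp / (tp + fn),
              "specificity": tn / (tn + fp),
              "ppv": tp / (tp + fp),
              "npv": tn / (tn + fn),
          }

      # hypothetical counts: rows = RDT positive/negative, columns = reference +/-
      metrics = diagnostic_metrics(tp=940, fp=210, fn=60, tn=510)
      for name, value in metrics.items():
          print(f"{name}: {value:.1%}")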

  15. Heart Health Tests

    MedlinePlus

    ... early, when it is easier to treat. Blood tests and heart health tests can help find heart diseases or identify problems ... There are several different types of heart health tests. Your doctor will decide which test or tests ...

  16. AIDS

    MedlinePlus

    ... people with HIV infection and AIDS. Exams and Tests DIAGNOSTIC TESTS These are tests that are done to check if you've ... general, testing is a 2-step process: Screening test. There are several kinds of tests. Some are ...

  17. A MAOA gene*cocaine severity interaction on impulsivity and neuropsychological measures of orbitofrontal dysfunction: preliminary results.

    PubMed

    Verdejo-García, Antonio; Albein-Urios, Natalia; Molina, Esther; Ching-López, Ana; Martínez-González, José M; Gutiérrez, Blanca

    2013-11-01

    Based on previous evidence of a MAOA gene*cocaine use interaction on orbitofrontal cortex volume attrition, we tested whether the MAOA low-activity variant and cocaine use severity are interactively associated with impulsivity and with behavioral indices of orbitofrontal dysfunction: emotion recognition and decision-making. 72 cocaine-dependent individuals and 52 non-drug-using controls (including healthy individuals and problem gamblers) were genotyped for the MAOA gene and tested using the UPPS-P Impulsive Behavior Scale, the Iowa Gambling Task and the Ekman's Facial Emotions Recognition Test. To test the main hypothesis, we conducted hierarchical multiple regression analyses including three sets of predictors: (1) age, (2) MAOA genotype and severity of cocaine use, and (3) the interaction between MAOA genotype and severity of cocaine use. UPPS-P, Ekman Test and Iowa Gambling Task scores were the outcome measures. We computed the statistical significance of the prediction change yielded by each consecutive set, with 'a priori' interest in the MAOA*cocaine severity interaction. We found significant effects of the MAOA gene*cocaine use severity interaction on the emotion recognition scores and on the UPPS-P dimensions of Positive Urgency and Sensation Seeking: low-activity carriers with higher cocaine exposure had poorer emotion recognition and higher Positive Urgency and Sensation Seeking. Cocaine users carrying the MAOA low-activity variant show a greater impact of cocaine use on impulsivity and on behavioral measures of orbitofrontal cortex dysfunction. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
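
    The hierarchical regression strategy described here, entering the genotype-by-severity product in a final step and testing the change in prediction, can be sketched with statsmodels. The variable names and simulated data below are hypothetical, and the block comparisons are done with nested-model F tests.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(0)
      n = 124
      df = pd.DataFrame({
          "age": rng.normal(35, 8, n),
          "maoa_low": rng.integers(0, 2, n),        # 1 = low-activity MAOA carrier
          "cocaine_severity": rng.normal(0, 1, n),  # standardized severity of use
      })
      df["positive_urgency"] = (0.3 * df.maoa_low * df.cocaine_severity
                                + rng.normal(0, 1, n))

      # step 1: age; step 2: main effects; step 3: genotype x severity interaction
      m1 = smf.ols("positive_urgency ~ age", df).fit()
      m2 = smf.ols("positive_urgency ~ age + maoa_low + cocaine_severity", df).fit()
      m3 = smf.ols("positive_urgency ~ age + maoa_low * cocaine_severity", df).fit()

      # significance of each added block = F test between nested models
      print(m2.compare_f_test(m1))  # returns (F statistic, p value, df difference)
      print(m3.compare_f_test(m2))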

  18. Dimensional indicators of generalized anxiety disorder severity for DSM-V.

    PubMed

    Niles, Andrea N; Lebeau, Richard T; Liao, Betty; Glenn, Daniel E; Craske, Michelle G

    2012-03-01

    For DSM-V, simple dimensional measures of disorder severity will accompany diagnostic criteria. The current studies examine convergent validity and test-retest reliability of two potential dimensional indicators of worry severity for generalized anxiety disorder (GAD): percent of the day worried and number of worry domains. In study 1, archival data from diagnostic interviews from a community sample of individuals diagnosed with one or more anxiety disorders (n = 233) were used to assess correlations of percent of the day worried and number of worry domains with other measures of worry severity (clinical severity rating (CSR), age of onset, number of comorbid disorders, Penn State Worry Questionnaire (PSWQ)) and DSM-IV criteria (excessiveness, uncontrollability and number of physical symptoms). Both measures were significantly correlated with CSR and number of comorbid disorders, and with all three DSM-IV criteria. In study 2, test-retest reliability of percent of the day worried and number of worry domains was compared to the test-retest reliability of DSM-IV diagnostic criteria in a non-clinical sample of undergraduate students (n = 97) at a large West Coast university. All measures had low test-retest reliability except percent of the day worried, which had moderate test-retest reliability. Findings suggest that these two indicators capture worry severity, and percent of the day worried may be the most reliable existing indicator. These measures may be useful as dimensional measures for DSM-V. Copyright © 2012 Elsevier Ltd. All rights reserved.

  19. Hibernation-Based Therapy to Improve Survival of Severe Blood Loss

    DTIC Science & Technology

    2016-06-01

    leaks extravascularly • Necrosis and inflammation involving the ear tip is considered to be a more severe manifestation of vascular damage associated...similar lesions to the 2M test solution, it appears that 2M test solution is more likely to cause vascular necrosis and inflammation (noted at 24 hours...injections • Although DMSO induced similar lesions to the 4M test solution, it appears that 4M test solution is more likely to cause vascular necrosis and

  20. Hibernation Based Therapy to Improve Survival of Severe Blood Loss

    DTIC Science & Technology

    2016-06-01

    leaks extravascularly • Necrosis and inflammation involving the ear tip is considered to be a more severe manifestation of vascular damage associated...similar lesions to the 2M test solution, it appears that 2M test solution is more likely to cause vascular necrosis and inflammation (noted at 24 hours...injections • Although DMSO induced similar lesions to the 4M test solution, it appears that 4M test solution is more likely to cause vascular necrosis and

  1. Synthesis of published and unpublished corrosion data from long term tests of fasteners embedded in wood : calculation of corrosion rates and the effect of corrosion on lateral joint strength

    Treesearch

    Samuel L. Zelinka; Douglas R. Rammer

    2011-01-01

    In the past 5 years, several accelerated test methods have been developed to measure the corrosion of metals in contact with wood. It is desirable to contrast these accelerated results against those of long term exposure tests. While there have been several published long-term exposure tests performed on metals in treated wood, the data from these studies could not be...

  2. Validation and clinical utility of the executive function performance test in persons with traumatic brain injury.

    PubMed

    Baum, C M; Wolf, T J; Wong, A W K; Chen, C H; Walker, K; Young, A C; Carlozzi, N E; Tulsky, D S; Heaton, R K; Heinemann, A W

    2017-07-01

    This study examined the relationships between the Executive Function Performance Test (EFPT), the NIH Toolbox Cognitive Function tests, and neuropsychological executive function measures in 182 persons with traumatic brain injury (TBI) and 46 controls to evaluate construct, discriminant, and predictive validity. Construct validity: there were moderate correlations between the EFPT and the NIH Toolbox Crystallized (r = -.479), Fluid (r = -.420), and Total Composite scores (r = -.496). Discriminant validity: significant differences were found in the EFPT total and sequence scores across the control, complicated mild/moderate, and severe TBI groups. We found differences in the organisation score between the control and severe TBI groups, and between the mild and severe TBI groups. Both TBI groups had significantly lower scores in safety and judgement than controls. Compared to the controls, the severe TBI group demonstrated significantly lower performance on all instrumental activities of daily living (IADL) tasks. Compared to the mild TBI group, the controls performed better on the medication task, and the severe TBI group performed worse on the cooking and telephone tasks. Predictive validity: the EFPT predicted self-perceived independence as measured by the TBI-QOL (beta = -0.49, p < .001) for the severe TBI group. Overall, these data support the validity of the EFPT for use in individuals with TBI.

  3. Benefits of a Low Severity Frontal Crash Test

    PubMed Central

    Digges, Kennerly; Dalmotas, Dainius

    2007-01-01

    The US Federal Motor Vehicle Safety Standard for frontal protection requires vehicle crash tests into a rigid barrier with two belted dummies in the front seats. The standard was recently modified to require two separate 56 kph frontal tests. In one test the dummies are 50th percentile males; in the other, the dummies are 5th percentile females. Analysis of crash test data indicates that the 56 kph test does not encourage technology to reduce chest injuries in lower severity crashes. Tests conducted by Transport Canada provide data from belted 5th percentile female dummies in the front seats of vehicles that were subjected to crashes into a rigid barrier at 40 kph. An analysis of the results showed that for many vehicles, the risks of serious chest injuries were higher in the 40 kph test than in the 56 kph test. This paper examines the benefits that would result from a requirement for a low severity (40 kph) frontal barrier crash test with two belted 5th percentile female dummies and more stringent chest injury requirements. A preliminary benefits analysis for a chest deflection allowable in the range of 28 mm to 36 mm was conducted. A standard that limits chest deflection to 34 mm would reduce serious chest injuries by 16% to 24% for the belted population in frontal crashes. PMID:18184499

  4. Benefits of a low severity frontal crash test.

    PubMed

    Digges, Kennerly; Dalmotas, Dainius

    2007-01-01

    The US Federal Motor Vehicle Safety Standard for frontal protection requires vehicle crash tests into a rigid barrier with two belted dummies in the front seats. The standard was recently modified to require two separate 56 kph frontal tests. In one test the dummies are 50th percentile males; in the other, the dummies are 5th percentile females. Analysis of crash test data indicates that the 56 kph test does not encourage technology to reduce chest injuries in lower severity crashes. Tests conducted by Transport Canada provide data from belted 5th percentile female dummies in the front seats of vehicles that were subjected to crashes into a rigid barrier at 40 kph. An analysis of the results showed that for many vehicles, the risks of serious chest injuries were higher in the 40 kph test than in the 56 kph test. This paper examines the benefits that would result from a requirement for a low severity (40 kph) frontal barrier crash test with two belted 5th percentile female dummies and more stringent chest injury requirements. A preliminary benefits analysis for a chest deflection allowable in the range of 28 mm to 36 mm was conducted. A standard that limits chest deflection to 34 mm would reduce serious chest injuries by 16% to 24% for the belted population in frontal crashes.

  5. Fabrication of titanium multi-wall Thermal Protection System (TPS) test panel arrays

    NASA Technical Reports Server (NTRS)

    Blair, W.; Meaney, J. E.; Rosenthal, H. A.

    1980-01-01

    Several arrays were designed and tested. Tests included vibrational and acoustical tests, radiant heating tests, and thermal conductivity tests. A feasible manufacturing technique was established for producing the protection system panels.

  6. Severe hypoglycaemia and late-life cognitive ability in older people with Type 2 diabetes: the Edinburgh Type 2 Diabetes Study.

    PubMed

    Aung, P P; Strachan, M W J; Frier, B M; Butcher, I; Deary, I J; Price, J F

    2012-03-01

    To determine the association between lifetime severe hypoglycaemia and late-life cognitive ability in older people with Type 2 diabetes. Cross-sectional, population-based study of 1066 men and women aged 60-75 years, with Type 2 diabetes. Frequency of severe hypoglycaemia over a person's lifetime and in the year prior to cognitive testing was assessed using a previously validated self-completion questionnaire. Results of age-sensitive neuropsychological tests were combined to derive a late-life general cognitive ability factor, 'g'. Vocabulary test scores, which are stable during ageing, were used to estimate early life (prior) cognitive ability. After age- and sex- adjustment, 'g' was lower in subjects reporting at least one prior severe hypoglycaemia episode (n = 113), compared with those who did not report severe hypoglycaemia (mean 'g'-0.34 vs. 0.05, P < 0.001). Mean vocabulary test scores did not differ significantly between the two groups (30.2 vs. 31.0, P = 0.13). After adjustment for vocabulary, difference in 'g' between the groups persisted (means -0.25 vs. 0.04, P < 0.001), with the group with severe hypoglycaemia demonstrating poorer performance on tests of Verbal Fluency (34.5 vs. 37.3, P = 0.02), Digit Symbol Testing (45.9 vs. 49.9, P = 0.002), Letter-Number Sequencing (9.1 vs. 9.8, P = 0.005) and Trail Making (P < 0.001). These associations persisted after adjustment for duration of diabetes, vascular disease and other potential confounders. Self-reported history of severe hypoglycaemia was associated with poorer late-life cognitive ability in people with Type 2 diabetes. Persistence of this association after adjustment for estimated prior cognitive ability suggests that the association may be attributable, at least in part, to an effect of hypoglycaemia on age-related cognitive decline. © 2011 The Authors. Diabetic Medicine © 2011 Diabetes UK.
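
    The abstract does not specify how the individual test results were combined into the general cognitive ability factor 'g'; one conventional approach is to take the first unrotated principal component of the standardized test scores, sketched here on simulated data with an invented latent structure.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(2)

      # hypothetical scores on four age-sensitive tests for 200 participants,
      # sharing a common latent factor plus test-specific noise
      scores = rng.normal(size=(200, 1)) + 0.8 * rng.normal(size=(200, 4))

      z = StandardScaler().fit_transform(scores)
      pca = PCA(n_components=1).fit(z)
      g = pca.transform(z)[:, 0]  # first principal component ~ general ability 'g'

      print("variance explained by g:", round(pca.explained_variance_ratio_[0], 2))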

  7. Research on Oxidation Wear Behavior of a New Hot Forging Die Steel

    NASA Astrophysics Data System (ADS)

    Shi, Yuanji; Wu, Xiaochun

    2018-01-01

    Dry sliding tests of the hot forging die steel DM were performed in air at test temperatures of 400-700 °C and for durations of 0.5-4 h using a UMT-3 high-temperature wear tester. The wear behavior and characteristics were studied systematically to explore the general characteristics under severe oxidation conditions. The results showed that a mild-to-severe oxidation wear transition occurred with increasing test temperature and duration. The coarsening of unstable M6C carbides was identified as the cause, being responsible for the severe delamination of the tribo-oxide layer. More importantly, intense oxidation wear with lower wear rates was found when the test temperature reached 700 °C or after 4 h of test time at 600 °C, which was closely related to degradation behavior during the wear test. Furthermore, a new schematic diagram of the oxidation wear of DM steel is proposed.

  8. Forebody/Inlet of the Joint Strike Fighter Tested at Low Speeds

    NASA Technical Reports Server (NTRS)

    Johns, Albert L.

    1998-01-01

    As part of a national cooperative effort to develop a multinational fighter aircraft, a model of a Joint Strike Fighter concept was tested in several NASA Lewis Research Center wind tunnels at low speeds over a range of headwind velocities and model attitudes. This Joint Strike Fighter concept, which is scheduled to go into production in 2005, will greatly improve the range, capability, maneuverability, and survivability of fighter aircraft, and the production program could ultimately be worth $100 billion. The test program was a team effort between Lewis and Lockheed Martin Tactical Aircraft Systems. Testing was completed in September 1997, several weeks ahead of schedule, allowing Lockheed additional time to review the results and analysis data before the next test and resulting in significant cost savings for Lockheed. Several major milestones related to dynamic and steady-state data acquisition and overall model performance were reached during this model test. Results from this program will contribute to both the concept demonstration phase and the production aircraft.

  9. 14 CFR 135.293 - Initial and recurrent pilot testing requirements.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ..., high altitude weather; (7) Procedures for— (i) Recognizing and avoiding severe weather situations; (ii) Escaping from severe weather situations, in case of inadvertent encounters, including low-altitude windshear (except that rotorcraft pilots are not required to be tested on escaping from low-altitude...

  10. 14 CFR 135.293 - Initial and recurrent pilot testing requirements.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ..., high altitude weather; (7) Procedures for— (i) Recognizing and avoiding severe weather situations; (ii) Escaping from severe weather situations, in case of inadvertent encounters, including low-altitude windshear (except that rotorcraft pilots are not required to be tested on escaping from low-altitude...

  11. Laboratory comparison of several tests for evaluating the transport properties of concrete.

    DOT National Transportation Integrated Search

    2006-01-01

    The transport properties of concrete are a primary element in determining the durability of concrete. In this study, several new test methods that directly measure aspects of fluid and ionic transport in concrete were examined. ASTM C 1543 and ASTM C...

  12. Fiberoptic characteristics for extreme operating environments

    NASA Technical Reports Server (NTRS)

    Delcher, R. C.

    1992-01-01

    Fiberoptics could offer several major benefits for cryogenic liquid-fueled rocket engines, including lightning immunity, weight reduction, and the possibility of implementing a number of new measurements for engine condition monitoring. The technical feasibility of using fiberoptics in the severe environments posed by cryogenic liquid-fueled rocket engines was determined. The issues of importance and the resulting requirements for this use of fiberoptics were compiled. These included temperature ranges, moisture embrittlement susceptibility, and the ability to withstand extreme shock and vibration levels. Different types of optical fibers were evaluated, and the ability of several fiber types to withstand use in cryogenic liquid-fueled rocket engines was demonstrated through environmental testing of samples. This testing included cold-bend testing, moisture embrittlement testing, temperature cycling, temperature extremes testing, vibration testing, and shock testing. Three of five fiber samples withstood the tests to a level proving feasibility, and two of these remained intact in all six of the tests. A fiberoptic bundle was also tested and completed testing without breakage. Preliminary cabling and harnessing for fiber protection was also demonstrated. According to cable manufacturers, the successful -300 F cold bend, vibration, and shock tests are the first instance of any major fiberoptic cable testing below roughly -55 F. This program has demonstrated the basic technical feasibility of implementing optical fibers on cryogenic liquid-fueled rocket engines, and a development plan is included highlighting requirements and issues for such an implementation.

  13. Relocation of the Cryo-Test Facility to NASA-MSFC

    NASA Technical Reports Server (NTRS)

    Sisco, Jimmy D.; McConnaughey, Paul K. (Technical Monitor)

    2002-01-01

    The Environmental Test Facility (ETF), located at NASA-Marshall Space Flight Center, Huntsville, Alabama, has provided thermal vacuum testing for several major programs since the 1960s. The ETF consists of over 13 thermal vacuum chambers sized and configured to handle the majority of test payloads. Testing is performed around the clock, with multiple tests being conducted simultaneously. Selecting chambers to achieve the best match with test articles, while juggling program schedules, can at times be a challenge. The ETF's Sunspot chamber has had tests scheduled and operated back-to-back for several years and accounts for the majority of schedule conflicts. Future test programs have been identified whose demands surpass the current Sunspot availability. This paper describes a very low cost alternative for reducing schedule conflicts by utilizing government excess equipment.

  14. Occupant kinematics in low-speed frontal sled tests: Human volunteers, Hybrid III ATD, and PMHS.

    PubMed

    Beeman, Stephanie M; Kemper, Andrew R; Madigan, Michael L; Franck, Christopher T; Loftus, Stephen C

    2012-07-01

    A total of 34 dynamic matched frontal sled tests were performed, 17 low (2.5g, Δv=4.8 kph) and 17 medium (5.0g, Δv=9.7 kph), with five male human volunteers of approximately 50th percentile height and weight, a Hybrid III 50th percentile male ATD, and three male PMHS. Each volunteer was exposed to two impulses at each severity, one relaxed and one braced prior to the impulse. A total of four tests were performed at each severity with the ATD, and one trial was performed at each severity with each PMHS. A Vicon motion analysis system with 12 MX-T20 2-megapixel cameras was used to quantify subject 3D kinematics (±1 mm) at 1 kHz. Excursions of select anatomical regions were normalized to their respective initial positions and compared by test condition and between subject types. The forward excursions of the select anatomical regions generally increased with increasing severity. The forward excursions of relaxed human volunteers were significantly larger than those of the ATD for nearly every region at both severities. The forward excursions of the upper body regions of the braced volunteers were generally significantly smaller than those of the ATD at both severities. Forward excursions of the relaxed human volunteers and PMHSs were fairly similar, except for the head CG response at both severities and the right knee and C7 at the medium severity. The forward excursions of the upper body of the PMHSs were generally significantly larger than those of the braced volunteers at both severities. Forward excursions of the PMHSs exceeded those of the ATD for all regions at both severities, with significant differences within the upper body regions. Overall, human volunteers, the ATD, and PMHSs do not have identical biomechanical responses in low-speed frontal sled tests, but all contribute valuable data that can be used to refine and validate computational models and ATDs used to assess injury risk in automotive collisions. Copyright © 2012 Elsevier Ltd. All rights reserved.

  15. Differences in severity at admission for heart failure between rural and urban patients: the value of adding laboratory results to administrative data.

    PubMed

    Smith, Mark W; Owens, Pamela L; Andrews, Roxanne M; Steiner, Claudia A; Coffey, Rosanna M; Skinner, Halcyon G; Miyamura, Jill; Popescu, Ioana

    2016-04-18

    Rural/urban variations in admissions for heart failure may be influenced by severity at hospital presentation and local practice patterns. Laboratory data reflect clinical severity and guide hospital admission decisions and treatment for heart failure, a costly chronic illness and a leading cause of hospitalization among the elderly. Our main objective was to examine the role of laboratory test results in measuring disease severity at the time of admission for inpatients who reside in rural and urban areas. We retrospectively analyzed discharge data on 13,998 hospital discharges for heart failure from three states, Hawai'i, Minnesota, and Virginia. Hospital discharge records from 2008 to 2012 were derived from the State Inpatient Databases of the Healthcare Cost and Utilization Project, and were merged with results of laboratory tests performed on the admission day or up to two days before admission. Regression models evaluated the relationship between clinical severity at admission and patient urban/rural residence. Models were estimated with and without use of laboratory data. Patients residing in rural areas were more likely to have missing laboratory data on admission and less likely to have abnormal or severely abnormal tests. Rural patients were also less likely to be admitted with high levels of severity as measured by the All Patient Refined Diagnosis Related Groups (APR-DRG) severity subclass, derivable from discharge data. Adding laboratory data to discharge data improved model fit. Also, in models without laboratory data, the association between urban compared to rural residence and APR-DRG severity subclass was significant for major and extreme levels of severity (OR 1.22, 95% CI 1.03-1.43 and 1.55, 95% CI 1.26-1.92, respectively). After adding laboratory data, this association became non-significant for major severity and was attenuated for extreme severity (OR 1.12, 95% CI 0.94-1.32 and 1.43, 95% CI 1.15-1.78, respectively). Heart failure patients from rural areas are hospitalized at lower severity levels than their urban counterparts. Laboratory test data provide insight on clinical severity and practice patterns beyond what is available in administrative discharge data.
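
    The model-fit comparison described above, severity models estimated with and without laboratory results, can be sketched as a likelihood ratio test between nested logistic regressions. The variable names and data below are hypothetical and simplified to a binary high-severity outcome.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf
      from scipy.stats import chi2

      rng = np.random.default_rng(3)
      n = 2000
      df = pd.DataFrame({
          "rural": rng.integers(0, 2, n),
          "age": rng.normal(78, 8, n),
          "abnormal_lab": rng.integers(0, 2, n),  # e.g. a severely abnormal admission lab result
      })
      true_logit = -2 + 0.2 * df.rural + 0.02 * (df.age - 78) + 0.8 * df.abnormal_lab
      df["high_severity"] = rng.binomial(1, 1 / (1 + np.exp(-true_logit)))

      base = smf.logit("high_severity ~ rural + age", df).fit(disp=0)
      full = smf.logit("high_severity ~ rural + age + abnormal_lab", df).fit(disp=0)

      # likelihood ratio test: does adding the laboratory variable improve model fit?
      lr = 2 * (full.llf - base.llf)
      p = chi2.sf(lr, df=full.df_model - base.df_model)
      print(f"LR statistic = {lr:.2f}, p = {p:.4f}")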

  16. In vitro eye corrosion study of agrochemicals on isolated chicken eye.

    PubMed

    Buda, I; Budai, P; Szabó, R; Lehel, J

    2013-01-01

    Agrochemicals must undergo numerous toxicological tests before marketing, and the eye irritation test is part of this test package. Nowadays, OECD 405 can be used to classify the irritation potential of substances; the basis of the OECD 405 guideline is the Draize test, one of the most criticized in vivo methods because of the injuries inflicted on the test animals and the subjective nature of recording the results. Therefore, several in vitro tests have been developed to totally or partially replace in vivo eye irritation testing. The isolated chicken eye test method (OECD 438), which was used here, is one of these alternative methods. Five different agrochemicals were examined in the following way: all test compounds were applied in a single dose onto the cornea of isolated chicken eyes in order to potentially classify the test compounds as ocular corrosive and/or severe irritant. The damage caused by the test substances was assessed by determining corneal swelling, opacity, fluorescein retention and morphological effects. All of these endpoints, with the exception of fluorescein retention (which was determined only at pre-treatment and 30 minutes after test substance exposure), were evaluated pre-treatment and at approximately 30, 75, 120, 180, and 240 minutes after the post-treatment rinse. Positive and negative controls were used and showed the expected results in each study. In these in vitro eye corrosion and severe irritation studies, using the isolated chicken eye model with five different products, no ocular corrosion or severe irritation potential was observed. These results correspond to the available information about the tested agrochemicals, so these studies with the isolated chicken eye are considered successful.

  17. Identification of Shifts and Trends in Hydrometric Data in Canada Based on Several Detection Tests

    NASA Astrophysics Data System (ADS)

    Lauzon, N.; Lence, B. J.

    2004-05-01

    This work proposes new detection tests based on the Kohonen neural network and on fuzzy c-means for the identification of shifts and trends in data sequences. Annual mean and maximum flow sequences are considered as the application case, as they have often been used for the study of shifts and trends in hydrologic data. In recent years, several studies of trend identification have been carried out with North American hydrometric data, often making use of only one detection test. The assumption here is that one cannot rely on a single test, and consequently several are employed in this work. A total of eight tests are considered, four for shifts and four for trends. Four of these tests, two for shifts and two for trends, are conventional statistical tests that are regularly employed, while the other four are developed from the Kohonen neural network and fuzzy c-means. Data from a group of 40 hydrometric stations across Canada are assessed for the detection of shifts and trends over time periods of 30, 40 and 50 years. While the results obtained confirm the conclusions of previous studies performed on similar groups of data, they also indicate that the tests may behave differently from one another: one test may detect a trend in a given sequence while the other tests do not, or vice versa. Thus, the strategy of using several tests ensures not only that they may confirm each other's diagnostics but also that they may complement each other in the case of divergent diagnostics, with the possibility of improving the final conclusion on the detection of shifts and trends. Using artificial intelligence techniques to construct detection tests also constitutes a departure from purely statistical approaches, and a discussion in this work of complementary studies (e.g., detection in multivariate cases) highlights the possibility of enhanced performance by the artificial intelligence-based tests compared with conventional detection tests.
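
    The abstract does not name the two conventional trend tests used; as a hedged illustration of the kind of conventional test commonly applied to annual flow sequences, a minimal Mann-Kendall trend test (a frequent non-parametric choice for hydrometric data, not necessarily the one used in this study) could be sketched as:

    ```python
    # Minimal Mann-Kendall trend test for an annual flow sequence.
    # Illustrative only: the study does not specify its conventional tests.
    import math

    def mann_kendall(x):
        """Return the Mann-Kendall S statistic, normal-approximation Z, and
        a two-sided p-value for a monotonic trend in sequence x."""
        n = len(x)
        s = 0
        for i in range(n - 1):
            for j in range(i + 1, n):
                s += (x[j] > x[i]) - (x[j] < x[i])  # sign of each pairwise difference
        var_s = n * (n - 1) * (2 * n + 5) / 18.0    # variance of S, ties ignored for simplicity
        if s > 0:
            z = (s - 1) / math.sqrt(var_s)
        elif s < 0:
            z = (s + 1) / math.sqrt(var_s)
        else:
            z = 0.0
        p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))  # two-sided p-value
        return s, z, p

    # Example: 30 years of annual mean flows (hypothetical values).
    flows = [412, 398, 425, 430, 401, 440, 455, 438, 460, 472,
             450, 468, 480, 475, 490, 485, 500, 495, 510, 505,
             498, 515, 520, 512, 530, 525, 540, 535, 545, 550]
    print(mann_kendall(flows))
    ```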

  18. Preseason Functional Movement Screen Component Tests Predict Severe Contact Injuries in Professional Rugby Union Players.

    PubMed

    Tee, Jason C; Klingbiel, Jannie F G; Collins, Robert; Lambert, Mike I; Coopoo, Yoga

    2016-11-01

    Tee, JC, Klingbiel, JFG, Collins, R, Lambert, MI, and Coopoo, Y. Preseason Functional Movement Screen component tests predict severe contact injuries in professional rugby union players. J Strength Cond Res 30(11): 3194-3203, 2016-Rugby union is a collision sport with a relatively high risk of injury. The ability of the Functional Movement Screen (FMS) or its component tests to predict the occurrence of severe (≥28 days) injuries in professional players was assessed. Ninety FMS test observations from 62 players across 4 different time periods were compared with severe injuries sustained during the 6 months after FMS testing. Mean composite FMS scores were significantly lower in players who sustained severe injury (injured 13.2 ± 1.5 vs. noninjured 14.5 ± 1.4, effect size = 0.83, large) because of differences in in-line lunge (ILL) and active straight leg raise (ASLR) scores. Receiver operating characteristic (ROC) curves and 2 × 2 contingency tables were used to determine that ASLR (cut-off 2/3) was the injury predictor with the greatest sensitivity (0.96, 95% confidence interval [CI] = 0.79-1.0). Adding the ILL in combination with ASLR (ILL + ASLR) improved the specificity of the injury prediction model (ASLR specificity = 0.29, 95% CI = 0.18-0.43 vs. ASLR + ILL specificity = 0.53, 95% CI = 0.39-0.66, p ≤ 0.05). Further analysis was performed to determine whether FMS tests could predict contact and noncontact injuries. The FMS composite score and various combinations of component tests (deep squat [DS] + ILL, ILL + ASLR, and DS + ILL + ASLR) were all significant predictors of contact injury. The FMS composite score also predicted noncontact injury, but no component test or combination thereof produced a similar result. These findings indicate that low scores on various FMS component tests are risk factors for injury in professional rugby players.
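
    The sensitivity and specificity figures above come from 2 × 2 contingency tables built at a candidate cut-off; a minimal sketch of that arithmetic, with hypothetical counts (the raw tables are not reported in the abstract) and a Wilson score interval standing in for whatever CI method the authors used, is:

    ```python
    import math

    def sens_spec(tp, fn, tn, fp):
        """Sensitivity and specificity from a 2x2 contingency table."""
        return tp / (tp + fn), tn / (tn + fp)

    def wilson_ci(k, n, z=1.96):
        """Wilson score 95% interval for a proportion k/n."""
        p = k / n
        denom = 1 + z**2 / n
        centre = (p + z**2 / (2 * n)) / denom
        half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
        return centre - half, centre + half

    # Hypothetical 2x2 table for a screening cut-off (not the study's raw counts).
    tp, fn, tn, fp = 24, 1, 35, 30
    sens, spec = sens_spec(tp, fn, tn, fp)
    print(f"sensitivity = {sens:.2f}, 95% CI = {wilson_ci(tp, tp + fn)}")
    print(f"specificity = {spec:.2f}, 95% CI = {wilson_ci(tn, tn + fp)}")
    ```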

  19. Comparison of skin prick tests with specific serum immunoglobulin E in the diagnosis of fungal sensitization in patients with severe asthma.

    PubMed

    O'Driscoll, B R; Powell, G; Chew, F; Niven, R M; Miles, J F; Vyas, A; Denning, D W

    2009-11-01

    It has been shown that patients with allergic bronchopulmonary aspergillosis (ABPA) and patients with severe asthma with fungal sensitization (SAFS) can benefit from antifungal therapy. It is not known whether allergy skin prick tests (SPT) or specific IgE tests are more sensitive in the identification of patients who are sensitized to fungi and who are therefore candidates for antifungal therapy. To compare SPT and specific serum IgE tests for fungal sensitization in patients with severe asthma. We have undertaken SPT and specific serum IgE tests to six fungi (Aspergillus fumigatus, Candida albicans, Penicillium notatum, Cladosporium herbarum, Alternaria alternata and Botrytis cinerea) and specific serum IgE test for Trichophyton in 121 patients with severe asthma (British Thoracic Society/SIGN steps 4 and 5). Sixty-six percent of patients were sensitized to one or more fungi based on SPT and/or specific serum IgE results. Positivity to SPT and/or specific serum IgE was as follows: A. fumigatus 45%, C. albicans 36%, P. notatum 29%, C. herbarum 24%, A. alternata 22%, B. cinerea 18%, Trichophyton 17% (specific serum IgE only). Concordance between the tests was 77% overall but only 14-56% for individual fungi. Twenty-nine (24%) patients were sensitized to a single fungus and seven (6%) were sensitized to all seven fungal species. Fifty percent of patients were sensitized to fungal and non-fungal extracts, 21% were sensitized only to non-fungal extracts, 16% were sensitized only to fungal extracts and 13% had no positive tests. This study is consistent with previous reports that fungal sensitization is common in patients with severe asthma. At present, it remains necessary to undertake both SPT and specific serum IgE testing to identify all cases of fungal sensitization. This may be important in the identification of patients with ABPA and SAFS who may benefit from antifungal therapy.

  20. Seeing the Invisible: Embedding Tests in Code That Cannot be Modified

    NASA Technical Reports Server (NTRS)

    O'Malley, Owen; Mansouri-Samani, Masoud; Mehlitz, Peter; Penix, John

    2005-01-01

    Characterizing and observing valid software behavior during testing can be very difficult in flight systems. To address this issue, we evaluated several approaches to increasing test observability on the Shuttle Abort Flight Management (SAFM) system. To increase test observability, we added probes into the running system to evaluate the internal state and analyze test data. To minimize the impact of the instrumentation and reduce manual effort, we used Aspect-Oriented Programming (AOP) tools to instrument the source code. We developed and elicited a spectrum of properties, from generic to application-specific, to be monitored via the instrumentation. To evaluate additional approaches, SAFM was ported to Linux, enabling the use of gcov for measuring test coverage, Valgrind for detecting memory usage errors, and libraries for finding non-normal floating point values. An in-house C++ source code scanning tool was also used to identify violations of SAFM coding standards and other potentially problematic C++ constructs. Using these approaches with the existing test data sets, we were able to verify several important properties, confirm several problems, and identify some previously unidentified issues.

  1. Predicting Success in a Graduate Psychology Program

    ERIC Educational Resources Information Center

    Kordinak, S. Thomas; Kercher, Melanie; Harman, Marsha J.; Bruce, A. Jerry

    2009-01-01

    The Graduate Record Examination (GRE) General Tests and GRE advanced Psychology (PSYGRE) Test were correlated with several measures of success in our graduate program at Sam Houston State University including some specific courses. Significant correlations were obtained for several of these measures, but the PSYGRE provided incremental validity…

  2. Acro-osteolysis as an indicator of severity in systemic sclerosis.

    PubMed

    Arana-Ruiz, Juan Carlos; Amezcua-Guerra, Luis Manuel

    2016-01-01

    Systemic sclerosis is a rare disease that predominantly affects women. The Medsger severity scale has been used to assess severity, but it requires expensive and poorly accessible studies and does not include complications such as acro-osteolysis, calcinosis, pericardial disease or hypothyroidism, which occur relatively frequently in this disease. No study has considered whether comorbidities, such as primary biliary cirrhosis, are related to severity. The objective was to determine the correlation between severity and the presence of such complications. Forty patients with systemic sclerosis were studied, divided into tertiles according to severity. Dichotomous variables were described using percentages, and dimensional variables as means ± SD. Statistical inference was performed using the chi-square test or the Kruskal-Wallis test with Dunn post-test, as appropriate. Significance was set at P<.05. Of all the complications studied, only acro-osteolysis differed by severity. Among comorbidities, primary biliary cirrhosis was not associated with severity. Copyright © 2015 Elsevier España, S.L.U. and Sociedad Española de Reumatología y Colegio Mexicano de Reumatología. All rights reserved.

  3. Physiological Requirements to Perform the Glittre Activities of Daily Living Test by Subjects With Mild-to-Severe COPD.

    PubMed

    Souza, Gérson F; Moreira, Graciane L; Tufanin, Andréa; Gazzotti, Mariana R; Castro, Antonio A; Jardim, José R; Nascimento, Oliver A

    2017-08-01

    The Glittre activities of daily living (ADL) test is intended to evaluate the functional capacity of COPD patients. The physiological requirements of the test and the time taken to perform it by COPD patients in different disease stages are not well known. The objective of this work was to compare the metabolic, ventilatory, and cardiac requirements and the time taken to carry out the Glittre ADL test by COPD subjects with mild, moderate, and severe disease. Spirometry, the Medical Research Council questionnaire, a cardiopulmonary exercise test, and 2 Glittre ADL tests were evaluated in 62 COPD subjects. Oxygen uptake (V̇O2), carbon dioxide production, pulmonary ventilation, breathing frequency, heart rate, SpO2, and dyspnea were analyzed before and at the end of the tests. Maximum voluntary ventilation, Glittre peak V̇O2/cardiopulmonary exercise test (CPET) peak V̇O2, Glittre V̇E/maximum voluntary ventilation, and Glittre peak heart rate/CPET peak heart rate ratios were calculated to analyze their reserves. Subjects carried out the Glittre ADL test with similar absolute metabolic, ventilatory, and cardiac requirements. Ventilatory reserve decreased progressively from mild to severe COPD subjects (P < .001 for Global Initiative for Chronic Obstructive Lung Disease [GOLD] 1 vs GOLD 2, P < .001 for GOLD 1 vs GOLD 3, and P < .001 for GOLD 2 vs GOLD 3). Severe COPD subjects presented a significantly lower metabolic reserve than the mild and moderate subjects (P = .006 and P = .043, respectively) and a significantly lower Glittre peak heart rate/CPET peak heart rate ratio than mild subjects (P = .01). Time taken to carry out the Glittre ADL test was similar among the groups (P = .82 for GOLD 1 vs GOLD 2, P = .19 for GOLD 1 vs GOLD 3, and P = .45 for GOLD 2 vs GOLD 3). As the degree of air-flow obstruction progresses, COPD subjects have a significantly lower ventilatory reserve for performing the Glittre ADL test. In addition, metabolic and cardiac reserves may differentiate the severe subjects. These variables may be better measures for differentiating functional performance than Glittre ADL time. Copyright © 2017 by Daedalus Enterprises.

  4. Changes in serial laboratory test results in snakebite patients: when can we safely exclude envenoming?

    PubMed

    Ireland, Graham; Brown, Simon G A; Buckley, Nicholas A; Stormer, Jeff; Currie, Bart J; White, Julian; Spain, David; Isbister, Geoffrey K

    2010-09-06

    To determine which laboratory tests are first associated with severe envenoming after a snakebite, when (ie, how long after the bite) the test results become abnormal, and whether this can determine a safe observation period after suspected snakebite. Prospective cohort study of 478 patients with suspected or confirmed snakebite recruited to the Australian Snakebite Project from January 2002 to April 2009, who had at least three sets of laboratory test results and at least 12 hours of observation in hospital after the bite. Severe envenoming was defined as venom-induced consumption coagulopathy (VICC), myotoxicity, neurotoxicity or thrombotic microangiopathy. International normalised ratio (INR), activated partial thromboplastin time (aPTT), creatine kinase (CK) level, and neurological examination. There were 240 patients with severe envenoming, 75 with minor envenoming and 163 non-envenomed patients. Of 206 patients with VICC, 178 had an INR > 1.2 (abnormal) on admission, and the remaining 28 had an INR > 1.2 within 12 hours of the bite. Of 33 patients with myotoxicity, a combination of CK > 250 U/L and an abnormal aPTT identified all but two cases by 12 hours; one of these two was identified within 12 hours by leukocytosis. Nine cases of isolated neurotoxicity had a median time of onset after the bite of 4 hours (range, 35 min - 12 h). The combination of serial INR, aPTT and CK tests and repeated neurological examination identified 213 of 222 severe envenoming cases (96%) by 6 hours and 238 of 240 (99%) by 12 hours. Laboratory parameters (INR, aPTT and CK) and neurological reassessments identified nearly all severe envenoming cases within 12 hours of the bite, even in this conservative analysis that assumed normal test results if the test was not done.
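
    As a rough sketch of the screening rule implied by the reported thresholds (INR > 1.2, CK > 250 U/L combined with an abnormal aPTT, neurological signs, all within 12 hours of the bite), the logic could be encoded as follows; the aPTT reference range and the full clinical algorithm are not given in the abstract, so the abnormal-aPTT flag is simply passed in:

    ```python
    from dataclasses import dataclass

    @dataclass
    class SerialResult:
        hours_post_bite: float
        inr: float
        aptt_abnormal: bool       # lab-specific reference range, not given in the abstract
        ck_u_per_l: float
        neuro_signs: bool         # e.g. ptosis or other signs on neurological examination

    def envenoming_suspected(results: list[SerialResult]) -> bool:
        """Flag possible severe envenoming from serial results within 12 h of the bite,
        using the thresholds reported in the abstract (INR > 1.2, CK > 250 U/L with an
        abnormal aPTT, or neurological signs). Illustrative only."""
        for r in results:
            if r.hours_post_bite > 12:
                continue
            if r.inr > 1.2:
                return True                      # consistent with VICC
            if r.ck_u_per_l > 250 and r.aptt_abnormal:
                return True                      # combination flagged for myotoxicity
            if r.neuro_signs:
                return True                      # isolated neurotoxicity
        return False

    # Example: a patient whose 6-hour bloods become abnormal.
    obs = [SerialResult(1.0, 1.0, False, 90, False),
           SerialResult(6.0, 1.6, True, 120, False)]
    print(envenoming_suspected(obs))
    ```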

  5. Assessment by airway ellipticity on cine-MRI to differentiate severe obstructive sleep apnea.

    PubMed

    Kojima, Tsukasa; Kawakubo, Masateru; Nishizaka, Mari K; Rahmawati, Anita; Ando, Shin-Ichi; Chishaki, Akiko; Nakamura, Yasuhiko; Nagao, Michinobu

    2018-03-01

    The severity of obstructive sleep apnea (OSA) is assessed by the apnea-hypopnea index (AHI) determined from polysomnography (PSG). However, PSG requires a specialized facility with well-trained specialists and takes an entire night. Simple tools that can identify severe OSA are therefore needed before PSG is performed. We propose a new index using cine-MRI as a screening test to identify severe OSA patients, who would need PSG and proper treatment. Thirty-six patients with suspected OSA (mean age 54.6 y, mean AHI 52.6 events/h, 33 males) underwent airway cine-MRI at the fourth cervical vertebra level during 30 s of free breathing, as well as PSG. The minimum airway ellipticity (AE) over the 30 s acquisition was measured and used as the index of OSA severity. Patients were divided into severe OSA, not-severe OSA, and normal groups according to PSG results. AE was compared between each pair of the three groups with the Wilcoxon rank-sum test. Receiver operating characteristic (ROC) curve analysis was performed to determine the optimal AE cut-off for identifying severe OSA patients. The minimum AE for severe OSA was significantly lower than that for not-severe OSA and normal (severe, 0.17 ± 0.16; not severe, 0.31 ± 0.17; normal, 0.38 ± 0.19, P < .05). ROC analysis revealed that an optimal minimum-AE cutoff of 0.21 identified severe OSA patients with an area under the curve of 0.75, 68% sensitivity, and 83% specificity. AE is a feasible quantitative index and a promising screening test for detecting severe OSA patients. © 2016 John Wiley & Sons Ltd.

  6. Correlation of MRI Visual Scales with Neuropsychological Profile in Mild Cognitive Impairment of Parkinson's Disease.

    PubMed

    Vasconcellos, Luiz Felipe; Pereira, João Santos; Adachi, Marcelo; Greca, Denise; Cruz, Manuela; Malak, Ana Lara; Charchat-Fichman, Helenice; Spitz, Mariana

    2017-01-01

    Few studies have evaluated magnetic resonance imaging (MRI) visual scales in Parkinson's disease-Mild Cognitive Impairment (PD-MCI). We selected 79 PD patients and 92 controls (CO) to perform neurologic and neuropsychological evaluation. Brain MRI was performed to evaluate the following scales: Global Cortical Atrophy (GCA), Fazekas, and medial temporal atrophy (MTA). The analysis revealed that both PD groups (amnestic and nonamnestic) showed worse performance on several tests when compared to CO. Memory, executive function, and attention impairment were more severe in amnestic PD-MCI group. Overall analysis of frequency of MRI visual scales by MCI subtype did not reveal any statistically significant result. Statistically significant inverse correlation was observed between GCA scale and Mini-Mental Status Examination (MMSE), Montreal Cognitive Assessment (MoCA), semantic verbal fluency, Stroop test, figure memory test, trail making test (TMT) B, and Rey Auditory Verbal Learning Test (RAVLT). The MTA scale correlated with Stroop test and Fazekas scale with figure memory test, digit span, and Stroop test according to the subgroup evaluated. Visual scales by MRI in MCI should be evaluated by cognitive domain and might be more useful in more severely impaired MCI or dementia patients.

  7. Thoracic Injury Risk as a Function of Crash Severity – Car-to-car Side Impact Tests with WorldSID Compared to Real-life Crashes

    PubMed Central

    Sunnevång, Cecilia; Rosén, Erik; Boström, Ola; Lechelt, Ulf

    2010-01-01

    Side airbags reduce the risk of fatal injury by approximately 30%. Due to limited real-life data, the risk-reducing effect for serious injury has not yet been established. Since side airbags are mainly designed and validated for the crash severities used in available test procedures, little is known about the protective effect when severity increases. The objective of this study was to understand for which crash severities AIS3+ thorax occupant protection in car-to-car nearside collisions needs to and can be improved. The aim was fulfilled by means of real-life data, for older cars without side airbags, and a series of car-to-car tests performed with the WorldSID 50th-percentile dummy in modern and older cars at different impact speeds. The real-life data showed that the risk of AIS3+ injury was highest for the thorax, followed by the pelvis and head. For both non-senior and senior occupants, most thorax injuries were sustained at lateral delta-v from 20 km/h to 40 km/h. In this severity range, senior occupants were found to have approximately four times higher risk of thoracic injury than non-senior occupants. The crash tests at a lateral impact speed of 55 km/h (delta-v 32 km/h) confirmed the improved performance at severities represented in current legal and rating tests. The structural integrity of the modern car impacted at 70 km/h showed a potential for improved side impact protection by interior countermeasures. PMID:21050600

  9. Permanent make-up colorants may cause severe skin reactions.

    PubMed

    Wenzel, Sabrina M; Welzel, Julia; Hafner, Christian; Landthaler, Michael; Bäumler, Wolfgang

    2010-10-01

    In recent years, cosmetic tattoos [permanent make-up (PMU)] on eyelids, eyebrows and lips have become increasingly popular. However, most colorants are manufactured for non-medical purposes, without any established history of safe use in humans. To investigate severe adverse reactions, such as swelling, burning, and the development of papules, of the lips and the surrounding area in 4 patients who had had at least two PMU procedures on their lips. Adverse skin reactions were examined with patch and prick testing of the colorants. In addition, skin biopsies were taken in the centre of the prick test for histology. One patient declined prick testing. Beauticians tended to use various PMU products, but all contained Pigment Red 181 (CI 73360). All patients tested showed a clear delayed reaction to Pigment Red 181 or the tattoo ink, or both, after prick testing. Histology indicated an allergic reaction. Each lip lesion slowly abated after several months of topical or systemic therapy with steroids in combination with tacrolimus, but none has yet completely resolved. In light of the severe and often therapy-resistant skin reactions, we strongly recommend the regulation and control of the substances used in PMU colorants. © 2010 John Wiley & Sons A/S.

  10. Inlet flow test calibration for a small axial compressor rig. Part 2: CFD compared with experimental results

    NASA Technical Reports Server (NTRS)

    Miller, D. P.; Prahst, P. S.

    1995-01-01

    An axial compressor test rig has been designed for the operation of small turbomachines. A flow test was run to calibrate and determine the source and magnitudes of the loss mechanisms in the compressor inlet for a highly loaded two-stage axial compressor test. Several flow conditions and inlet guide vane (IGV) angle settings were established, for which detailed surveys were completed. Boundary layer bleed was also provided along the casing of the inlet behind the support struts and ahead of the IGV. Several computational fluid dynamics (CFD) calculations were made for selected flow conditions established during the test. Good agreement between the CFD results and the test data was obtained for these test conditions.

  11. Tests for H. pylori

    MedlinePlus

    Peptic ulcer disease - H. pylori ; PUD - H. pylori ... There are several methods to test for H. pylori infection. Breath Test (Carbon Isotope-urea Breath Test, or UBT) Up to 2 weeks before the test, you need to stop taking ...

  12. Proctalgia fugax.

    PubMed

    Babb, Richard R

    1996-04-01

    Preview Severe, episodic pain in the rectum may prompt extensive (and expensive) diagnostic testing. If the cause is proctalgia fugax, such testing is unnecessary and wasteful. The author, a gastroenterologist, offers a guide to prompt recognition of the disorder based on patient history and suggests several therapeutic strategies that may help to relieve the pain.

  13. The effects of rater bias and assessment method used to estimate disease severity on hypothesis testing

    USDA-ARS?s Scientific Manuscript database

    The effects of bias (over- and underestimates) in estimates of disease severity on hypothesis testing using different assessment methods were explored. Nearest percent estimates (NPE), the Horsfall-Barratt (H-B) scale, and two different linear category scales (10% increments, with and without addition...

  14. Third phase of pocket-sized electronic dosimeter testing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fox, R.A.; Hooker, C.D.; Hogan, B.T.

    1982-05-01

    The experiences of industrial radiographers have indicated that electronic radiation-warning devices become inoperative when they are used under some types of ambient conditions. This report, as a follow-up to NUREG/CR-0554 and NUREG/CR-1452, documents the nature of tests performed on several additional commercially available models. None of the four models tested passed the tests for ruggedness and severe environmental conditions. However, all models passed most of the requirements of a Health Physics Society draft standard of performance specifications for these devices. The test procedures used in the project and the results obtained are discussed. Conclusions from the tests and recommendations concerning potentially useful modifications to existing devices are presented.

  15. The Electrophysiologic Mechanisms of Halogenated Alkane Arrhythmogenesis.

    DTIC Science & Technology

    1983-03-01

    recording of normal parameters (Fig. 19), the animal was tested with several concentrations of isoproterenol to determine an arrhythmogenic dose... agent to control these seizures, several compounds were tested. Valium at a dose of 10 mg per animal or 20 mg per animal of Rompun did not induce the... animals tested in the following experiments were pretreated with this dose (15 mg/kg) of phenobarbital. Table 16 summarizes the results of 10 dogs breathing

  16. Severity of Illness and the Teaching Hospital.

    ERIC Educational Resources Information Center

    Berman, Richard A.; And Others

    1986-01-01

    The Medicare prospective payment system does not adequately account for severity of illness. Whether teaching hospitals treat a case mix of patients with more severe illness than do nonteaching hospitals was tested in a study using two severity measures, Horn's severity of illness index and Gonnella's "disease staging." (Author/MLW)

  17. Burn Severity Based Stream Buffers for Post Wildfire Salvage Logging Erosion

    NASA Astrophysics Data System (ADS)

    Bone, E. D.; Robichaud, P. R.; Brooks, E. S.; Brown, R. E.

    2017-12-01

    Riparian buffers may be managed for timber harvest disturbances to decrease the risk of hillslope erosion entering stream channels during runoff events. After a wildfire, burned riparian buffers may become less efficient at infiltrating runoff and reducing sedimentation, requiring wider dimensions. Testing riparian buffers under post-wildfire conditions may provide managers guidance on how to manage post-fire salvage logging operations on hillslopes and protect water quality in adjacent streams. We tested burned, unlogged hillslopes at the 2015 North Star Fire and 2016 Cayuse Mountain Fire locations in Washington, USA for their ability to reduce runoff flows and sedimentation. Our objectives were to: 1) measure the travel distances of concentrated flows using three sediment-laden flow rates, 2) measure the change in sediment concentration as each flow moves downslope, 3) test hillslopes under high burn-severity, low burn-severity and unburned conditions, and 4) conduct experiments at 0, 1 and 2 years since the fire events. Mean total flow length at the North Star Fire in year 1 was 211% greater at low burn-severity sites than unburned sites, and 467% greater at high burn-severity sites than unburned sites. Results decreased for all burned sites in year 2; by 40% at the high burn-severity sites, and by 30% at the low burn-severity sites, with no significant changes at the unburned sites. We tested only high burn-severity sites at the Cayuse Mountain Fire in year 0 and 1 where the mean total flow length between year 0 and year 1 decreased by 65%. The results of sediment concentration changes tracked closely with the magnitude of changes in flow travel lengths between treatments. Results indicate that managers may need to increase the widths of burned stream buffers during post-wildfire salvage logging for water quality protection, but stream buffer widths may decrease with less severe burn severity and increasing elapsed time (years) since fire.

  18. Development of a severe local storm prediction system: A 60-day test of a mesoscale primitive equation model

    NASA Technical Reports Server (NTRS)

    Paine, D. A.; Zack, J. W.; Kaplan, M. L.

    1979-01-01

    The progress and problems associated with the dynamical forecast system which was developed to predict severe storms are examined. The meteorological problem of severe convective storm forecasting is reviewed. The cascade hypothesis which forms the theoretical core of the nested grid dynamical numerical modelling system is described. The dynamical and numerical structure of the model used during the 1978 test period is presented and a preliminary description of a proposed multigrid system for future experiments and tests is provided. Six cases from the spring of 1978 are discussed to illustrate the model's performance and its problems. Potential solutions to the problems are examined.

  19. Clinical Objective Dry Eye Tests in a Population of Tannery Workers in North India.

    PubMed

    Ranjan, Ratnesh; Kushwaha, Raj Nath; Khan, Perwez; Mohan, Shalini; Gupta, Ramesh Chandra

    2016-10-01

    To analyze the correlation between subjective symptoms and clinical signs of dry eye among tannery workers. In this cross-sectional study, three classic clinical tests, namely the fluorescein tear film break-up time (FTBUT) test, the fluorescein staining (FS) test, and the Schirmer test (ST), were performed to assess the clinical signs of dry eye disease in 246 tanners who were found symptomatic for dry eye in a prior ocular surface disease index survey. All workers were male, with a mean age of 35 ± 9 years, and the mean duration of work at tanneries was 8 ± 5 years. Among the 246 symptomatic subjects, the FTBUT test, the FS test and the ST were positive in 63.8%, 30.9% and 41.9% of workers, respectively. Mean FTBUT and ST scores were 10.6 ± 4.2 seconds and 10.1 ± 7.7 mm, respectively. Mean FTBUT for the mild, moderate and severe symptom categories differed significantly. Mean ST scores for the mild symptom group were significantly higher than those of the moderate group (p < 0.0001). The FTBUT and ST scores showed a strong negative correlation with severity of symptoms (p < 0.0001). A moderate positive correlation was observed between FS positivity and increasing symptom severity (p < 0.0001). The effect of age was insignificant for FTBUT (p = 0.10), while significant for ST score (p < 0.001). The effect of duration of tannery work was significant for both FTBUT and ST scores (p < 0.0001). Clinical tests correlated well with symptom severity among tanners, and a multifactorial etiology is suggested for dry eye disease.

  20. Severe Loss of Tritan Color Discrimination in RPE65 Associated Leber Congenital Amaurosis

    PubMed Central

    Kumaran, Neruban; Ripamonti, Caterina; Kalitzeos, Angelos; Rubin, Gary S.; Bainbridge, James W. B.

    2018-01-01

    Purpose RPE65-associated Leber congenital amaurosis (RPE65-LCA) is a progressive severe retinal dystrophy with early profound dysfunction of rod photoreceptors followed by progressive cone photoreceptor degeneration. We aim to provide detailed information about how cone dysfunction affects color discrimination. Methods Seven adults (aged 16–21) with RPE65-LCA underwent monocular color discrimination assessment using the Trivector and Ellipse versions of three computerized tests: Cambridge Colour Test (CCT), low vision version of the Cambridge Colour Test (lvvCCT), and the Universal Colour Discrimination Test (UCDT). For comparison, subjects were also tested using the American Optical Hardy Rand Rittler (AO-HRR) plates. Each assessment was repeated three times. Results The Trivector version of the tests demonstrated that color discrimination along the tritan axis was undetectable in four subjects, and severely reduced in three subjects. These findings were confirmed by the Ellipse version of the tests. Color discrimination along the protan and deutan axes was evident but reduced in six of seven subjects. Four of seven subjects were unable to read any of the HRR plates. Conclusions The computerized color vision tests adopted in this study provide detailed information about color discrimination in adult RPE65-LCA patients. The condition is associated with severe impairment of color discrimination, particularly along the tritan axis indicating possible early involvement of S-cones, with additional protan and deutan loss to a lesser extent. This psychophysical assessment strategy is likely to be valuable in measuring the impact of therapeutic intervention on cone function. PMID:29332120

  1. Persistent grief in the aftermath of mass violence: the predictive roles of posttraumatic stress symptoms, self-efficacy, and disrupted worldview.

    PubMed

    Smith, Andrew J; Abeyta, Andrew A; Hughes, Michael; Jones, Russell T

    2015-03-01

    This study tested a conceptual model merging anxiety buffer disruption and social-cognitive theories to predict persistent grief severity among students who lost a close friend, significant other, and/or professor/teacher in tragic university campus shootings. A regression-based path model tested posttraumatic stress (PTS) symptom severity 3 to 4 months postshooting (Time 1) as a predictor of grief severity 1 year postshootings (Time 2), both directly and indirectly through cognitive processes (self-efficacy and disrupted worldview). Results revealed a model that predicted 61% of the variance in Time 2 grief severity. Hypotheses were supported, demonstrating that Time 1 PTS severity indirectly, positively predicted Time 2 grief severity through undermining self-efficacy and more severely disrupting worldview. Findings and theoretical interpretation yield important insights for future research and clinical application. (PsycINFO Database Record (c) 2015 APA, all rights reserved).
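
    The indirect-effect logic of the path model (PTS severity predicting later grief through self-efficacy) can be illustrated with a toy product-of-coefficients calculation on simulated data; the coefficients and the plain-OLS estimator below are hypothetical and are not the study's actual model:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 300
    # Simulated data loosely mirroring the design:
    # PTS at Time 1 lowers self-efficacy, which in turn raises later grief.
    pts_t1 = rng.normal(size=n)
    self_efficacy = -0.5 * pts_t1 + rng.normal(scale=0.8, size=n)
    grief_t2 = 0.3 * pts_t1 - 0.6 * self_efficacy + rng.normal(scale=0.7, size=n)

    def ols(y, X):
        """Return OLS coefficients for y ~ [intercept, X]."""
        X = np.column_stack([np.ones(len(y)), X])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        return beta

    a = ols(self_efficacy, pts_t1)[1]                                 # path a: PTS -> mediator
    b = ols(grief_t2, np.column_stack([self_efficacy, pts_t1]))[1]    # path b, controlling for PTS
    indirect = a * b                                                  # product-of-coefficients indirect effect
    print(f"a = {a:.2f}, b = {b:.2f}, indirect effect = {indirect:.2f}")
    ```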

  2. Standard tests for toughened resin composites, revised edition

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Several toughened resin systems are evaluated to achieve commonality for certain kinds of tests used to characterize toughened resin composites. Specifications for five tests were standardized; these test standards are described.

  3. Predictive validities of several clinical color vision tests for aviation signal light gun performance.

    DOT National Transportation Integrated Search

    1975-01-01

    Scores on the American Optical Company (AOC) test (1965 edition), Dvorine test, Farnsworth Lantern test, Color Threshold Tester, Farnsworth-Munsell 100-Hue test, Farnsworth Panel D-15 test, and Schmidt-Haensch Anomaloscope were obtained from 137 men ...

  4. Appropriate Implementation of Severity Ratings, Regulations, and State Guidance: A Response to "Using Norm-Referenced Tests to Determine Severity of Language Impairment in Children: Disconnect between U.S. Policy Makers and Test Developers" by Spaulding, Szulga, & Figueria (2012)

    ERIC Educational Resources Information Center

    Ireland, Marie; Hall-Mills, Shannon; Millikin, Cindy

    2013-01-01

    In this response to Spaulding et al.'s examination of state education agency (SEA) guidance on severity ratings, these authors contend that Spaulding et al. provided an incomplete view of current practices in public schools. These authors state that, ultimately, school speech-language pathologists (SLPs) must follow all state regulations and local…

  5. Testing for the Presence of Correlation Changes in a Multivariate Time Series: A Permutation Based Approach.

    PubMed

    Cabrieto, Jedelyn; Tuerlinckx, Francis; Kuppens, Peter; Hunyadi, Borbála; Ceulemans, Eva

    2018-01-15

    Detecting abrupt correlation changes in multivariate time series is crucial in many application fields such as signal processing, functional neuroimaging, climate studies, and financial analysis. To detect such changes, several promising correlation change tests exist, but they may suffer from severe loss of power when there is actually more than one change point underlying the data. To deal with this drawback, we propose a permutation-based significance test for Kernel Change Point (KCP) detection on the running correlations. Given a requested number of change points K, KCP divides the time series into K + 1 phases by minimizing the within-phase variance. The new permutation test looks at how the average within-phase variance decreases when K increases and compares this to the results for permuted data. The results of an extensive simulation study and applications to several real data sets show that, depending on the setting, the new test performs on par with or better than state-of-the-art significance tests for detecting the presence of correlation changes, implying that its use can be generally recommended.
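
    A much-simplified sketch of the idea (a single change point, plain Euclidean within-phase variance rather than a kernel, and a permutation test on the variance drop) is given below; the authors' actual KCP procedure and test statistics differ in detail:

    ```python
    import numpy as np

    def min_within_phase_variance(x):
        """Brute-force search over one change point: return the split of series x
        (n x d) that minimizes the summed within-phase sum of squared deviations."""
        n = len(x)
        best = (np.inf, None)
        for cp in range(2, n - 1):                      # require >= 2 points per phase
            v = x[:cp].var(axis=0).sum() * cp + x[cp:].var(axis=0).sum() * (n - cp)
            if v < best[0]:
                best = (v, cp)
        return best

    def permutation_pvalue(x, n_perm=999, seed=0):
        """Compare the observed variance drop (no split vs best split) against the
        same statistic computed on permuted (shuffled) time orderings."""
        rng = np.random.default_rng(seed)
        n = len(x)
        total = x.var(axis=0).sum() * n
        obs_drop = total - min_within_phase_variance(x)[0]
        count = 0
        for _ in range(n_perm):
            xp = x[rng.permutation(n)]
            count += (total - min_within_phase_variance(xp)[0]) >= obs_drop
        return (count + 1) / (n_perm + 1)

    # Hypothetical running correlations with a level change halfway through.
    rng = np.random.default_rng(1)
    x = np.concatenate([rng.normal(0.2, 0.1, size=(60, 3)),
                        rng.normal(0.7, 0.1, size=(60, 3))])
    print(min_within_phase_variance(x), permutation_pvalue(x))
    ```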

  6. Multiple-Choice Tests with Correction Allowed in Autism: An Excel Applet

    ERIC Educational Resources Information Center

    Martinez, Elisabetta Monari

    2010-01-01

    The valuation of academic achievements in students with severe language impairment is problematic if they also have difficulties in sustaining attention and in praxic skills. In severe autism all of these difficulties may occur together. Multiple-choice tests offer the advantage that simple praxic skills are required, allowing the tasks to be…

  7. MSFC Doppler Lidar Science experiments and operations plans for 1981 airborne test flight

    NASA Technical Reports Server (NTRS)

    Fichtl, G. H.; Bilbro, J. W.; Kaufman, J. W.

    1981-01-01

    The flight experiment and operations plans for the Doppler Lidar System (DLS) are provided. Application of the DLS to the study of severe storms and local weather phenomena is addressed. Test plans involve 66 hours of flight time. Plans also include ground-based severe storm and local weather data acquisition.

  8. Benchmark Lisp And Ada Programs

    NASA Technical Reports Server (NTRS)

    Davis, Gloria; Galant, David; Lim, Raymond; Stutz, John; Gibson, J.; Raghavan, B.; Cheesema, P.; Taylor, W.

    1992-01-01

    Suite of nonparallel benchmark programs, ELAPSE, designed for three tests: comparing the efficiency of computer processing via Lisp vs. Ada; comparing the efficiencies of several computers processing via Lisp; or comparing several computers processing via Ada. Tests the efficiency with which a computer executes routines in each language. Available for any computer equipped with a validated Ada compiler and/or a Common Lisp system.

  9. Validating the Posttraumatic Stress Disorder Symptom Scale with Persons Who Have Severe Mental Illnesses

    ERIC Educational Resources Information Center

    O'Hare, Thomas; Shen, Ce; Sherrer, Margaret

    2007-01-01

    Objective: Interview data collected from 275 clients with severe mental illnesses are used to test the construct and criterion validity of the Posttraumatic Stress Disorder Symptom Scale (PSS). Method: First, exploratory and confirmatory factor analyses are used to test whether the scale reflects the posttraumatic stress disorder (PTSD) symptom…

  10. Conditional Covariance-Based Subtest Selection for DIMTEST

    ERIC Educational Resources Information Center

    Froelich, Amy G.; Habing, Brian

    2008-01-01

    DIMTEST is a nonparametric hypothesis-testing procedure designed to test the assumptions of a unidimensional and locally independent item response theory model. Several previous Monte Carlo studies have found that using linear factor analysis to select the assessment subtest for DIMTEST results in a moderate to severe loss of power when the exam…

  11. Client-centred development of an infrared thermal access switch for a young adult with severe spastic quadriplegic cerebral palsy.

    PubMed

    Memarian, Negar; Venetsanopoulos, Anastasios N; Chau, Tom

    2011-01-01

    This study reports a client-centred development of a non-contact access switch based on an infrared thermal imaging of mouth opening-closing activity of an individual with severe spastic quadriplegic cerebral palsy. Over a 6-month period, the client participated in five test sessions to inform the development of an infrared thermal switch. The client completed eight stimulus-response trials (switch test) and eight word-matching trials (scan test) using the infrared thermal switch and provided subjective feedback throughout. For the switch test, the client achieved an average correct activation rate of 90% and average response time of 2.4 s. His mean correct activation rate on the scan test improved from 65 to 80% over the course of system development, with an average response time of 11.7 s. An infrared thermography switch tuned to a client's extant orofacial gestures is a practical non-invasive access solution and warrants further research in clients with severe physical disability.

  12. Genomic testing to determine drug response: measuring preferences of the public and patients using Discrete Choice Experiment (DCE)

    PubMed Central

    2013-01-01

    Background The extent to which a genomic test will be used in practice is affected by factors such as ability of the test to correctly predict response to treatment (i.e. sensitivity and specificity of the test), invasiveness of the testing procedure, test cost, and the probability and severity of side effects associated with treatment. Methods Using discrete choice experimentation (DCE), we elicited preferences of the public (Sample 1, N = 533 and Sample 2, N = 525) and cancer patients (Sample 3, N = 38) for different attributes of a hypothetical genomic test for guiding cancer treatment. Samples 1 and 3 considered the test/treatment in the context of an aggressive curable cancer (scenario A) while the scenario for sample 2 was based on a non-aggressive incurable cancer (scenario B). Results In aggressive curable cancer (scenario A), everything else being equal, the odds ratio (OR) of choosing a test with 95% sensitivity was 1.41 (versus a test with 50% sensitivity) and willingness to pay (WTP) was $1331, on average, for this amount of improvement in test sensitivity. In this scenario, the OR of choosing a test with 95% specificity was 1.24 times that of a test with 50% specificity (WTP = $827). In non-aggressive incurable cancer (scenario B), the OR of choosing a test with 95% sensitivity was 1.65 (WTP = $1344), and the OR of choosing a test with 95% specificity was 1.50 (WTP = $1080). Reducing severity of treatment side effects from severe to mild was associated with large ORs in both scenarios (OR = 2.10 and 2.24 in scenario A and B, respectively). In contrast, patients had a very large preference for 95% sensitivity of the test (OR = 5.23). Conclusion The type and prognosis of cancer affected preferences for genomically-guided treatment. In aggressive curable cancer, individuals emphasized more on the sensitivity rather than the specificity of the test. In contrast, for a non-aggressive incurable cancer, individuals put similar emphasis on sensitivity and specificity of the test. While the public expressed strong preference toward lowering severity of side effects, improving sensitivity of the test had by far the largest influence on patients’ decision to use genomic testing. PMID:24176050
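
    Assuming the standard conditional-logit relationship in which willingness to pay equals the attribute coefficient divided by the negative of the cost coefficient (the abstract does not spell out the model), the reported OR and WTP for test sensitivity in scenario A imply a cost coefficient that can be cross-checked against the specificity attribute:

    ```python
    import math

    # Reported in the abstract (scenario A): OR = 1.41 for 95% vs 50% sensitivity,
    # with a willingness to pay (WTP) of $1331 for that improvement.
    or_sensitivity = 1.41
    wtp_sensitivity = 1331.0

    # Assumed conditional-logit relationship: WTP = beta_attribute / -beta_cost.
    beta_sensitivity = math.log(or_sensitivity)               # log-odds coefficient
    implied_beta_cost = -beta_sensitivity / wtp_sensitivity   # per-dollar coefficient

    print(f"beta_sensitivity = {beta_sensitivity:.3f}")
    print(f"implied cost coefficient = {implied_beta_cost:.6f} per dollar")
    # Cross-check with the specificity attribute in the same scenario:
    # OR 1.24 -> WTP = ln(1.24) / -implied_beta_cost, approx. $833 (abstract reports $827).
    print(f"predicted WTP for 95% specificity: ${math.log(1.24) / -implied_beta_cost:,.0f}")
    ```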

  13. Ethical Issues of Predictive Genetic Testing for Diabetes

    PubMed Central

    Haga, Susanne B.

    2009-01-01

    With the rising number of individuals affected with diabetes and the significant health care costs of treatment, the emphasis on prevention is key to controlling the health burden of this disease. Several genetic and genomic studies have identified genetic variants associated with increased risk to diabetes. As a result, commercial testing is available to predict an individual's genetic risk. Although the clinical benefits of testing have not yet been demonstrated, it is worth considering some of the ethical implications of testing for this common chronic disease. In this article, I discuss several issues that should be considered during the translation of predictive testing for diabetes, including familial implications, improvement of risk communication, implications for behavioral change and health outcomes, the Genetic Information Nondiscrimination Act, direct-to-consumer testing, and appropriate age of testing. PMID:20144329

  14. A standardized test battery for the study of synesthesia

    PubMed Central

    Eagleman, David M.; Kagan, Arielle D.; Nelson, Stephanie S.; Sagaram, Deepak; Sarma, Anand K.

    2014-01-01

    Synesthesia is an unusual condition in which stimulation of one modality evokes sensation or experience in another modality. Although discussed in the literature well over a century ago, synesthesia slipped out of the scientific spotlight for decades because of the difficulty in verifying and quantifying private perceptual experiences. In recent years, the study of synesthesia has enjoyed a renaissance due to the introduction of tests that demonstrate the reality of the condition, its automatic and involuntary nature, and its measurable perceptual consequences. However, while several research groups now study synesthesia, there is no single protocol for comparing, contrasting and pooling synesthetic subjects across these groups. There is no standard battery of tests, no quantifiable scoring system, and no standard phrasing of questions. Additionally, the tests that exist offer no means for data comparison. To remedy this deficit we have devised the Synesthesia Battery. This unified collection of tests is freely accessible online (http://www.synesthete.org). It consists of a questionnaire and several online software programs, and test results are immediately available for use by synesthetes and invited researchers. Performance on the tests is quantified with a standard scoring system. We introduce several novel tests here, and offer the software for running the tests. By presenting standardized procedures for testing and comparing subjects, this endeavor hopes to speed scientific progress in synesthesia research. PMID:16919755

  15. Evaluation of exercise capacity after severe stroke using robotics-assisted treadmill exercise: a proof-of-concept study.

    PubMed

    Stoller, O; de Bruin, E D; Schindelholz, M; Schuster, C; de Bie, R A; Hunt, K J

    2013-01-01

    Robotics-assisted treadmill exercise (RATE) with focus on motor recovery has become popular in early post-stroke rehabilitation but low endurance for exercise is highly prevalent in these individuals. This study aimed to develop an exercise testing method using robotics-assisted treadmill exercise to evaluate aerobic capacity after severe stroke. Constant load testing (CLT) based on body weight support (BWS) control, and incremental exercise testing (IET) based on guidance force (GF) control were implemented during RATE. Analyses focussed on step change, step response kinetics, and peak performance parameters of oxygen uptake. Three subjects with severe motor impairment 16-23 days post-stroke were included. CLT yielded reasonable step change values in oxygen uptake, whereas response kinetics of oxygen uptake showed low goodness of fit. Peak performance parameters were not obtained during IET. Exercise testing in post-stroke individuals with severe motor impairments using a BWS control strategy for CLT is deemed feasible and safe. Our approach yielded reasonable results regarding cardiovascular performance parameters. IET based on GF control does not provoke peak cardiovascular performance due to uncoordinated walking patterns. GF control needs further development to optimally demand active participation during RATE. The findings warrant further research regarding the evaluation of exercise capacity after severe stroke.

  16. Does concomitant anterior fundoplication promote dysphagia after laparoscopic Heller myotomy?

    PubMed

    Tapper, Donovan; Morton, Connor; Kraemer, Emily; Villadolid, Desiree; Ross, Sharona B; Cowgill, Sarah M; Rosemurgy, Alexander S

    2008-07-01

    Concerns for gastroesophageal reflux after laparoscopic Heller myotomy for achalasia justify considerations of concomitant anterior fundoplication. This study was undertaken to determine if concomitant anterior fundoplication reduces symptoms of reflux after myotomy without promoting dysphagia. From 1992 to 2004, 182 patients underwent laparoscopic Heller myotomy without fundoplication. After a prospective randomized trial justified its concomitant application, anterior fundoplication was undertaken with laparoscopic Heller myotomy in 171 patients from 2004 to 2007. All patients have been prospectively followed. Pre and postoperatively, patients scored the frequency and severity of symptoms of achalasia (including dysphagia, choking, vomiting, regurgitation, chest pain, and heartburn) using a Likert Scale (0 = never/not bothersome to 10 = always/very bothersome). Before myotomy, symptoms of achalasia were frequent and severe for all patients. After myotomy, the frequency and severity of all symptoms of achalasia significantly decreased for all patients (P < 0.001, Wilcoxon matched pairs test). Notably, relative to patients undergoing laparoscopic Heller myotomy alone, concomitant anterior fundoplication led to significantly less frequent and severe heartburn after myotomy (P < 0.05, Mann-Whitney Test) and to less frequent and severe dysphagia and choking (P < 0.05, Mann-Whitney Test). Laparoscopic Heller myotomy reduces the frequency and severity of symptoms of achalasia. Concomitant anterior fundoplication decreases the frequency and severity of heartburn and dysphagia after laparoscopic Heller myotomy. Concomitant anterior fundoplication promotes salutary relief in the frequency and severity of symptoms after myotomy and is warranted.

  17. [Impairment of executive function in elderly patients with major unipolar depression: influence of psychomotor retardation].

    PubMed

    Baudic, Sophie; Benisty, Sarah; Dalla Barba, Gianfranco; Traykov, Latchezar

    2007-03-01

    The results from several studies assessing executive function in depressed patients compared with control subjects have varied from significant impairment to normal performance. To assess executive impairment in elderly patients with major unipolar depression and to evaluate the influence of psychomotor retardation and severity of depression on the executive deficits, the performance of 15 elderly patients with unipolar depression was compared with that of 15 elderly control subjects on executive tasks. The severity of depression was evaluated with the Montgomery-Asberg depression rating scale and that of psychomotor retardation with Widlöcher's scale. In depressed patients, deficits were found on tasks assessing cognitive flexibility (Modified Card Sorting Test (MCST) and Trail Making Test B), planning and strategy elaboration (cognitive estimates), motor initiation (graphic sequences), categorisation and hypothesis generation (MCST), and resistance to interference (Stroop test). However, depressed patients performed normally on the Hayling test assessing inhibition processes. Intensity of psychomotor retardation was not correlated with performance on the executive tasks. Conversely, severity of depression was related to the scores of the MCST (number of errors and perseverations) and the Stroop and Hayling tests (time taken to complete the end of the sentence). Unipolar depressed patients showed deficits in most tasks assessing executive function. However, inhibition processes appeared to be intact in depressed patients, although their implementation was difficult. The severity of depression, but not that of psychomotor retardation, was associated with executive deficits.

  18. Non-allergic cutaneous reactions in airborne chemical sensitivity--a population based study.

    PubMed

    Berg, Nikolaj Drimer; Linneberg, Allan; Thyssen, Jacob Pontoppidan; Dirksen, Asger; Elberling, Jesper

    2011-06-01

    Multiple chemical sensitivity (MCS) is characterised by adverse effects due to exposure to low levels of chemical substances. The aetiology is unknown, but chemical related respiratory symptoms have been found associated with positive patch test. The purpose of this study was to investigate the relationship between cutaneous reactions from patch testing and self-reported severity of chemical sensitivity to common airborne chemicals. A total of 3460 individuals participating in a general health examination, Health 2006, were patch tested with allergens from the European standard series and screened for chemical sensitivity with a standardised questionnaire dividing the participants into four severity groups of chemical sensitivity. Both allergic and non-allergic cutaneous reactions--defined as irritative, follicular, or doubtful allergic reactions--were analysed in relationship with severity of chemical sensitivity. Associations were controlled for the possible confounding effects of sex, age, asthma, eczema, atopic dermatitis, psychological and social factors, and smoking habits. In unadjusted analyses we found associations between allergic and non-allergic cutaneous reactions on patch testing and the two most severe groups of self-reported sensitivity to airborne chemicals. When adjusting for confounding, associations were weakened, and only non-allergic cutaneous reactions were significantly associated with individuals most severely affected by inhalation of airborne chemicals (odds ratio = 2.5, p = 0.006). Our results suggest that individuals with self-reported chemical sensitivity show increased non-allergic cutaneous reactions based on day 2 readings of patch tests. Copyright © 2011 Elsevier GmbH. All rights reserved.

  19. Validation of structural analysis methods using the in-house liner cyclic rigs

    NASA Technical Reports Server (NTRS)

    Thompson, R. L.

    1982-01-01

    Test conditions and variables to be considered in each of the test rigs and test configurations, and also used in the validation of the structural predictive theories and tools, include: thermal and mechanical load histories (simulating an engine mission cycle); different boundary conditions; specimens and components of different dimensions and geometries; different materials; various cooling schemes and cooling hole configurations; several advanced burner liner structural design concepts; and the simulation of hot streaks. Based on these test conditions and test variables, the test matrices for each rig and configuration can be established to verify the predictive tools over as wide a range of test conditions as possible using the simplest possible tests. A flow chart for the thermal/structural analysis of a burner liner, and how the analysis relates to the tests, is shown schematically. The chart shows that several nonlinear constitutive theories are to be evaluated.

  20. Kruskal-Wallis test: BASIC computer program to perform nonparametric one-way analysis of variance and multiple comparisons on ranks of several independent samples.

    PubMed

    Theodorsson-Norheim, E

    1986-08-01

    Multiple t tests at a fixed p level are frequently used to analyse biomedical data where analysis of variance followed by multiple comparisons, or adjustment of the p values according to Bonferroni, would be more appropriate. The Kruskal-Wallis test is a nonparametric 'analysis of variance' which may be used to compare several independent samples. The present program is written in an elementary subset of BASIC and will perform the Kruskal-Wallis test followed by multiple comparisons between the groups on practically any computer programmable in BASIC.
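
    The same analysis is available today in standard libraries; a minimal Python equivalent of the Kruskal-Wallis test followed by Bonferroni-adjusted pairwise comparisons (hypothetical data, not the original BASIC program) might look like:

    ```python
    from itertools import combinations
    from scipy.stats import kruskal, mannwhitneyu

    # Three hypothetical independent samples (e.g., a measurement in three groups).
    groups = {
        "A": [3.1, 2.8, 3.6, 3.3, 2.9, 3.0],
        "B": [3.9, 4.2, 3.8, 4.5, 4.1, 3.7],
        "C": [3.2, 3.4, 3.1, 3.6, 3.3, 3.5],
    }

    h, p = kruskal(*groups.values())
    print(f"Kruskal-Wallis H = {h:.2f}, p = {p:.4f}")

    # Pairwise follow-up comparisons with a Bonferroni correction.
    pairs = list(combinations(groups, 2))
    for g1, g2 in pairs:
        u, p_pair = mannwhitneyu(groups[g1], groups[g2], alternative="two-sided")
        print(f"{g1} vs {g2}: U = {u:.1f}, p(adj) = {min(1.0, p_pair * len(pairs)):.4f}")
    ```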

  1. Benchmarking expert system tools

    NASA Technical Reports Server (NTRS)

    Riley, Gary

    1988-01-01

    As part of its evaluation of new technologies, the Artificial Intelligence Section of the Mission Planning and Analysis Div. at NASA-Johnson has made timing tests of several expert system building tools. Among the production systems tested were Automated Reasoning Tool, several versions of OPS5, and CLIPS (C Language Integrated Production System), an expert system builder developed by the AI section. Also included in the test were a Zetalisp version of the benchmark along with four versions of the benchmark written in Knowledge Engineering Environment, an object oriented, frame based expert system tool. The benchmarks used for testing are studied.

  2. Investigation of optimized graded concrete for Oklahoma.

    DOT National Transportation Integrated Search

    2013-07-01

    This report presents the results of several novel test methods to investigate concrete for slip formed paving. These tests include the Box Test, a novel test to evaluate the response of concrete to vibration, the AIMS2, an automated test for aggregat...

  3. Venting test analysis using Jacob's approximation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edwards, K.B.

    1996-03-01

    There are many sites contaminated by volatile organic compounds (VOCs) in the US and worldwide. Several technologies are available for remediation of these sites, including excavation, pump and treat, biological treatment, air sparging, steam injection, bioventing, and soil vapor extraction (SVE). SVE is also known as soil venting or vacuum extraction. Field venting tests were conducted in alluvial sands residing between the water table and a clay layer. Flow rate, barometric pressure, and well-pressure data were recorded using pressure transmitters and a personal computer. Data were logged as frequently as every second during periods of rapid change in pressure. Tests were conducted at various extraction rates. The data from several tests were analyzed concurrently by normalizing the well pressures with respect to extraction rate. The normalized pressures vary logarithmically with time and fall on one line allowing a single match of the Jacob approximation to all tests. Though the Jacob approximation was originally developed for hydraulic pump test analysis, it is now commonly used for venting test analysis. Only recently, however, has it been used to analyze several transient tests simultaneously. For the field venting tests conducted in the alluvial sands, the air permeability and effective porosity determined from the concurrent analysis are 8.2 × 10⁻⁷ cm² and 20%, respectively.
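
    As a rough sketch of the concurrent straight-line analysis described above, the snippet below normalizes the drawdown of two hypothetical tests by their extraction rates and fits a single line against log10(time). The data values are invented, and the final conversion of slope and intercept to air permeability and effective porosity is only indicated in the comments.

```python
# Sketch of the concurrent Jacob-style straight-line analysis described above:
# well pressures from several tests are normalized by their extraction rates
# and fitted jointly against log10(time). Data values here are hypothetical.
import numpy as np

# (time in s, pressure drawdown in Pa, extraction rate in m^3/s) for two tests
tests = [
    (np.array([10., 30., 100., 300., 1000.]),
     np.array([120., 180., 250., 310., 380.]), 0.02),
    (np.array([10., 30., 100., 300., 1000.]),
     np.array([240., 365., 505., 620., 755.]), 0.04),
]

log_t, s_norm = [], []
for t, drawdown, q in tests:
    log_t.append(np.log10(t))
    s_norm.append(drawdown / q)          # rate-normalized drawdown

log_t = np.concatenate(log_t)
s_norm = np.concatenate(s_norm)

slope, intercept = np.polyfit(log_t, s_norm, 1)   # single line fitted to all tests
t0 = 10 ** (-intercept / slope)                   # time of zero normalized drawdown

print(f"normalized drawdown per log cycle: {slope:.1f}")
print(f"zero-drawdown intercept t0: {t0:.2f} s")
# In the classical hydraulic Cooper-Jacob method these two quantities give
# transmissivity and storativity; the air permeability and effective porosity
# quoted in the abstract follow from the analogous air-flow relations.
```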

  4. [The lymphocyte transformation test in dermatology].

    PubMed

    Zinn, K; Braun-Falco, O

    1976-03-01

    First, the immunologic and methodological bases of the lymphocyte transformation test are discussed. Then the results obtained with this test in several dermatologic diseases are summarized. Finally, the practical use of the lymphocyte transformation test is critically reviewed.

  5. Swallowing Disorders in Severe Brain Injury in the Arousal Phase.

    PubMed

    Bremare, A; Rapin, A; Veber, B; Beuret-Blanquart, F; Verin, E

    2016-08-01

    The objective of this study was to determine the clinical characteristics of swallowing disorders in severe brain injury in the arousal phase after coma. Between December 1, 2013 and June 30, 2014, eleven patients with severe acquired brain injury who were admitted to a rehabilitation center (81.8% male; mean age 40.7 ± 14.6 years) were included in the study. Evaluation of swallowing included a functional examination, a clinical functional swallowing test, and a naso-endoscopic swallowing test. All patients had swallowing disorders at admission. The first functional swallowing test showed oral (77.8%) and pharyngeal (66.7%) food bolus transport disorders, and alterations in airway protection mechanisms (80%). The swallowing test under endoscopic control showed a disorder of swallowing coordination in 55.6% of patients tested. Seven (63.6%) patients resumed oral feeding within an average of 6 weeks after admission to the rehabilitation center and 14 weeks after acquired brain injury. Six (85.7%) of these seven patients continued to require modified solid and liquid textures. Swallowing disorders are a major concern in severe brain injury in the arousal phase. Early bedside assessment of swallowing is essential for detection of swallowing disorders so that appropriate medical rehabilitation care can be proposed to these patients in a state of altered consciousness.

  6. Blood Tests for People with Severe Learning Disabilities Receiving Dental Treatment under General Anaesthesia.

    PubMed

    Clough, Stacey; Shehabi, Zahra; Morgan, Claire; Sheppey, Claire

    2016-11-01

    People with learning disabilities (LDs) have poorer health than their non-disabled peers due to failures in reasonable adjustments. One hundred patients with severe LD and challenging behaviour attended for dental treatment under GA, during which routine blood testing was provided. Communication with general medical practitioners (GMPs) and blood test results were evaluated, showing poor communication with GMPs and significant undiagnosed disease among this group. Blood tests generate similar costs in primary and secondary care but a holistic approach to care under GA reduces expenses brought by lost clinical time and resources due to complex behaviours in an out-patient setting. Clinical relevance: This article discusses a holistic approach to healthcare for people with severe LD, including patient outcomes, financial and resource implications, and offers practical guidance on venepuncture technique, which is relevant to many aspects of both community and hospital dental practice.

  7. A Vision-Based Dynamic Rotational Angle Measurement System for Large Civil Structures

    PubMed Central

    Lee, Jong-Jae; Ho, Hoai-Nam; Lee, Jong-Han

    2012-01-01

    In this paper, we propose a vision-based rotational angle measurement system for large-scale civil structures. Although several rotation angle measurement systems were introduced during the last decade, they often required complex and expensive equipment; alternative, effective solutions with high resolution are therefore in great demand. The proposed system consists of commercial PCs, commercial camcorders, low-cost frame grabbers, and a wireless LAN router. The rotation angle is calculated using image processing techniques with pre-measured calibration parameters. Several laboratory tests were conducted to verify the performance of the proposed system. Compared with a commercial rotation angle measurement device, the results of the system showed very good agreement, with an error of less than 1.0% in all test cases. Furthermore, several tests were conducted on the five-story modal testing tower with a hybrid mass damper to experimentally verify the feasibility of the proposed system. PMID:22969348
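
    The abstract does not give the image-processing algorithm itself, so the following is only a hypothetical sketch of the underlying geometry: the rotation angle is taken as the change in orientation of the line joining two tracked targets between a reference frame and the current frame. A real system such as the one described would first correct the pixel coordinates with its pre-measured calibration parameters.

```python
# Hypothetical sketch of the geometry behind a vision-based rotation measurement:
# the orientation of the line joining two tracked targets is compared between a
# reference frame and the current frame. Pixel coordinates are assumed to have
# already been corrected with the system's calibration parameters.
import math

def line_angle(p1, p2):
    """Orientation (radians) of the line from p1 to p2 in image coordinates."""
    return math.atan2(p2[1] - p1[1], p2[0] - p1[0])

def rotation_angle(ref_p1, ref_p2, cur_p1, cur_p2):
    """Rotation (degrees) of the target pair between the reference and current frame."""
    delta = line_angle(cur_p1, cur_p2) - line_angle(ref_p1, ref_p2)
    return (math.degrees(delta) + 180.0) % 360.0 - 180.0   # wrap to [-180, 180)

# Example with made-up target centroids (pixels) in two frames
print(rotation_angle((100, 200), (400, 205), (102, 198), (398, 230)))
```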

  8. A vision-based dynamic rotational angle measurement system for large civil structures.

    PubMed

    Lee, Jong-Jae; Ho, Hoai-Nam; Lee, Jong-Han

    2012-01-01

    In this paper, we propose a vision-based rotational angle measurement system for large-scale civil structures. Although several rotation angle measurement systems were introduced during the last decade, they often required complex and expensive equipment; alternative, effective solutions with high resolution are therefore in great demand. The proposed system consists of commercial PCs, commercial camcorders, low-cost frame grabbers, and a wireless LAN router. The rotation angle is calculated using image processing techniques with pre-measured calibration parameters. Several laboratory tests were conducted to verify the performance of the proposed system. Compared with a commercial rotation angle measurement device, the results of the system showed very good agreement, with an error of less than 1.0% in all test cases. Furthermore, several tests were conducted on the five-story modal testing tower with a hybrid mass damper to experimentally verify the feasibility of the proposed system.

  9. Hygrothermal properties of composites

    NASA Technical Reports Server (NTRS)

    Arsenovic, Petar

    1996-01-01

    The testing procedure and acceptance criteria for outgassing selection of materials to be used in spacecraft have been reviewed. Outgassing testing should be conducted according to ASTM Standard E 595-90. In general, materials with CVCM less than or equal to 0.10% and TML less than or equal to 1.00% are acceptable for space applications. Next, test data on several types of graphite-epoxy composite materials are presented over time at various relative humidity levels at room temperature for moisture absorption, and under vacuum at several temperatures for moisture desorption (outgassing). The data can be accurately represented by simple equations which are useful for materials characterization. Finally, a laser dilatometer system of extremely high sensitivity and accuracy was assembled and used to measure the coefficient of thermal expansion (CTE) of several types of graphite-epoxy structures, culminating in the ability to perform loading and thermal expansion tests on a prototype optical bench.

  10. Douglas Aircraft cabin fire tests

    NASA Technical Reports Server (NTRS)

    Klinck, D.

    1978-01-01

    Program objectives are outlined as follows: (1) examine the thermal and environmental characteristics of three types of fuels burned in two quantities contained within a metal lavatory; (2) determine the hazard experienced in opening the door of a lavatory containing a developed fire; (3) select the most severe source fuel for use in a baseline test; and (4) evaluate the effect of the most severe source upon a lavatory constructed of contemporary materials. All tests were conducted in the Douglas Cabin Fire Simulator.

  11. Measuring Quadriceps strength in adults with severe or moderate intellectual and visual disabilities: Feasibility and reliability.

    PubMed

    Dijkhuizen, Annemarie; Douma, Rob K; Krijnen, Wim P; van der Schans, Cees P; Waninge, Aly

    2018-05-30

    A feasible and reliable instrument to measure strength in persons with severe intellectual and visual disabilities (SIVD) is lacking. The aim of our study was to determine the feasibility, learning period and reliability of three strength tests. Twenty-nine participants with SIVD performed the Minimum Sit-to-Stand Height test (MSST), the Leg Extension test (LE) and the 30 seconds Chair-Stand test (30sCS), once per week for 5 weeks. Feasibility was determined by the percentage of successful measurements; the learning effect by paired t tests between two consecutive measurements; test-retest reliability by the intraclass correlation coefficient and Limits of Agreement; and correlations by Pearson correlation coefficients. The tests showed sufficient feasibility and an acceptable learning period. The methods had sufficient test-retest reliability and moderate-to-sufficient correlations. The MSST, the LE, and the 30sCS are feasible tests for measuring muscle strength in persons with SIVD, having sufficient test-retest reliability. © 2018 John Wiley & Sons Ltd.

  12. Changing pattern of natural hazards due to extreme hydro-meteorological conditions (Apulia, southern Italy)

    NASA Astrophysics Data System (ADS)

    Polemio, Maurizio; Lonigro, Teresa

    2013-04-01

    Recent international research has underlined the evidence of climate change throughout the world. Among the consequences of climate change is an increase in the frequency and magnitude of natural disasters, such as droughts, windstorms, heat waves, landslides, floods and secondary floods (i.e. rapid accumulation or ponding of surface water with very low flow velocity). Damaging Hydrogeological Events (DHEs) can be defined as the occurrence of one or more simultaneous aforementioned phenomena causing damage. They represent a serious problem, especially in DHE-prone areas with growing urbanisation. In these areas the increasing frequency of extreme hydrological events could be related to climate variations and/or urban development. The historical analysis of DHEs can support decision making and land-use planning, ultimately reducing natural risks. The paper proposes a methodology, based on both historical and time series approaches, for describing the influence of climatic variability on the number of phenomena observed. The historical approach aims to collect historical data on these phenomena. Historical flood and landslide data are important for understanding the evolution of a study area and for estimating risk scenarios as a basis for civil protection purposes. Such historical data are also useful for expanding the period of investigation in order to assess the occurrence trend of DHEs. The time series approach includes the collection and statistical analysis of climatic and rainfall data (monthly rainfall, wet days, rainfall intensity, and temperature data, together with annual maxima of short-duration rainfall for durations from 1 hour to 5 days), which are also used as a proxy for floods and landslides. The climatic and rainfall data are useful to characterise climate variations and trends and to roughly assess the effects of these trends on river discharge and on the triggering of landslides. The time series approach is completed by tools to analyse all data types simultaneously. The methodology was tested on a selected Italian region (Apulia, southern Italy). The data were collected in two databases: a damaging hydrogeological event database (1186 landslides and floods since 1918) and a climate database (from 1877; short-duration rainfall from 1921). A statistically significant decreasing trend of rainfall intensity and increasing trends of temperature, landslides, and DHEs were observed, together with a generalised decreasing trend of short-duration rainfall. As there is no evident relationship between climate variability and the variability of DHE occurrences, the role of anthropogenic modifications (increasing use or misuse of flood- and landslide-prone areas) can be hypothesised to explain the increasing occurrence of floods and landslides. This study identifies the advantages of a simplifying approach that reduces the intrinsic complexities of the spatial-temporal analysis of climate variability, permitting the simultaneous analysis of the modification of flood and landslide occurrences.
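
    The abstract does not name the trend test applied to these series; a common nonparametric choice for hydro-climatic data is a Mann-Kendall-style check, sketched below on a synthetic annual count series (the data and the choice of test are assumptions made for illustration only).

```python
# Illustrative Mann-Kendall-style trend check on a synthetic annual series of
# damaging-event counts; neither the data nor the test choice come from the paper.
import numpy as np
from scipy import stats

years = np.arange(1961, 2001)
rng = np.random.default_rng(0)
dhe_counts = rng.poisson(lam=np.linspace(3.0, 9.0, years.size))  # upward-drifting counts

tau, p_value = stats.kendalltau(years, dhe_counts)   # monotonic-trend statistic
sen_slope = stats.theilslopes(dhe_counts, years)[0]  # robust slope estimate

print(f"Kendall tau = {tau:.2f}, p = {p_value:.4f}, slope = {sen_slope:.2f} events/yr")
```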

  13. Understanding and Applying the QAR Strategy to Improve Test Scores

    ERIC Educational Resources Information Center

    Cummins, Sean; Streiff, Melissa; Ceprano, Maria

    2012-01-01

    The academic landscape has been changing over the last several years, bringing with it an emphasis on high-stakes testing. Studies conducted over the past several years have shown the success of the Question-Answer-Relationships (QAR) strategy in helping students develop their comprehension skills. This study looks at the effects of the QAR…

  14. Development of Multiple Regression Equations To Predict Fourth Graders' Achievement in Reading and Selected Content Areas.

    ERIC Educational Resources Information Center

    Hafner, Lawrence E.

    A study developed a multiple regression prediction equation for each of six selected achievement variables in a popular standardized test of achievement. Subjects, 42 fourth-grade pupils randomly selected across several classes in a large elementary school in a north Florida city, were administered several standardized tests to determine predictor…

  15. The Geography of Racial/Ethnic Test Score Gaps. CEPA Working Paper No. 16-10

    ERIC Educational Resources Information Center

    Reardon, Sean F.; Kalogrides, Demetra; Shores, Ken

    2017-01-01

    We estimate racial/ethnic achievement gaps in several hundred metropolitan areas and several thousand school districts in the United States using the results of roughly 200 million standardized math and reading tests administered to public school students from 2009-2013. We show that achievement gaps vary substantially, ranging from nearly 0 in…

  16. Association of MTHFR polymorphism and periodontitis’ severity in Indonesian males

    NASA Astrophysics Data System (ADS)

    Auerkari, E. I.; Purwandhita, R.; Kim, K. R.; Djamal, N.; Masulili, S. L. C.; Suryandari, D. A.; Talbot, C.

    2018-05-01

    Periodontitis is an oral disease with a complex etiology and pathogenesis, but with a suspected contribution from genetic factors. This study aimed to assess the association between polymorphism in the MTHFR (methylene tetrahydrofolate reductase, C677T) gene and the severity of periodontitis in Indonesian males. Severity of periodontitis was classified as mild, moderate or severe for 100 consenting male Indonesians aged 25 to 60 years. Using PCR amplification of DNA extracted from blood serum samples, variation at the MTHFR (C677T) polymorphism was evaluated by RFLP, cutting with the restriction enzyme HinfI and subjecting the fragments to electrophoresis on agarose gel. Chi-square testing was mainly used for statistical assessment of the results. The CC genotype (wild type) of the tested polymorphism was the most common variant (78%) and the TT (mutant) genotype relatively rare (2%), so that the C allele appeared in 88% of cases and the T allele in 12% of cases. The results suggest that there is no significant association between the MTHFR C677T polymorphism and the severity of periodontitis in the tested Indonesian males.
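
    For readers unfamiliar with the chi-square assessment mentioned above, the snippet below shows the kind of genotype-by-severity contingency test involved; the counts are hypothetical and are not the study's data.

```python
# Hedged sketch of a chi-square association test between genotype and disease
# severity; the counts below are hypothetical, not the study's data.
import numpy as np
from scipy import stats

# rows: periodontitis severity (mild, moderate, severe)
# columns: MTHFR C677T genotype (CC, CT, TT)
table = np.array([
    [26, 7, 1],
    [27, 6, 0],
    [25, 7, 1],
])

chi2, p, dof, expected = stats.chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
# Note: cells with small expected counts (as for the rare TT genotype here)
# would normally call for an exact test instead.
```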

  17. Why do trees die? Characterizing the drivers of background tree mortality.

    PubMed

    Das, Adrian J; Stephenson, Nathan L; Davis, Kristin P

    2016-10-01

    The drivers of background tree mortality rates-the typical low rates of tree mortality found in forests in the absence of acute stresses like drought-are central to our understanding of forest dynamics, the effects of ongoing environmental changes on forests, and the causes and consequences of geographical gradients in the nature and strength of biotic interactions. To shed light on factors contributing to background tree mortality, we analyzed detailed pathological data from 200,668 tree-years of observation and 3,729 individual tree deaths, recorded over a 13-yr period in a network of old-growth forest plots in California's Sierra Nevada mountain range. We found that: (1) Biotic mortality factors (mostly insects and pathogens) dominated (58%), particularly in larger trees (86%). Bark beetles were the most prevalent (40%), even though there were no outbreaks during the study period; in contrast, the contribution of defoliators was negligible. (2) Relative occurrences of broad classes of mortality factors (biotic, 58%; suppression, 51%; and mechanical, 25%) are similar among tree taxa, but may vary with tree size and growth rate. (3) We found little evidence of distinct groups of mortality factors that predictably occur together on trees. Our results have at least three sets of implications. First, rather than being driven by abiotic factors such as lightning or windstorms, the "ambient" or "random" background mortality that many forest models presume to be independent of tree growth rate is instead dominated by biotic agents of tree mortality, with potentially critical implications for forecasting future mortality. Mechanistic models of background mortality, even for healthy, rapidly growing trees, must therefore include the insects and pathogens that kill trees. Second, the biotic agents of tree mortality, instead of occurring in a few predictable combinations, may generally act opportunistically and with a relatively large degree of independence from one another. Finally, beyond the current emphasis on folivory and leaf defenses, studies of broad-scale gradients in the nature and strength of biotic interactions should also include biotic attacks on, and defenses of, tree stems and roots. © 2016 by the Ecological Society of America.

  18. Why do trees die? Characterizing the drivers of background tree mortality

    USGS Publications Warehouse

    Das, Adrian J.; Stephenson, Nathan L.; Davis, Kristin P.

    2016-01-01

    The drivers of background tree mortality rates—the typical low rates of tree mortality found in forests in the absence of acute stresses like drought—are central to our understanding of forest dynamics, the effects of ongoing environmental changes on forests, and the causes and consequences of geographical gradients in the nature and strength of biotic interactions. To shed light on factors contributing to background tree mortality, we analyzed detailed pathological data from 200,668 tree-years of observation and 3,729 individual tree deaths, recorded over a 13-yr period in a network of old-growth forest plots in California's Sierra Nevada mountain range. We found that: (1) Biotic mortality factors (mostly insects and pathogens) dominated (58%), particularly in larger trees (86%). Bark beetles were the most prevalent (40%), even though there were no outbreaks during the study period; in contrast, the contribution of defoliators was negligible. (2) Relative occurrences of broad classes of mortality factors (biotic, 58%; suppression, 51%; and mechanical, 25%) are similar among tree taxa, but may vary with tree size and growth rate. (3) We found little evidence of distinct groups of mortality factors that predictably occur together on trees. Our results have at least three sets of implications. First, rather than being driven by abiotic factors such as lightning or windstorms, the “ambient” or “random” background mortality that many forest models presume to be independent of tree growth rate is instead dominated by biotic agents of tree mortality, with potentially critical implications for forecasting future mortality. Mechanistic models of background mortality, even for healthy, rapidly growing trees, must therefore include the insects and pathogens that kill trees. Second, the biotic agents of tree mortality, instead of occurring in a few predictable combinations, may generally act opportunistically and with a relatively large degree of independence from one another. Finally, beyond the current emphasis on folivory and leaf defenses, studies of broad-scale gradients in the nature and strength of biotic interactions should also include biotic attacks on, and defenses of, tree stems and roots.

  19. Scale is not an Issue : Opportunities for Collaboration among Geoscientists in Latin America and the Caribbean

    NASA Astrophysics Data System (ADS)

    Carby, B. E.

    2015-12-01

    Latin American and Caribbean (LAC) countries face multiple hazards such as earthquakes, volcanoes, accelerated erosion, landslides, drought, flooding, windstorms and the effects of climate variability and change. World Bank (2005) data indicate that seventeen of the top thirty-five countries with relatively high mortality risk from 3 or more hazards are located in LAC; El Salvador has the second highest percentage of its population at risk (77.7%); and 7 of the top 10 countries for population exposure to multiple hazards are in LAC. All LAC countries have half or more of GDP exposed to at least one hazard. The report underscores the need for better data and information on hazards and disasters to inform disaster risk reduction (DRR) and supports the view that reduction of disaster risk is essential for achieving Sustainable Development (SD). This suggests that DRR must be integrated into the development planning of countries. However, the Global Assessment Report notes that globally there has been little progress in mainstreaming DRR in national development (UNISDR 2009). Without this, countries will not realise development goals. DRR efforts in LAC require an integrated approach, including societal input in deciding priority DRR research themes and interdisciplinary, multi-hazard research informing DRR policy and practice. Jiminez (2015), in a study of countries across LAC, reports that efforts are being made to link research to national planning through the inclusion of policy makers in some university-led research projects. Research by the author in Jamaica reveals that the public sector has started to apply research on hazards to inform DRR policy, programmes and plans. As most research is done by universities, there is collaboration between the public sector and academia. Despite differences in scale among countries across the region, similarities in exposure to multiple hazards and potential hazard impacts suggest that collaboration among researchers in LAC could be beneficial. It is proposed here that this collaboration should go beyond the scientific community and should include sharing of experiences in linking DRR research to national development needs, inclusion of policy makers in research design and implementation, and integration of research results in policy and programme development.

  20. Hydrometeorological extremes and their impacts derived from taxation records for south-eastern Moravia (Czech Republic) in the period 1751-1900

    NASA Astrophysics Data System (ADS)

    Chromá, K.; Brázdil, R.; Valášek, H.; Dolák, L.

    2012-04-01

    Hydrometeorological extremes have always influenced human activities and caused great material damage or even loss of human lives. In the Czech Lands (recently the Czech Republic), systematic meteorological and hydrological observations started generally in the latter half of the 19th century. In order to create long-term series of hydrometeorological extremes, it is necessary to search for other sources of information for their study before 1850. In this study, written records associated with tax relief at ten estates located in south-eastern Moravia are used for the study of hydrometeorological extremes and their impacts during the period 1751-1900. The taxation system in Moravia allowed farmers to request tax relief if their crop yields had been negatively affected by hydrological and meteorological extremes. The documentation involved contains information about the type of extreme event and the date of its occurrence, and the impacts on crops may often be derived. A total of 175 extreme events resulting in some kind of damage is documented for 1751-1900, with the highest concentration between 1811 and 1860. The events leading to damage (of a possible 272 types) include hailstorms (25.7%), torrential rain (21.7%), and floods (21.0%), followed by thunderstorms, flash floods, late frost and windstorms. The four most outstanding events, affecting the highest number of settlements, were thunderstorms with hailstorms (25 June 1825, 20 May 1847 and 29 June 1890) and flooding of the River Morava (mid-June 1847). Hydrometeorological extremes in the 1816-1855 period are compared with those occurring during the recent 1961-2000 period. The results obtained are inevitably influenced by uncertainties related to taxation records, such as their temporal and spatial incompleteness, the limits of the period of outside agricultural work (i.e. mainly May-August) and the purpose for which they were originally collected (primarily tax alleviation, i.e. information about hydrometeorological extremes was of secondary importance). Taxation records represent an important source of data for historical climatology and historical hydrology and have great potential for use.

  1. Safety and efficacy of Regadenoson in myocardial perfusion imaging (MPI) stress tests: A review

    NASA Astrophysics Data System (ADS)

    Ahmed, Ambereen

    2018-02-01

    Myocardial perfusion imaging (MPI) tests are often used to help diagnose coronary artery disease (CAD). The tests usually involve applying stress, such as hard physical exercise together with administration of vasodilators, to the patients. To date, many of these tests use non-selective A2A adenosine receptor agonists which, however, can be associated with highly undesirable and life-threatening side effects such as chest pain, dyspnea, severe bronchoconstriction and atrioventricular conduction anomalies. Regadenoson is a relatively new, highly selective A2A adenosine receptor agonist suitable for use in MPI tests; it exhibits far fewer adverse side effects and, unlike other testing agents, can be used without the necessity of excessive concomitant exercise. Also, the dose of regadenoson required does not depend on patient weight or renal impairment, and it can be rapidly administered by i.v. injection. Regadenoson use in MPI testing thus has the potential to serve as a simplified, relatively safe, time-saving and cost-effective method for helping diagnose CAD. The present study was designed to review several articles on the safety, efficacy, and suitability of regadenoson in MPI testing for CAD. Overall, the combined studies demonstrated that use of regadenoson in conjunction with low-level exercise in MPI is a highly efficient and relatively safe test for CAD, especially for patients with more severely compromised health.

  2. Workplace drug testing and worker drug use.

    PubMed

    Carpenter, Christopher S

    2007-04-01

    To examine the nature and extent of the association between workplace drug testing and worker drug use. Repeated cross-sections from the 2000 to 2001 National Household Surveys on Drug Abuse (NHSDA) and the 2002 National Survey on Drug Use and Health (NSDUH). Multivariate logistic regression models of the likelihood of marijuana use are estimated as a function of several different workplace drug policies, including drug testing. Specific questions about penalty severity and the likelihood of detection are used to further evaluate the nature of the association. Individuals whose employers perform drug tests are significantly less likely to report past-month marijuana use, even after controlling for a wide array of worker and job characteristics. However, large negative associations are also found for variables indicating whether a firm has drug education, an employee assistance program, or a simple written policy about substance use. Accounting for these other workplace characteristics reduces, but does not eliminate, the testing differential. Frequent testing and severe penalties reduce the likelihood that workers use marijuana. Previous studies have interpreted the large negative correlation between workplace drug testing and employee substance use as representing a causal deterrent effect of drug testing. Our results using more comprehensive data suggest that these estimates have been slightly overstated due to omitted variables bias. The overall pattern of results remains largely consistent with the hypothesis that workplace drug testing deters worker drug use.
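
    The core model described above is a multivariate logistic regression of past-month marijuana use on workplace-policy indicators plus worker and job controls; the sketch below shows that kind of model on simulated data (all variable names and values are hypothetical, not the NHSDA/NSDUH data).

```python
# Minimal sketch of the kind of logistic regression described above; the
# variable names and the simulated data are hypothetical, not the survey data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 2000
df = pd.DataFrame({
    "drug_testing":   rng.integers(0, 2, n),   # employer performs drug tests
    "written_policy": rng.integers(0, 2, n),   # written substance-use policy
    "drug_education": rng.integers(0, 2, n),
    "age":            rng.integers(18, 65, n),
    "male":           rng.integers(0, 2, n),
})
# simulated outcome with a negative association for drug testing
logit_p = (-1.0 - 0.6 * df["drug_testing"] - 0.2 * df["written_policy"]
           - 0.01 * (df["age"] - 40))
df["marijuana_use"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

X = sm.add_constant(df[["drug_testing", "written_policy", "drug_education", "age", "male"]])
model = sm.Logit(df["marijuana_use"], X).fit(disp=False)
print(np.exp(model.params))   # odds ratios
```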

  3. Predictors of the On-Road Driving Assessment After Traumatic Brain Injury: Comparing Cognitive Tests, Injury Factors, and Demographics.

    PubMed

    McKay, Adam; Liew, Carine; Schönberger, Michael; Ross, Pamela; Ponsford, Jennie

    (1) To examine the relations between performance on cognitive tests and on-road driving assessment in a sample of persons with traumatic brain injury (TBI). (2) To compare cognitive predictors of the on-road assessment with demographic and injury-related predictors. Ninety-nine people with mild to severe TBI who completed an on-road driving assessment in an Australian rehabilitation setting. Retrospective case series. Wechsler Test of Adult Reading or National Adult Reading Test-Revised; 4 subtests from the Wechsler Adult Intelligence Scale-III; Rey Auditory Verbal Learning Test; Rey Complex Figure Test; Trail Making Test; demographic factors (age, sex, years licensed); and injury-related factors (duration of posttraumatic amnesia; time postinjury). Participants who failed the driving assessment did worse on measures of attention, visual memory, and executive processing; however, cognitive tests were weak correlates (r values <0.3) and poor predictors of the driving assessment. Posttraumatic amnesia duration, mediated by time postinjury, was the strongest predictor of the driving assessment; that is, participants with more severe TBIs had later driving assessments and were more likely to fail. Cognitive tests are not reliable predictors of the on-road driving assessment outcome. Traumatic brain injury severity may be a better predictor of on-road driving; however, further research is needed to identify the best predictors of driving behavior after TBI.

  4. Roles of preoperative arterial blood gas tests in the surgical treatment of scoliosis with moderate or severe pulmonary dysfunction.

    PubMed

    Liu, Jia-Ming; Shen, Jian-Xiong; Zhang, Jian-Guo; Zhao, Hong; Li, Shu-Gang; Zhao, Yu; Qiu, Giu-Xing

    2012-01-01

    It has been stated that preoperative pulmonary function tests are essential to assess the surgical risk in patients with scoliosis. Arterial blood gas tests have also been used to evaluate pulmonary function before scoliotic surgery. However, few studies have been reported. The aim of this study was to investigate the roles of preoperative arterial blood gas tests in the surgical treatment of scoliosis with moderate or severe pulmonary dysfunction. This study involved scoliotic patients with moderate or severe pulmonary dysfunction (forced vital capacity < 60%) who underwent surgical treatment between January 2002 and April 2010. A total of 73 scoliotic patients (23 males and 50 females) with moderate or severe pulmonary dysfunction were included. The average age of the patients was 16.53 years (range, 10 - 44). The demographic distribution, medical records, and radiographs of all patients were collected. All patients received arterial blood gas tests and pulmonary function tests before surgery. The arterial blood gas tests included five parameters: partial pressure of arterial oxygen, partial pressure of arterial carbon dioxide, alveolar-arterial oxygen tension gradient, pH, and standard base excess. The pulmonary function tests included three parameters: forced expiratory volume in 1 second ratio, forced vital capacity ratio, and peak expiratory flow ratio. All five parameters of the arterial blood gas tests were compared between the two groups with or without postoperative pulmonary complications by variance analysis. Similarly, all three parameters of the pulmonary function tests were compared. The average coronal Cobb angle before surgery was 97.42° (range, 50° - 180°). A total of 15 (20.5%) patients had postoperative pulmonary complications, including hypoxemia in 5 cases (33.3%), increased requirement for postoperative ventilatory support in 4 (26.7%), pneumonia in 2 (13.3%), atelectasis in 2 (13.3%), pneumothorax in 1 (6.7%), and hydrothorax in 1 (6.7%). No significant differences in demographic characteristics or perioperative factors (P > 0.05) existed between the two groups with or without postoperative pulmonary complications. According to the variance analysis, there were no statistically significant differences in any parameter of the arterial blood gas tests between the two groups. No significant correlation between the results of the preoperative arterial blood gas tests and postoperative pulmonary complications existed in scoliotic patients with moderate or severe pulmonary dysfunction. However, the postoperative complications tended to increase with the decrease of partial pressure of arterial oxygen in the arterial blood gas tests.

  5. A discussion on disease severity index values: using the disease severity index for null hypothesis testing

    USDA-ARS?s Scientific Manuscript database

    A disease severity index (DSI) is a single number for summarizing a large amount of information on disease severity. It has been used to indicate the performance of a cultivar in regard to disease resistance at a particular location, to relate disease severity to yield loss, to determine the effecti...
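
    The record is truncated and does not quote the exact formula used in the manuscript; the sketch below therefore uses the widespread textbook definition of a disease severity index purely as an illustration.

```python
# Common form of a disease severity index, given ordinal ratings per plant;
# the manuscript's exact formula is not quoted in the abstract, so this
# widespread definition is assumed for illustration.
def disease_severity_index(ratings, max_rating):
    """DSI (%) = sum of ratings / (number of units * maximum rating) * 100."""
    return 100.0 * sum(ratings) / (len(ratings) * max_rating)

# Hypothetical ratings of 10 plants on a 0-5 ordinal scale
print(disease_severity_index([0, 1, 3, 2, 5, 4, 0, 1, 2, 3], max_rating=5))  # 42.0
```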

  6. Symptom Presentation and Prescription of Sleep Medications for Veterans With Posttraumatic Stress Disorder.

    PubMed

    Greenbaum, Mark A; Neylan, Thomas C; Rosen, Craig S

    2017-02-01

    This study tested whether sleep medications prescribed to veterans diagnosed with posttraumatic stress disorder (PTSD) are being targeted to patients who report more severe insomnia or nightmares. Secondary analysis of survey and pharmacy data was conducted in samples of veterans from two periods: from 2006 to 2008 and from 2009 to 2013. Logistic regression tested associations between self-reported insomnia and nightmare severity, and being prescribed trazodone, prazosin, zolpidem, and benzodiazepines, controlling for PTSD severity and other covariates. In both samples, insomnia severity independently predicted trazodone receipt, and nightmare severity independently predicted prazosin receipt. In the later study, insomnia severity predicted receipt of zolpidem. Veterans in the later sample were more likely to receive trazodone, prazosin, and non-benzodiazepine hypnotics, and less likely to receive benzodiazepines than those in the earlier sample. Further research is needed to evaluate and optimize pharmacological and psychosocial treatments for sleep problems among veterans with PTSD.

  7. Program CONTRAST--A general program for the analysis of several survival or recovery rate estimates

    USGS Publications Warehouse

    Hines, J.E.; Sauer, J.R.

    1989-01-01

    This manual describes the use of program CONTRAST, which implements a generalized procedure for the comparison of several rate estimates. This method can be used to test both simple and composite hypotheses about rate estimates, and we discuss its application to multiple comparisons of survival rate estimates. Several examples of the use of program CONTRAST are presented. Program CONTRAST will run on IBM-compatible computers, and requires estimates of the rates to be tested, along with associated variance and covariance estimates.
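
    The manual itself contains the program details; purely as an illustration of the kind of generalized chi-square contrast statistic involved in comparing several rate estimates, a small sketch follows (the estimates, covariance matrix, and contrast matrix are all hypothetical).

```python
# Sketch of a generalized chi-square contrast statistic for comparing several
# rate estimates (values are hypothetical):
# chi2 = (C theta)' (C Sigma C')^{-1} (C theta), with dof = rank of C.
import numpy as np
from scipy import stats

theta = np.array([0.62, 0.58, 0.71])           # survival rate estimates
sigma = np.diag([0.0016, 0.0020, 0.0014])      # variance-covariance matrix

# Contrast matrix testing H0: all rates equal (rate1 - rate2 = 0, rate2 - rate3 = 0)
C = np.array([[1.0, -1.0, 0.0],
              [0.0, 1.0, -1.0]])

d = C @ theta
chi2 = d @ np.linalg.inv(C @ sigma @ C.T) @ d
dof = np.linalg.matrix_rank(C)
p = stats.chi2.sf(chi2, dof)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
```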

  8. The effects of simulated fog and motion on simulator sickness in a driving simulator and the duration of after-effects.

    PubMed

    Dziuda, Lukasz; Biernacki, Marcin P; Baran, Paulina M; Truszczyński, Olaf E

    2014-05-01

    In the study, we checked: 1) how the simulator test conditions affect the severity of simulator sickness symptoms; 2) how the severity of simulator sickness symptoms changes over time; and 3) whether the conditions of the simulator test affect the severity of these symptoms in different ways, depending on the time that has elapsed since the performance of the task in the simulator. We studied 12 men aged 24-33 years (M = 28.8, SD = 3.26) using a truck simulator. The SSQ questionnaire was used to assess the severity of the symptoms of simulator sickness. Each of the subjects performed three 30-minute tasks running along the same route in a driving simulator. Each of these tasks was carried out in a different simulator configuration: A) fixed base platform with poor visibility; B) fixed base platform with good visibility; and C) motion base platform with good visibility. The measurement of the severity of the simulator sickness symptoms took place in five consecutive intervals. The results of the analysis showed that the simulator test conditions affect the severity of the simulator sickness symptoms in different ways, depending on the time that has elapsed since performing the task on the simulator. The simulator sickness symptoms persisted at the highest level for the test conditions involving the motion base platform. Also, when performing the tasks on the motion base platform, the severity of the simulator sickness symptoms varied depending on the time that had elapsed since performing the task. Specifically, the addition of motion to the simulation increased the oculomotor and disorientation symptoms reported as well as the duration of the after-effects. Copyright © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  9. Genetic Testing Integration Panels (GTIPs): A novel approach for considering integration of direct-to-consumer and other new genetic tests into patient care

    PubMed Central

    Uhlmann, Wendy R.; Sharp, Richard R.

    2014-01-01

    There has been a dramatic increase in the number of genetic tests available but few tests have practice guidelines. In addition, many tests have become available outside of genetics clinics through direct-to-consumer (DTC) companies and several offer tests not considered standard of care. To address several practical challenges associated with the rapid introduction of clinical and DTC genetic tests, we propose that genetic counselors and geneticists organize expert panels in their institutions to discuss the integration of new tests into patient care. We propose the establishment of Genetic Testing Integration Panels (GTIPs) to bring together local experts in medical genetics, genetic counseling, bioethics and law, health communication and clinical laboratory genetics. We describe key features of this approach and consider some of the potential advantages and limitations of using a GTIP to address the many clinical challenges raised by rapidly emerging clinical and DTC genetic tests. PMID:22246561

  10. Graded Aerobic Treadmill Testing in Adolescent Traumatic Brain Injury Patients.

    PubMed

    Cordingley, Dean M; Girardin, Richard; Morissette, Marc P; Reimer, Karen; Leiter, Jeff; Russell, Kelly; Ellis, Michael J

    2017-11-01

    To examine the safety and tolerability of clinical graded aerobic treadmill testing in recovering adolescent moderate and severe traumatic brain injury (TBI) patients referred to a multidisciplinary pediatric concussion program. We completed a retrospective case series of two moderate and five severe TBI patients (mean age, 17.3 years) who underwent initial Buffalo Concussion Treadmill Testing at a mean time of 71.6 days (range, 55-87) postinjury. Six patients completed one graded aerobic treadmill test each and one patient underwent initial and repeat testing. There were no complications. Five initial treadmill tests were completely tolerated and allowed an accurate assessment of exercise tolerance. Two initial tests were terminated early by the treatment team because of neurological and cardiorespiratory limitations. As a result of testing, two patients were cleared for aerobic exercise as tolerated and four patients were treated with individually tailored submaximal aerobic exercise programs resulting in subjective improvement in residual symptoms and/or exercise tolerance. Repeat treadmill testing in one patient performed after 1 month of treatment with submaximal aerobic exercise prescription was suggestive of improved exercise tolerance. One patient was able to tolerate aerobic exercise following surgery for posterior glottic stenosis. Preliminary results suggest that graded aerobic treadmill testing is a safe, well tolerated, and clinically useful tool to assess exercise tolerance in appropriately selected adolescent patients with TBI. Future prospective studies are needed to evaluate the effect of tailored submaximal aerobic exercise prescription on exercise tolerance and patient outcomes in recovering adolescent moderate and severe TBI patients.

  11. Severe hypokinesis caused by paraneoplastic anti-Ma2 encephalitis associated with bilateral intratubular germ-cell neoplasm of the testes.

    PubMed

    Matsumoto, Lumine; Yamamoto, Tomotaka; Higashihara, Mana; Sugimoto, Izumi; Kowa, Hisatomo; Shibahara, Junji; Nakamura, Koichiro; Shimizu, Jun; Ugawa, Yoshikazu; Goto, Jun; Dalmau, Josep; Tsuji, Shoji

    2007-04-15

    We report a 40-year-old man with severe hypokinesis as paraneoplastic manifestation of a microscopic "carcinoma in situ" of the testis. The young age of the patient, along with progressive neurologic deterioration, detection of anti-Ma2 antibodies, and ultrasound findings of bilateral microcalcifications, led to bilateral orchiectomy, revealing the tumor in both testes. After orchiectomy, neurological symptoms stabilized, but the patient eventually died of systemic complications caused by his severe neurological deficits. Anti-Ma2 paraneoplastic encephalitis should be considered in patients with severe hypokinesis, and intensive investigation and aggressive approach to treatment is encouraged to prevent progression of the neurological deficits.

  12. Severe Hypokinesis Caused by Paraneoplastic Anti-Ma2 Encephalitis Associated with Bilateral Intratubular Germ-Cell Neoplasm of the Testes

    PubMed Central

    Matsumoto, Lumine; Yamamoto, Tomotaka; Higashihara, Mana; Sugimoto, Izumi; Kowa, Hisatomo; Shibahara, Junji; Nakamura, Koichiro; Shimizu, Jun; Ugawa, Yoshikazu; Goto, Jun; Dalmau, Josep; Tsuji, Shoji

    2007-01-01

    We report a 40-year-old man with severe hypokinesis as paraneoplastic manifestation of a microscopic “carcinoma in situ” of the testis. The young age of the patient, along with progressive neurologic deterioration, detection of anti-Ma2 antibodies, and ultrasound findings of bilateral microcalcifications, led to bilateral orchiectomy, revealing the tumor in both testes. After orchiectomy, neurological symptoms stabilized, but the patient eventually died of systemic complications caused by his severe neurological deficits. Anti-Ma2 paraneoplastic encephalitis should be considered in patients with severe hypokinesis, and intensive investigation and aggressive approach to treatment is encouraged to prevent progression of the neurological deficits. PMID:17269131

  13. Severity and presence of atherosclerosis signs within the segments of internal carotid artery: CBCT's contribution.

    PubMed

    Damaskos, Spyros; da Silveira, Heraldo L D; Berkhout, Erwin W R

    2016-07-01

    This study aims to assess with cone-beam computed tomography the distribution and interrelation of the presence of calcifications along the course of the internal carotid artery and to associate their severity with their allocation within the segments of internal carotid artery, gender, and age. Using a documented visual scale, 161 cone-beam computed tomography scans were evaluated on the allocation and severity of intracranial calcifications within the segments of the internal carotid artery. Calcifications were detected along the petrous (C2: 11.8%), lacerum (C3: 23.6%), cavernous (C4: 92.5%), and ophthalmic-clinoid (C5/C6: 65.8%) segments. The Friedman test showed significant differences in severity distribution among these segments; the highest degree was found in the C4 segment (P < .05). The Wilcoxon signed-rank test showed no significant differences between calcifications on the right or left side or between severities within the C1 (extracranial) and C5/C6 segments. The Chi-square test showed that the severity and allocation of calcifications are not influenced by gender; it also showed that their severity increases with age (P < .05). In the cohort studied, the incidence of calcifications increased throughout the C1, C5/C6, and C4 segments. More severe calcifications were found at the C4, C1, and C5/C6 segments in decreasing order but increased with age, regardless of gender. Copyright © 2016 Elsevier Inc. All rights reserved.

  14. Performance of subjects with and without severe mental illness on a clinical test of problem solving.

    PubMed

    Marshall, R C; McGurk, S R; Karow, C M; Kairy, T J; Flashman, L A

    2006-06-01

    Severe mental illness is associated with impairments in executive functions, such as conceptual reasoning, planning, and strategic thinking, all of which impact problem solving. The present study examined the utility of a novel assessment tool for problem solving, the Rapid Assessment of Problem Solving Test (RAPS), in persons with severe mental illness. Subjects were 47 outpatients with severe mental illness and an equal number of healthy controls matched for age and gender. Results confirmed all hypotheses with respect to how subjects with severe mental illness would perform on the RAPS. Specifically, the severely mentally ill subjects (1) solved fewer problems on the RAPS, (2) when they did solve problems on the test, they did so far less efficiently than their healthy counterparts, and (3) the two groups differed markedly in the types of questions asked on the RAPS. The healthy control subjects tended to take a systematic, organized, but not always optimal approach to solving problems on the RAPS. The subjects with severe mental illness used some of the problem solving strategies of the healthy controls, but their performance was less consistent and tended to deteriorate when the complexity of the problem solving task increased. This was reflected by a high degree of guessing in lieu of asking constraint questions, particularly if a category-limited question was insufficient to continue the problem solving effort.

  15. Chinese adaptation and validation of the patellofemoral pain severity scale.

    PubMed

    Cheung, Roy T H; Ngai, Shirley P C; Lam, Priscillia L; Chiu, Joseph K W; Fung, Eric Y H

    2013-05-01

    This study validated the Patellofemoral Pain Severity Scale translated into Chinese. The Chinese Patellofemoral Pain Severity Scale was translated from the original English version following standard forward and backward translation procedures recommended by the International Society for Pharmacoeconomics and Outcomes Research. The survey was then conducted in clinical settings by a questionnaire comprising the Chinese Patellofemoral Pain Severity Scale, Kujala Scale and Western Ontario and McMaster Universities (WOMAC) Osteoarthritis Index. Eighty-four Chinese reading patients with patellofemoral pain were recruited from physical therapy clinics. Internal consistency of the translated instrument was measured by Cronbach alpha. Convergent validity was examined by Spearman rank correlation coefficient (rho) tests by comparing its score with the validated Chinese version of the Kujala Scale and the WOMAC Osteoarthritis Index while the test-retest reliability was evaluated by administering the questionnaires twice. Cronbach alpha values of individual questions and their overall value were above 0.85. Strong association was found between the Chinese Patellofemoral Pain Severity Scale and the Kujala Scale (rho = -0.72, p < 0.001). Moderate correlation was also found between Chinese Patellofemoral Pain Severity Scale with the WOMAC Osteoarthritis Index (rho = 0.63, p < 0.001). Excellent test-retest reliability (Intraclass correlation coefficient = 0.98) was demonstrated. The Chinese translated version of the Patellofemoral Pain Severity Scale is a reliable and valid instrument for patients with patellofemoral pain.
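
    As a small illustration of the internal-consistency and convergent-validity statistics reported above, the snippet below computes Cronbach's alpha and a Spearman correlation on simulated item scores; all values are synthetic and the helper function is only one common way to compute alpha.

```python
# Minimal sketch of the internal-consistency and convergent-validity statistics
# used in this kind of validation; the item scores below are simulated.
import numpy as np
from scipy import stats

def cronbach_alpha(items):
    """items: 2-D array, rows = respondents, columns = questionnaire items."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

rng = np.random.default_rng(2)
base = rng.normal(size=(84, 1))                          # latent severity of 84 respondents
scale_items = base + 0.4 * rng.normal(size=(84, 10))     # 10 correlated scale items
kujala_like = -base[:, 0] + 0.5 * rng.normal(size=84)    # comparison scale (inverse direction)

print("alpha:", round(cronbach_alpha(scale_items), 2))
rho, p = stats.spearmanr(scale_items.sum(axis=1), kujala_like)
print(f"Spearman rho = {rho:.2f}, p = {p:.4g}")
```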

  16. Pathogenicity Tests on Nine Mosquito Species and Several Non-target Organisms with Strelkovimermis spiculatus (Nemata Mermithidae).

    PubMed

    Becnel, J J; Johnson, M A

    1998-12-01

    Nine species of mosquitoes and several species of non-target aquatic organisms were tested for susceptibility to the mermithid nematode, Strelkovimermis spiculatus. All species of Anopheles, Aedes, Culex, and Toxorhynchites exposed to S. spiculatus were susceptible. Of the nine mosquito species tested, C. pipiens quinquefasciatus had the greatest tolerance to initial invasion and the highest percent infection of those that survived. High levels of infection were also achieved with Aedes taeniorhynchus and A. albopictus, but these mosquitoes were significantly less tolerant to parasitism than C. pipiens quinquefasciatus. Strelkovimermis spiculatus did not infect or develop in any of the non-target hosts tested.

  17. [Automated analyzer of enzyme immunoassay].

    PubMed

    Osawa, S

    1995-09-01

    Automated analyzers for enzyme immunoassay can be classified from several points of view: the kind of labeled antibodies or enzymes, the detection methods, the number of tests per unit time, and the analytical time and speed per run. In practice, it is important to consider several points such as detection limits, the number of tests per unit time, analytical range, and precision. Most of the automated analyzers on the market can randomly access and measure samples. I will describe recent advances in automated analyzers, reviewing their labeled antibodies and enzymes, detection methods, number of tests per unit time, and analytical time and speed per test.

  18. Validation of Scratching Severity as an Objective Assessment for Itch.

    PubMed

    Udkoff, Jeremy; Silverberg, Jonathan I

    2018-05-01

    There are currently no simple, standardized, objective assessments of itch for clinical trials and practice. We sought to validate and test the severity of scratching as an objective measure of itch (4-point ordinal scale ranging from 0 [not present] to 3 [very prominent] based on the observation of scratching lesions). We performed a prospective outpatient study using questionnaires and evaluations by a dermatologist in adults with atopic dermatitis (n = 261). Severity of scratching best correlated with patient-reported global atopic dermatitis severity (Kendall τ = 0.336, P < 0.0001), numeric rating scale of itch in the past 24 hours (τ = 0.266, P = 0.0010) and 3 days (τ = 0.296, P < 0.0001). Severity of scratching showed responsiveness over time. Patients experiencing improvement of scratching severity of 1 point or greater had significantly lower itch based on numeric rating scale in the past 3 days (Wilcoxon rank sum test, P = 0.0175), 5-D itch scale (P = 0.0146), and Patient-Oriented Eczema Measure scores (P = 0.0146). There was a significant decrease in scratching severity for patients experiencing itch improvement of 4 points or greater in the past 3 days on the numeric rating scale (Fisher exact test, P = 0.0026), Patient-Oriented Eczema Measure (P < 0.0001), and Dermatology Life Quality Index (P = 0.0285). Severity of scratching may be a useful endpoint in clinical trials and practice across the gamut of pruritic disorders. Future studies are needed to validate severity of scratching in other pruritic disease. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  19. The epidemiology of asthma and its comorbidities in Poland--Health problems of patients with severe asthma as evidenced in the Province of Lodz.

    PubMed

    Panek, Michał; Mokros, Łukasz; Pietras, Tadeusz; Kuna, Piotr

    2016-03-01

    Population studies supply interesting data regarding the epidemiology, comorbidity and risk factors of asthma, which have direct clinical implications for patients. The aim of the work was to evaluate the degree of severity of asthma in the studied group, the levels of anti-asthma treatment, the prevalence of asthma comorbidities and their influence on the clinical course of the illness. The study encompassed 451 participants: 52.11% were asthma patients (study group) and 47.89% were healthy subjects (controls). Respiratory function tests, ACT™ test and skin prick tests were performed. Asthma severity was mild in 14.89%, moderate in 49.36% and severe in 35.74%. Oral GCS were used by 29%, inhalers 44%, LABA 68%, SABA 67%, LAMA 6%, SAMA 14% and MX 16%. Rhinitis and allergy were significantly more common in patients. GERD and neurological diseases were risk factors for asthma, and GERD significantly intensified the risk of severe asthma. GERD, atherosclerosis, hypertension, ischaemic heart disease and other cardiac diseases, lipid disorders, COPD, and the presence of any neoplastic disease significantly worsened the degree of asthma control. Severe asthma was a significant clinical issue in over 35% of cases. The most commonly-used group of drugs were LABAs, while inhaled GCS and LAMA were uncommon, especially among severe cases. A significant problem was the high percentage of systemic GCS used by severe cases. The most important risk factor for asthma, including its severe form, is GERD. Numerous comorbid conditions significantly worsen the degree of asthma control. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Cardiopulmonary exercise testing early after stroke using feedback-controlled robotics-assisted treadmill exercise: test-retest reliability and repeatability.

    PubMed

    Stoller, Oliver; de Bruin, Eling D; Schindelholz, Matthias; Schuster-Amft, Corina; de Bie, Rob A; Hunt, Kenneth J

    2014-10-11

    Exercise capacity is seriously reduced after stroke. While cardiopulmonary assessment and intervention strategies have been validated for the mildly and moderately impaired populations post-stroke, there is a lack of effective concepts for stroke survivors suffering from severe motor limitations. This study investigated the test-retest reliability and repeatability of cardiopulmonary exercise testing (CPET) using feedback-controlled robotics-assisted treadmill exercise (FC-RATE) in severely motor impaired individuals early after stroke. Twenty subjects (age 44-84 years, <6 months post-stroke) with severe motor limitations (Functional Ambulatory Classification 0-2) were selected for consecutive constant load testing (CLT) and incremental exercise testing (IET) within a powered exoskeleton, synchronised with a treadmill and a body weight support system. A manual human-in-the-loop feedback system was used to guide individual work rate levels. Outcome variables focussed on standard cardiopulmonary performance parameters. Relative and absolute test-retest reliability were assessed by intraclass correlation coefficients (ICC), standard error of the measurement (SEM), and minimal detectable change (MDC). Mean difference, limits of agreement, and coefficient of variation (CoV) were estimated to assess repeatability. Peak performance parameters during IET yielded good to excellent relative reliability: absolute peak oxygen uptake (ICC = 0.82), relative peak oxygen uptake (ICC = 0.72), peak work rate (ICC = 0.91), peak heart rate (ICC = 0.80), absolute gas exchange threshold (ICC = 0.91), relative gas exchange threshold (ICC = 0.88), oxygen cost of work (ICC = 0.87), oxygen pulse at peak oxygen uptake (ICC = 0.92), ventilation rate versus carbon dioxide output slope (ICC = 0.78). For these variables, SEM was 4-13%, MDC 12-36%, and CoV 0.10-0.36. CLT revealed high mean differences and insufficient test-retest reliability for all variables studied. This study presents the first evidence on reliability and repeatability for CPET in severely motor impaired individuals early after stroke using a feedback-controlled robotics-assisted treadmill. The results demonstrate good to excellent test-retest reliability and appropriate repeatability for the most important peak cardiopulmonary performance parameters. These findings have important implications for the design and implementation of cardiovascular exercise interventions in severely impaired populations. Future research needs to develop advanced control strategies to enable the true limit of functional exercise capacity to be reached and to further assess test-retest reliability and repeatability in larger samples.
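
    The reliability quantities reported above are related by standard formulas; the sketch below applies the usual definitions (SEM = SD*sqrt(1-ICC), MDC95 = 1.96*sqrt(2)*SEM, and a within-subject coefficient of variation) to hypothetical test-retest values, with the ICC taken as given rather than re-estimated, and without claiming these are the exact variants used in the paper.

```python
# Sketch of standard test-retest reliability quantities; the exact SEM/MDC/CoV
# variants used in the paper are assumed, and the values below are hypothetical.
import numpy as np

test = np.array([14.2, 18.5, 11.9, 20.1, 16.3, 13.8, 17.4, 15.0])   # e.g. peak VO2
retest = np.array([14.8, 17.9, 12.4, 19.5, 16.9, 13.1, 18.0, 15.6])

icc = 0.82                                     # taken as given (e.g. from an ICC model)
sd_pooled = np.concatenate([test, retest]).std(ddof=1)

sem = sd_pooled * np.sqrt(1 - icc)             # standard error of measurement
mdc95 = 1.96 * np.sqrt(2) * sem                # minimal detectable change (95% level)
diff = test - retest
cov = diff.std(ddof=1) / np.sqrt(2) / np.mean([test.mean(), retest.mean()])  # within-subject CoV

print(f"SEM = {sem:.2f}, MDC95 = {mdc95:.2f}, CoV = {cov:.2f}")
```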

  1. The relevance of basophil allergen sensitivity testing to distinguish between severe and mild peanut-allergic children.

    PubMed

    Homšak, Matjaž; Silar, Mira; Berce, Vojko; Tomazin, Maja; Skerbinjek-Kavalar, Maja; Celesnik, Nina; Košnik, Mitja; Korošec, Peter

    2013-01-01

    Peanut sensitization is common in children. However, it is difficult to assess which children will react mildly and which severely. This study evaluated the relevance of basophil allergen sensitivity testing to distinguish the severity of peanut allergy in children. Twenty-seven peanut-sensitized children with symptoms varying from mild to severe anaphylaxis underwent peanut CD63 dose-response curve analysis with the inclusion of basophil allergen sensitivity calculation (CD-sens) and peanut component immunoglobulin E (IgE) testing. Eleven children who had experienced anaphylaxis to peanuts showed a markedly higher peanut CD63 response at submaximal allergen concentrations and CD-sens (median 1,667 vs. 0.5; p < 0.0001) than 16 children who experienced a milder reaction. Furthermore, a negative or low CD-sens to peanuts unambiguously excluded anaphylactic peanut allergy. Children with anaphylaxis had higher levels of IgE to Ara h 1, 2, 3 and 9, but comparable levels of IgE to Ara h 8 and whole-peanut extract. The diagnostic specificity calculated with a receiver operating characteristic analysis reached 100% for CD-sens and 73% for Ara h 2. We demonstrated that severe peanut allergy is significantly associated with higher basophil allergen sensitivity. This cellular test should facilitate a more accurate diagnosis of peanut allergy. © 2013 S. Karger AG, Basel.

  2. Mechanical waves conceptual survey: Its modification and conversion to a standard multiple-choice test

    NASA Astrophysics Data System (ADS)

    Barniol, Pablo; Zavala, Genaro

    2016-06-01

    In this article we present several modifications of the mechanical waves conceptual survey, the most important test to date that has been designed to evaluate university students' understanding of four main topics in mechanical waves: propagation, superposition, reflection, and standing waves. The most significant changes are (i) modification of several test questions that had some problems in their original design, (ii) standardization of the number of options for each question to five, (iii) conversion of the two-tier questions to multiple-choice questions, and (iv) modification of some questions to make them independent of others. To obtain a final version of the test, we administered both the original and modified versions several times to students at a large private university in Mexico. These students were completing a course that covers the topics tested by the survey. The final modified version of the test was administered to 234 students. In this study we present the modifications for each question, and discuss the reasons behind them. We also analyze the results obtained by the final modified version and offer a comparison between the original and modified versions. In the Supplemental Material we present the final modified version of the test. It can be used by teachers and researchers to assess students' understanding of, and learning about, mechanical waves.

  3. The window of opportunity: decision theory and the timing of prognostic tests for newborn infants.

    PubMed

    Wilkinson, Dominic

    2009-11-01

    In many forms of severe acute brain injury there is an early phase when prognosis is uncertain, followed later by physiological recovery and the possibility of more certain predictions of future impairment. There may be a window of opportunity for withdrawal of life support early, but if decisions are delayed there is the risk that the patient will survive with severe impairment. In this paper I focus on the example of neonatal encephalopathy and the question of the timing of prognostic tests and decisions to continue or to withdraw life-sustaining treatment. Should testing be performed early or later; and how should parents decide what to do given the conflicting values at stake? I apply decision theory to the problem, using sensitivity analysis to assess how different features of the tests or different values would affect a decision to perform early or late prognostic testing. I draw some general conclusions from this model for decisions about the timing of testing in neonatal encephalopathy. Finally I consider possible solutions to the problem posed by the window of opportunity. Decision theory highlights the costs of uncertainty. This may prompt further research into improving prognostic tests. But it may also prompt us to reconsider our current attitudes towards the palliative care of newborn infants predicted to be severely impaired.
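
    To make the decision-theoretic framing concrete, here is a toy expected-utility comparison in Python of acting on an early, less accurate prognostic test versus a later, more accurate one, together with a one-parameter sensitivity sweep. This is only a generic sketch of the approach, not the author's model; every probability and utility below is a hypothetical placeholder.

```python
# Illustrative only: compare the expected utility of deciding on an early,
# less accurate prognostic test versus waiting for a later, more accurate one.
# All probabilities and utilities are hypothetical, not values from the paper.

def expected_utility(p_severe, sensitivity, specificity,
                     u_withdraw, u_survive_severe, u_survive_good):
    """Expected utility when life support is withdrawn iff the test predicts
    severe impairment."""
    p_tp = p_severe * sensitivity              # test positive, truly severe
    p_fp = (1 - p_severe) * (1 - specificity)  # test positive, would have done well
    p_fn = p_severe * (1 - sensitivity)        # test negative, survives impaired
    p_tn = (1 - p_severe) * specificity        # test negative, good outcome
    return ((p_tp + p_fp) * u_withdraw
            + p_fn * u_survive_severe
            + p_tn * u_survive_good)

early = expected_utility(p_severe=0.5, sensitivity=0.80, specificity=0.85,
                         u_withdraw=0.3, u_survive_severe=0.1, u_survive_good=1.0)
late = expected_utility(p_severe=0.5, sensitivity=0.95, specificity=0.98,
                        u_withdraw=0.3, u_survive_severe=0.1, u_survive_good=1.0)
print(f"EU(early test) = {early:.3f}, EU(late test) = {late:.3f}")

# Simple sensitivity analysis: vary the utility assigned to survival with
# severe impairment and see how the comparison shifts.
for u in [0.0, 0.1, 0.2, 0.3, 0.4]:
    e = expected_utility(0.5, 0.80, 0.85, 0.3, u, 1.0)
    l = expected_utility(0.5, 0.95, 0.98, 0.3, u, 1.0)
    print(f"u_severe={u:.1f}: early={e:.3f}, late={l:.3f}")
```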

  4. Developmental Pathways to Conduct Problems: A Further Test of the Childhood and Adolescent-Onset Distinction

    ERIC Educational Resources Information Center

    Dandreaux, Danielle M.; Frick, Paul J.

    2009-01-01

    This study tested several theoretically important differences between youth with a childhood onset and youth with an adolescent onset of their severe conduct problems. Seventy-eight pre-adjudicated adolescent boys (ranging in age from 11 to 18) housed in two short-term detention facilities and one outpatient program for youth at risk for…

  5. Do Bioflavonoids in Juniperus virginiana Heartwood Stimulate Oviposition in the Ladybird Coleomegilla maculata?

    PubMed Central

    Riddick, Eric W; Wu, Zhixin; Eller, Fred J; Berhow, Mark A

    2018-01-01

    Maximizing the reproductive potential of ladybird beetles fed factitious foods or artificial diets, in lieu of natural prey, is a major challenge to cost-effective mass rearing for augmentative biological control. In this study, we tested the hypothesis that compounds in redcedar, Juniperus virginiana, stimulate oviposition in the ladybird Coleomegilla maculata. We also tested the prediction that several bioflavonoids, identified in heartwood fractions, elicited this behavioral response. Phenolic compounds were extracted from J. virginiana heartwood sawdust, separated into several fractions, then presented to adult beetles, in a powdered, pure form, in the laboratory. Females preferentially oviposited within 1 to 2 cm of fractions B, C, D, and E, but not A or the unfractionated extract, at the base of test cages. Chemical analysis identified bioflavonoids in heartwood fractions and subsequent bioassays using several identified in fractions C, D, and E confirmed that quercetin, taxifolin, and naringenin (to a lesser extent) stimulated oviposition. All tested fractions and bioflavonoids readily adhered to the chorion of freshly laid eggs but did not reduce egg hatch. This study demonstrates that several bioflavonoids stimulate oviposition by C. maculata and could be useful for mass rearing programs. PMID:29531477

  6. Hydrops fetalis

    MedlinePlus

    ... in skin color (pallor). More severe forms may cause: breathing problems; bruising or purplish bruise-like spots on the skin; heart failure; severe anemia; severe jaundice; and total body swelling. Exams and Tests: an ultrasound done during pregnancy may show high levels of amniotic fluid, abnormally ...

  7. Noise-immune multisensor transduction of speech

    NASA Astrophysics Data System (ADS)

    Viswanathan, Vishu R.; Henry, Claudia M.; Derr, Alan G.; Roucos, Salim; Schwartz, Richard M.

    1986-08-01

    Two types of multiple-sensor configurations were developed, tested and evaluated for robust speech recognition performance in high levels of acoustic background noise: one type combines the individual sensor signals to provide a single speech signal input, and the other provides several parallel inputs. For single-input systems, several configurations of multiple sensors were developed and tested. Results from formal speech intelligibility and quality tests in simulated fighter aircraft cockpit noise show that each of the two-sensor configurations tested outperforms the constituent individual sensors in high noise. Also presented are results comparing the performance of two-sensor configurations and individual sensors in speaker-dependent, isolated-word speech recognition tests performed using a commercial recognizer (Verbex 4000) in simulated fighter aircraft cockpit noise.

  8. Preliminary test results of the joint FAA-USAF-NASA runway research program. Part 1: Traction measurements of several runways under wet and dry conditions with a Boeing 727, a diagonal-braked vehicle, and a mu-meter

    NASA Technical Reports Server (NTRS)

    Horne, W. B.; Yager, T. J.; Sleeper, R. K.; Merritt, L. R.

    1977-01-01

    The stopping distance, brake application velocity, and time of brake application were measured for two modern jet transports, along with the NASA diagonal-braked vehicle and the British Mu-Meter on several runways, which when wetted, cover the range of slipperiness likely to be encountered in the United States. Tests were designed to determine if correlation between the aircraft and friction measuring vehicles exists. The test procedure, data reduction techniques, and preliminary test results obtained with the Boeing 727, the Douglas DC-9, and the ground vehicles are given. Time histories of the aircraft test run parameters are included.

  9. Automation of the novel object recognition task for use in adolescent rats

    PubMed Central

    Silvers, Janelle M.; Harrod, Steven B.; Mactutus, Charles F.; Booze, Rosemarie M.

    2010-01-01

    The novel object recognition task is gaining popularity for its ability to test a complex behavior which relies on the integrity of memory and attention systems without placing undue stress upon the animal. While the task places few requirements upon the animal, it traditionally requires the experimenter to observe the test phase directly and record behavior. This approach can severely limit the number of subjects which can be tested in a reasonable period of time, as training and testing occur on the same day and span several hours. The current study was designed to test the feasibility of automation of this task for adolescent rats using standard activity chambers, with the goals of increased objectivity, flexibility, and throughput of subjects. PMID:17719091

  10. Robustness of statistical tests for multiplicative terms in the additive main effects and multiplicative interaction model for cultivar trials.

    PubMed

    Piepho, H P

    1995-03-01

    The additive main effects and multiplicative interaction (AMMI) model is frequently used in the analysis of multilocation trials. In the analysis of such data it is of interest to decide how many of the multiplicative interaction terms are significant. Several tests for this task are available, all of which assume that errors are normally distributed with a common variance. This paper investigates the robustness of several tests (Gollob, FGH1, FGH2, FR) to departures from these assumptions. It is concluded that, because of its better robustness, the FR test is preferable. If the other tests are to be used, preliminary tests for the validity of assumptions should be performed.
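
    For readers unfamiliar with how the multiplicative interaction terms in an AMMI analysis are obtained, the following Python sketch double-centres a hypothetical genotype-by-environment table and extracts the multiplicative terms by singular value decomposition; significance tests of the kind compared in the paper (Gollob, FGH1, FGH2, FR) are then applied to sums of squares like these. The data are simulated placeholders, not trial results.

```python
import numpy as np

# Hypothetical genotype-by-environment mean yields (rows: genotypes, cols: environments)
rng = np.random.default_rng(0)
y = rng.normal(5.0, 1.0, size=(6, 4))

# Remove the additive main effects: grand mean, genotype effects, environment effects
grand = y.mean()
gen_eff = y.mean(axis=1, keepdims=True) - grand
env_eff = y.mean(axis=0, keepdims=True) - grand
interaction = y - grand - gen_eff - env_eff   # doubly centred residual matrix

# The multiplicative interaction terms are the singular components of the residuals
u, s, vt = np.linalg.svd(interaction, full_matrices=False)
ss_total = (interaction ** 2).sum()
for k, sv in enumerate(s, start=1):
    print(f"IPCA{k}: SS = {sv**2:.3f} ({100 * sv**2 / ss_total:.1f}% of interaction SS)")
```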

  11. Is Pregnancy Associated with Severe Dengue? A Review of Data from the Rio de Janeiro Surveillance Information System

    PubMed Central

    Machado, Carolina Romero; Machado, Elizabeth Stankiewicz; Rohloff, Roger Denis; Azevedo, Marina; Campos, Dayse Pereira; de Oliveira, Robson Bruniera; Brasil, Patrícia

    2013-01-01

    Background Dengue is a reportable disease in Brazil; however, pregnancy has been included in the application form of the Brazilian notification information system only since 2006. To estimate the severity of maternal dengue infection, the available data that were compiled from January 2007 to December 2008 by the official surveillance information system of the city of Rio de Janeiro were reviewed. Methods and Principal Findings During the study period, 151,604 cases of suspected dengue infection were reported. Five hundred sixty-one women of reproductive age (15–49 years) presented with dengue infection; 99 (18.1%) pregnant and 447 (81.9%) non-pregnant women were analyzed. Dengue cases were categorized using the 1997 WHO classification system, and DHF/DSS were considered severe disease. The Mann-Whitney test was used to compare maternal age, according to gestational period, and severity of disease. A chi-square test was utilized to evaluate the differences in the proportion of dengue severity between pregnant and non-pregnant women. Univariate analysis was performed to compare outcome variables (severe dengue and non-severe dengue) and explanatory variables (pregnancy, gestational age and trimester) using the Wald test. A multivariate analysis was performed to assess the independence of statistically significant variables in the univariate analysis. A p-value <0.05 was considered statistically significant. A higher percentage of severe dengue infection among pregnant women was found (p = 0.0001). Final analysis demonstrated that pregnant women are 3.4 times more prone to developing severe dengue (OR: 3.38; CI: 2.10–5.42). Mortality among pregnant women was higher than among non-pregnant women. Conclusion Pregnant women have an increased risk of developing severe dengue infection and dying of dengue. PMID:23675548
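
    As a reminder of how an unadjusted odds ratio and its Wald confidence interval are obtained from a 2×2 table before multivariate adjustment, here is a minimal Python sketch; the counts are hypothetical and are not the study's cross-tabulation.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
         a = exposed cases,     b = exposed non-cases
         c = non-exposed cases, d = non-exposed non-cases
    """
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts (severe dengue by pregnancy status), for illustration only
or_, lo, hi = odds_ratio_ci(a=30, b=69, c=60, d=387)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```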

  12. Consideration of "g" as a Common Antecedent for Cognitive Ability Test Performance, Test Motivation, and Perceived Fairness

    ERIC Educational Resources Information Center

    Reeve, Charlie L.; Lam, Holly

    2007-01-01

    Several different analyses were used to test the hypothesis that test-taking motivation, perceived test fairness, and actual test performance are correlated only because they share a common antecedent. First, hierarchical regressions reveal that initial test performance has a unique influence on non-ability factors even after controlling for…

  13. Comparison of test protocols for standard room/corner tests

    Treesearch

    R. H. White; M. A. Dietenberger; H. Tran; O. Grexa; L. Richardson; K. Sumathipala; M. Janssens

    1998-01-01

    As part of international efforts to evaluate alternative reaction-to-fire tests, several series of room/corner tests have been conducted. This paper reviews the overall results of related projects in which different test protocols for standard room/corner tests were used. Differences in the test protocols involved two options for the ignition burner scenario and whether...

  14. Item Pocket Method to Allow Response Review and Change in Computerized Adaptive Testing

    ERIC Educational Resources Information Center

    Han, Kyung T.

    2013-01-01

    Most computerized adaptive testing (CAT) programs do not allow test takers to review and change their responses because it could seriously deteriorate the efficiency of measurement and make tests vulnerable to manipulative test-taking strategies. Several modified testing methods have been developed that provide restricted review options while…

  15. Teaching beyond the Test: A Method for Designing Test-Preparation Classes

    ERIC Educational Resources Information Center

    Derrick, Deirdre

    2013-01-01

    Test-preparation classes that focus on skills will benefit students beyond the test by developing skills they can use at university. This article discusses the purposes of various tests and outlines how to design effective test-prep classes. Several practical activities are included, and an appendix provides information on common standardized…

  16. Comparing Direct versus Indirect Measures of the Pedagogical Effectiveness of Team Testing

    ERIC Educational Resources Information Center

    Bacon, Donald R.

    2011-01-01

    Direct measures (tests) of the pedagogical effectiveness of team testing and indirect measures (student surveys) of pedagogical effectiveness of team testing were collected in several sections of an undergraduate marketing course with varying levels of the use of team testing. The results indicate that although students perceived team testing to…

  17. Dyssynergic defecation may play an important role in postoperative Hirschsprung's disease patients with severe persistent constipation: analysis of a case series.

    PubMed

    Meinds, Rob J; Eggink, Maura C; Heineman, Erik; Broens, Paul M A

    2014-10-01

    After surgery for Hirschsprung's disease (HD) the majority of patients have satisfactory clinical outcomes. Nevertheless, a substantial number of patients remain who suffer from severe persistent constipation. Current consensus attributes these complaints to the hallmarks of HD. In non-HD patients a cause for severe constipation is dyssynergic defecation. Retrospectively, we reviewed the medical records of ten postoperative HD patients with severe persistent constipation who had undergone extensive anorectal function tests to diagnose the reason for the constipation. We analyzed the results of these tests. During the last three years, ten postoperative HD patients with severe persistent constipation were given extensive anorectal function tests. All ten patients were diagnosed with dyssynergic defecation. The ages at the time of diagnosis ranged from 7 to 19 years with a median age of 12 years. Signs of an enlarged rectum were seen in all ten patients, with a maximum measured value of 845 mL. Patients with HD may also suffer from dyssynergic defecation. It is important to consider this possibility when dealing with severe persistent constipation in postoperative HD patients. Viable options for treating dyssynergic defecation are available that could prevent irreversible long-term complications. Copyright © 2014 Elsevier Inc. All rights reserved.

  18. Anxiety in visual field testing.

    PubMed

    Chew, Shenton S L; Kerr, Nathan M; Wong, Aaron B C; Craig, Jennifer P; Chou, Chi-Ying; Danesh-Meyer, Helen V

    2016-08-01

    To determine if Humphrey visual field (HVF) testing induces anxiety and how anxiety relates to visual field parameters of reliability and severity. A prospective cohort study at a university affiliated private ophthalmic practice. 137 consecutive age-matched and gender-matched patients with glaucoma undergoing either HVF testing only (n=102) or Heidelberg retinal tomography (HRT) only (n=35) were enrolled. Prior to testing, participants completed the State-Trait Anxiety Inventory questionnaire. A 5-point Likert scale was used to grade pretest anxiety and was repeated after testing to grade intratest anxiety. Subjective discomfort parameters were also recorded. Anxiety scores were used to make non-parametrical comparisons and correlations between cohorts and also against visual field reliability and severity indices. Trait anxiety (p=0.838) and pretest anxiety (p=0.802) were not significantly different between test groups. Within the HVF group, intratest anxiety was 1.2 times higher than pretest anxiety (p=0.0001), but was not significantly different in the HRT group (p=0.145). Pretest anxiety was correlated with test unreliability (Spearman's r=0.273, p=0.006), which was predictive of worse test severity (p=0.0027). Subjects who had undergone more than 10 visual field tests had significantly lower pretest and intratest anxiety levels than those who had not (p=0.0030 and p=0.0004, respectively). HVF testing induces more anxiety than HRT. Increased pretest anxiety may reduce HVF test reliability. Increased test experience or interventions aimed at reducing pretest anxiety may result in improved test reliability and accuracy. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  19. Can a toxin gene NAAT be used to predict toxin EIA and the severity of Clostridium difficile infection?

    PubMed

    Garvey, Mark I; Bradley, Craig W; Wilkinson, Martyn A C; Holden, Elisabeth

    2017-01-01

    Diagnosis of C. difficile infection (CDI) is controversial because of the many laboratory methods available and their lack of ability to distinguish between carriage, mild or severe disease. Here we describe whether a low C. difficile toxin B nucleic acid amplification test (NAAT) cycle threshold (CT) can predict toxin EIA, CDI severity and mortality. A three-stage algorithm was employed for CDI testing, comprising a screening test for glutamate dehydrogenase (GDH), followed by a NAAT, then a toxin enzyme immunoassay (EIA). All diarrhoeal samples positive for GDH and NAAT between 2012 and 2016 were analysed. The performance of the NAAT CT value as a classifier of toxin EIA outcome was analysed using a ROC curve; patient mortality was compared to CTs and toxin EIA via linear regression models. A CT value ≤26 was associated with ≥72% toxin EIA positivity; applying a logistic regression model we demonstrated an association between low CT values and toxin EIA positivity. A CT value of ≤26 was significantly associated (p = 0.0262) with increased one-month mortality, severe cases of CDI or failure of first-line treatment. The ROC curve probabilities demonstrated a CT cut-off value of 26.6. Here we demonstrate that a CT ≤26 indicates more severe CDI and is associated with higher mortality. Samples with a low CT value are often toxin EIA positive, questioning the need for this additional EIA test. A CT ≤26 could be used to assess the potential for severity of CDI and guide patient treatment.
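
    A cut-off such as the CT value of 26.6 reported here is typically read off a ROC curve, for example by maximising Youden's J. The Python sketch below illustrates that generic procedure on simulated data; it is not the authors' analysis, and the simulated CT distributions are arbitrary placeholders.

```python
import numpy as np

# Hypothetical NAAT cycle-threshold values and toxin EIA outcomes
# (1 = toxin EIA positive). Lower CT means more target DNA in the sample.
rng = np.random.default_rng(1)
ct_pos = rng.normal(23, 3, 120)   # hypothetical CTs of EIA-positive samples
ct_neg = rng.normal(30, 4, 180)   # hypothetical CTs of EIA-negative samples
ct = np.concatenate([ct_pos, ct_neg])
eia = np.concatenate([np.ones_like(ct_pos), np.zeros_like(ct_neg)])

best_cut, best_j = None, -1.0
for cut in np.unique(np.round(ct, 1)):
    pred_pos = ct <= cut                      # a "low CT" calls the sample toxin positive
    sens = (pred_pos & (eia == 1)).sum() / (eia == 1).sum()
    spec = (~pred_pos & (eia == 0)).sum() / (eia == 0).sum()
    j = sens + spec - 1                       # Youden's J statistic
    if j > best_j:
        best_cut, best_j = cut, j
print(f"CT cut-off maximising Youden's J: {best_cut:.1f} (J = {best_j:.2f})")
```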

  20. Effectiveness of a respiratory rehabilitation programme in patients with chronic obstructive pulmonary disease.

    PubMed

    Prunera-Pardell, María Jesús; Padín-López, Susana; Domenech-Del Rio, Adolfo; Godoy-Ramírez, Ana

    To evaluate the effectiveness of the multidisciplinary respiratory rehabilitation (RR) programme in patients with severe or very severe chronic obstructive pulmonary disease before the RR programme, at the end of the programme and one year after the RR, measuring changes in ability to exercise (walking test), effort tolerance (forced expiratory volume, FEV1) and health-related quality of life. Quasi-experimental single group design. We included patients diagnosed with severe or very severe chronic obstructive pulmonary disease (stages III and IV of the GOLD classification) who entered the rehabilitation programme in 2011 and 2012. Demographic data, questionnaires on general health-related quality of life (SF-36) and specific to respiratory patients (St George's Respiratory Questionnaire), FEV1% and an exercise capacity test (6-minute walking test) were collected. Data were collected before the RR programme, at the end of the RR programme and a year after completing the programme. No significant differences in FEV1% values were observed. Regarding exercise capacity, an increase in distance walked in the walking test was noted, which changed significantly after training, from 377±59.7 to 415±79 m after one year (P<.01). A statistically significant improvement in mean scores of HRQoL was observed, except for the emotional role dimension of the SF-36 questionnaire. A pulmonary rehabilitation programme for 8 weeks improved the exercise capacity, dyspnoea and quality of life of patients with severe and very severe chronic obstructive pulmonary disease. Copyright © 2017 Elsevier España, S.L.U. All rights reserved.

  1. Hot Cell Installation and Demonstration of the Severe Accident Test Station

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Linton, Kory D.; Burns, Zachary M.; Terrani, Kurt A.

    A Severe Accident Test Station (SATS) capable of examining the oxidation kinetics and accident response of irradiated fuel and cladding materials for design basis accident (DBA) and beyond design basis accident (BDBA) scenarios has been successfully installed and demonstrated in the Irradiated Fuels Examination Laboratory (IFEL), a hot cell facility at Oak Ridge National Laboratory. The two test station modules provide various temperature profiles, steam, and the thermal shock conditions necessary for integral loss of coolant accident (LOCA) testing, defueled oxidation quench testing and high temperature BDBA testing. The installation of the SATS system restores the domestic capability to examine postulated and extended LOCA conditions on spent fuel and cladding and provides a platform for evaluation of advanced fuel and accident tolerant fuel (ATF) cladding concepts. This document reports on the successful in-cell demonstration testing of unirradiated Zircaloy-4. It also contains descriptions of the integral test facility capabilities, installation activities, and out-of-cell benchmark testing to calibrate and optimize the system.

  2. Testing Proficiency in Interpersonal Communication

    ERIC Educational Resources Information Center

    Byers, Burton H.

    1973-01-01

    Discusses several hypotheses about the measurement of speech-communication proficiency which are being tested at the University of Hawaii and a testing instrument entitled "Dy Comm" (dyadic communication) which emerged from this research. (DD)

  3. Scaled Rocket Testing in Hypersonic Flow

    NASA Technical Reports Server (NTRS)

    Dufrene, Aaron; MacLean, Matthew; Carr, Zakary; Parker, Ron; Holden, Michael; Mehta, Manish

    2015-01-01

    NASA's Space Launch System (SLS) uses four clustered liquid rocket engines along with two solid rocket boosters. The interaction between all six rocket exhaust plumes will produce a complex and severe thermal environment in the base of the vehicle. This work focuses on a recent 2% scale, hot-fire SLS base heating test. These base heating tests are short-duration tests executed with chamber pressures near the full-scale values with gaseous hydrogen/oxygen engines and RSRMV analogous solid propellant motors. The LENS II shock tunnel/Ludwieg tube tunnel was used at or near flight duplicated conditions up to Mach 5. Model development was strongly based on the Space Shuttle base heating tests with several improvements including doubling of the maximum chamber pressures and duplication of freestream conditions. Detailed base heating results are outside of the scope of the current work, rather test methodology and techniques are presented along with broader applicability toward scaled rocket testing in supersonic and hypersonic flow.

  4. Use of the disease severity index for null hypothesis testing

    USDA-ARS?s Scientific Manuscript database

    A disease severity index (DSI) is a single number for summarizing a large amount of disease severity information. It is used to indicate relative resistance of cultivars, to relate disease severity to yield loss, or to compare treatments. The DSI has most often been based on a special type of ordina...
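
    The manuscript abstract is truncated, so the exact formula is not shown; one commonly cited formulation of a DSI based on an ordinal rating scale is sketched below in Python, with hypothetical ratings rather than data from this database entry.

```python
def disease_severity_index(ratings, max_rating):
    """DSI (%) from ordinal severity ratings, using a commonly cited formulation:
       DSI = sum(ratings) / (number of plants * maximum possible rating) * 100
    """
    return 100.0 * sum(ratings) / (len(ratings) * max_rating)

# Hypothetical ratings of 10 plants on a 0-5 ordinal scale
ratings = [0, 1, 1, 2, 3, 3, 4, 5, 2, 1]
print(f"DSI = {disease_severity_index(ratings, max_rating=5):.1f}%")
```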

  5. Energy efficient engine

    NASA Technical Reports Server (NTRS)

    Burrus, D.; Sabla, P. E.; Bahr, D. W.

    1980-01-01

    The feasibility of meeting or closely approaching the emissions goals established for the Energy Efficient Engine (E3) Project with an advanced design, single annular combustor was determined. A total of nine sector combustor configurations and one full-annular-combustor configuration were evaluated. Acceptable levels of carbon monoxide and hydrocarbon emissions were obtained with several of the sector combustor configurations tested, and several of the configurations tested demonstrated reduced levels of nitrogen oxides compared to conventional, single annular designs. None of the configurations tested demonstrated nitrogen oxide emission levels that meet the goal of the E3 Project.

  6. Use of a Dual-Antigen Rapid Diagnostic Test to Screen Children for Severe Plasmodium falciparum Malaria in a High-Transmission, Resource-Limited Setting.

    PubMed

    Boyce, Ross; Reyes, Raquel; Matte, Michael; Ntaro, Moses; Mulogo, Edgar; Siedner, Mark J

    2017-10-16

    In rural areas, many patients with malaria seek care at peripheral health facilities or community case management programs. While this strategy is effective for the management of uncomplicated malaria, severe malaria necessitates prompt detection and referral to facilities with adequate resources. In this prospective, observational cohort study, we assessed the accuracy of a dual-band (histidine-rich protein-2/pan-lactate dehydrogenase [HRP2/pLDH]) rapid diagnostic test (RDT) to differentiate uncomplicated from severe malaria. We included children aged <12 years who presented to a rural clinic in western Uganda with a positive HRP2 or HRP2/pLDH RDT. We estimated the test characteristics of a dual-antigen (HRP2+/pLDH+) band positive RDT compared to World Health Organization-defined clinical and laboratory criteria to detect severe malaria. A total of 2678 children underwent testing for malaria with an RDT, and 83 (9.0%) satisfied criteria for severe malaria. The sensitivity and specificity of a HRP2+/pLDH+ result for severe malaria was 97.6% (95% confidence interval [CI], 90.8%-99.6%) and 75.6% (95% CI, 73.8%-77.4%), respectively. An HRP2+/pLDH+ result was significantly more sensitive (97.6% vs 68.7%, P < .001) for the detection of severe malaria compared to algorithms that incorporate screening for danger signs. A positive dual-antigen (HRP2/pLDH) RDT has higher sensitivity than the use of clinical manifestations to detect severe malaria, making it a promising tool in the triage of children with malaria in low-resource settings. Additional work is needed to operationalize diagnostic and treatment algorithms that include dual-antigen RDTs to avoid over referral. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail: journals.permissions@oup.com.
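
    Test characteristics such as the 97.6% sensitivity and 75.6% specificity reported here come from a 2×2 table against the reference standard. The Python sketch below shows the generic calculation with Wilson score intervals; the counts are reconstructed approximately from the reported percentages and totals, and the paper's exact interval method may differ.

```python
import math

def wilson_ci(k, n, z=1.96):
    """Wilson score 95% confidence interval for a binomial proportion k/n."""
    p = k / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

# Approximate, reconstructed 2x2 counts against the WHO severe-malaria standard
tp, fn = 81, 2        # severe cases: dual-band RDT positive / negative
tn, fp = 1962, 633    # non-severe cases: dual-band RDT negative / positive

sens, sens_ci = tp / (tp + fn), wilson_ci(tp, tp + fn)
spec, spec_ci = tn / (tn + fp), wilson_ci(tn, tn + fp)
print(f"Sensitivity {100*sens:.1f}% (95% CI {100*sens_ci[0]:.1f}-{100*sens_ci[1]:.1f}%)")
print(f"Specificity {100*spec:.1f}% (95% CI {100*spec_ci[0]:.1f}-{100*spec_ci[1]:.1f}%)")
```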

  7. Characterization of DUT impedance in immunity test setups

    NASA Astrophysics Data System (ADS)

    Hassanpour Razavi, Seyyed Ali; Frei, Stephan

    2016-09-01

    Several immunity test procedures for narrowband radiated electromagnetic energy are available for automotive components. The ISO 11452 series describes the most commonly used test methods. The absorber line shielded enclosure (ALSE) is often considered the most reliable method. However, testing with bulk current injection (BCI) requires less effort and is often preferred. As the test setup in both procedures is quite similar, there have been several attempts to find appropriate modifications to the BCI method in order to improve its agreement with the ALSE. However, the lack of knowledge regarding the impedance of the tested component makes it impossible to determine the equivalent current to be injected by the BCI, so a good match cannot be achieved. In this paper, three approaches are proposed to estimate the termination impedance indirectly by using different current probes.

  8. Workplace Drug Testing and Worker Drug Use

    PubMed Central

    Carpenter, Christopher S

    2007-01-01

    Objective To examine the nature and extent of the association between workplace drug testing and worker drug use. Data Sources Repeated cross-sections from the 2000 to 2001 National Household Surveys on Drug Abuse (NHSDA) and the 2002 National Survey on Drug Use and Health (NSDUH). Study Design Multivariate logistic regression models of the likelihood of marijuana use are estimated as a function of several different workplace drug policies, including drug testing. Specific questions about penalty severity and the likelihood of detection are used to further evaluate the nature of the association. Principal Findings Individuals whose employers perform drug tests are significantly less likely to report past month marijuana use, even after controlling for a wide array of worker and job characteristics. However, large negative associations are also found for variables indicating whether a firm has drug education, an employee assistance program, or a simple written policy about substance use. Accounting for these other workplace characteristics reduces—but does not eliminate—the testing differential. Frequent testing and severe penalties reduce the likelihood that workers use marijuana. Conclusions Previous studies have interpreted the large negative correlation between workplace drug testing and employee substance use as representing a causal deterrent effect of drug testing. Our results using more comprehensive data suggest that these estimates have been slightly overstated due to omitted variables bias. The overall pattern of results remains largely consistent with the hypothesis that workplace drug testing deters worker drug use. PMID:17362218

  9. HPV testing in routine cervical screening: cross sectional data from the ARTISTIC trial

    PubMed Central

    Kitchener, H C; Almonte, M; Wheeler, P; Desai, M; Gilham, C; Bailey, A; Sargent, A; Peto, J

    2006-01-01

    To evaluate the effectiveness of human papillomavirus (HPV) testing in primary cervical screening. This was a cross-sectional study from the recruitment phase of a prospective randomised trial. Women were screened for HPV in addition to routine cervical cytology testing. Greater Manchester, attendees at routine NHS Cervical Screening Programme. In all, 24 510 women aged 20–64 screened with liquid-based cytology (LBC) and HPV testing at entry. HPV testing in primary cervical screening. Type-specific HPV prevalence rates are presented in relation to age as well as cytological and histological findings at entry. In all, 24 510 women had adequate cytology and HPV results. Cytology results at entry were: 87% normal, 11% borderline or mild, 1.1% moderate and 0.6% severe dyskaryosis or worse. Prevalence of HPV decreased sharply with age, from 40% at age 20–24 to 12% at 35–39 and 7% or less above age 50. It increased with cytological grade, from 10% of normal cytology and 31% of borderline to 70% mild, 86% moderate, and 96% of severe dyskaryosis or worse. HPV 16 or HPV 18 accounted for 64% of infections in women with severe or worse cytology, and one or both were found in 61% of women with severe dyskaryosis but in only 2.2% of those with normal cytology. The majority of young women in Greater Manchester have been infected with a high-risk HPV by the age of 30. HPV testing is practicable as a primary routine screening test, but in women aged under 30 years, this would lead to a substantial increase in retesting and referral rates. HPV 16 and HPV 18 are more predictive of underlying disease, but other HPV types account for 30% of high-grade disease. PMID:16773068

  10. Comparison of AASHTO moisture sensitivity test (T-283) with Connecticut Department of Transportation modified test method

    DOT National Transportation Integrated Search

    1999-08-01

    Several different interpretations of the American Association of State Highway and Transportation Officials' (AASHTO's) Moisture Sensitivity Test exist. The official AASHTO interpretation of this test method does not account for water which has been ...

  11. Testing the DSM-5 severity indicator for bulimia nervosa in a treatment-seeking sample.

    PubMed

    Dakanalis, Antonios; Clerici, Massimo; Riva, Giuseppe; Carrà, Giuseppe

    2017-03-01

    This study tested the new DSM-5 severity criterion for bulimia nervosa (BN) based on the frequency of inappropriate weight compensatory behaviors in a treatment-seeking sample. Participants were 345 adults with DSM-5 BN presenting for treatment. They were sub-grouped based on DSM-5 severity levels and compared on a range of variables of clinical interest and demographics. Based on DSM-5 severity definitions, 27.2% of the sample was categorized with mild, 26.1% with moderate, 24.9% with severe, and 21.8% with extreme severity of BN. Analyses revealed that the four (mild, moderate, severe, and extreme) severity groups of BN significantly differed from each other in eating disordered and body-related attitudes and behaviors, factors involved in the maintenance process of the disorder, comorbid psychiatric disorders, psychological distress, and psychosocial impairment (medium-to-large effect sizes). No significant between-group differences were observed in demographics, body mass index, or age at onset of BN, lending some credence to recent suggestions that age-at-onset of BN may be more a disorder- than a severity-dependent variable. Collectively, our findings provide support for the severity indicator for BN introduced in the DSM-5 as a means of addressing heterogeneity and variability in the severity of the disorder.

  12. Generalizing Experimental Findings

    DTIC Science & Technology

    2015-06-01

    ...In graphical terms, these assumptions may require several d-separation tests on several sub-graphs. It is utterly unimaginable therefore that... Figure 1: (a) a transportability model in which a post-treatment variable Z is S-admissible... observational studies to estimate population treatment effects. Journal of the Royal Statistical Society: Series A (Statistics in Society), forthcoming, doi...

  13. Learning Potential Among the Moderately and Severely Retarded. Studies in Learning Potential, Volume 3, Number 52.

    ERIC Educational Resources Information Center

    Hamilton, James L.; Budoff, Milton

    The study investigated the feasibility of M. Budoff and M. Friedman's (1964) learning potential paradigm as an assessment approach with 40 moderately and severely mentally retarded persons (aged 12 to 22 years). Ss were tested three times: initially, after one week, and after one month with a match-to-sample block design test. Twenty of the Ss…

  14. Digital multishaker modal testing

    NASA Technical Reports Server (NTRS)

    Blair, M.; Craig, R. R., Jr.

    1983-01-01

    A review of several modal testing techniques is made, along with brief discussions of their advantages and limitations. A new technique is presented which overcomes many of the previous limitations. Several simulated experiments are included to verify the validity and accuracy of the new method. Conclusions are drawn from the simulation studies and recommendations for further work are presented. The complete computer code configured for the simulation study is presented.

  15. Identifying Indicators of State Change and Forecasting Future Vulnerability in Alaskan Boreal Ecosystems

    DTIC Science & Technology

    2016-08-01

    ice have catastrophic effects on facilities, infrastructure, and military testing and training. Permafrost temperature, thickness, and geographic... treeline) and fire severity (~0 to ~100% SOL consumption), they provide an excellent suite of sites to test and quantify the effects of fire severity...

  17. Weather Associated with the Fall-2000 Turbulence Flight Tests

    NASA Technical Reports Server (NTRS)

    Hamilton, David W.; Proctor, Fred H.

    2003-01-01

    This viewgraph presentation provides information on three flight tests in which NASA Langley's ARIES B-757 research aircraft was intentionally piloted into areas with a high risk for severe atmospheric turbulence. During its encounter with turbulence, instruments aboard the aircraft monitored wind, temperature and acceleration, and onboard Doppler radar detected forward turbulence. Data was collected along a spectrum, from smooth air to severe turbulence.

  18. Advance Noise Control Fan II: Test Rig Fan Risk Management Study

    NASA Technical Reports Server (NTRS)

    Lucero, John

    2013-01-01

    Since 1995, the Advanced Noise Control Fan (ANCF) has contributed significantly to the understanding of the physics of fan tonal noise generation. The 9- by 15-foot wind tunnel has successfully tested multiple high-speed fan designs over the last several decades. This work advanced several tone noise reduction concepts to higher technology readiness levels (TRL) and supported the validation of fan tone noise prediction codes.

  19. Early Neuropsychological Tests as Correlates of Productivity 1 Year after Traumatic Brain Injury: A Preliminary Matched Case-Control Study

    ERIC Educational Resources Information Center

    Ryu, Won Hyung A.; Cullen, Nora K.; Bayley, Mark T.

    2010-01-01

    This study explored the relative strength of five neuropsychological tests in correlating with productivity 1 year after traumatic brain injury (TBI). Six moderate-to-severe TBI patients who returned to work at 1-year post-injury were matched with six controls who were unemployed after 1 year based on age, severity of injury, and Functional…

  20. The effects of mild and severe traumatic brain injury on speed of information processing as measured by the computerized tests of information processing (CTIP).

    PubMed

    Tombaugh, Tom N; Rees, Laura; Stormer, Peter; Harrison, Allyson G; Smith, Andra

    2007-01-01

    In spite of the fact that reaction time (RT) measures are sensitive to the effects of traumatic brain injury (TBI), few RT procedures have been developed for use in standard clinical evaluations. The computerized test of information processing (CTIP) [Tombaugh, T. N., & Rees, L. (2000). Manual for the computerized tests of information processing (CTIP). Ottawa, Ont.: Carleton University] was designed to measure the degree to which TBI decreases the speed at which information is processed. The CTIP consists of three computerized programs that progressively increase the amount of information that is processed. Results of the current study demonstrated that RT increased as the difficulty of the CTIP tests increased (known as the complexity effect), and as severity of injury increased (from mild to severe TBI). The current study also demonstrated the importance of selecting a non-biased measure of variability. Overall, findings suggest that the CTIP is an easy to administer and sensitive measure of information processing speed.

  1. Non-Invasive Assessment of Liver Function

    PubMed Central

    Helmke, Steve; Colmenero, Jordi; Everson, Gregory T.

    2015-01-01

    Purpose of review It is our opinion that there is an unmet need in Hepatology for a minimally invasive or noninvasive test of liver function and physiology. Quantitative liver function tests (QLFTs) define the severity and prognosis of liver disease by measuring the clearance of substrates whose uptake or metabolism is dependent upon liver perfusion or hepatocyte function. Substrates with high-affinity hepatic transporters exhibit high “first-pass” hepatic extraction and their clearance measures hepatic perfusion. In contrast, substrates metabolized by the liver have low first-pass extraction and their clearance measures specific drug-metabolizing pathways. Recent Findings We highlight one QLFT, the dual cholate test, and introduce the concept of a disease severity index (DSI) linked to clinical outcome that quantifies the simultaneous processes of hepatocyte uptake, clearance from the systemic circulation, clearance from the portal circulation, and portal-systemic shunting. Summary It is our opinion that dual cholate is a relevant test for defining disease severity, monitoring the natural course of disease progression, and quantifying the response to therapy. PMID:25714706
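
    The clearance concept underlying quantitative liver function tests reduces, in its simplest form, to dose divided by the area under the concentration-time curve. The Python sketch below shows that generic calculation with trapezoidal integration; the sampling times, concentrations and dose are hypothetical and do not describe the dual cholate protocol.

```python
def auc_trapezoid(times, concentrations):
    """Area under the concentration-time curve by the trapezoidal rule."""
    auc = 0.0
    for (t0, c0), (t1, c1) in zip(zip(times, concentrations),
                                  zip(times[1:], concentrations[1:])):
        auc += 0.5 * (c0 + c1) * (t1 - t0)
    return auc

# Hypothetical sampling schedule and concentrations for an arbitrary test substrate
times = [0, 5, 10, 20, 30, 45, 60, 90]            # minutes after administration
conc = [0.0, 8.2, 6.5, 4.1, 2.7, 1.5, 0.9, 0.4]   # micromolar, hypothetical
dose_umol = 40.0                                   # hypothetical dose in micromoles

auc = auc_trapezoid(times, conc)       # micromolar-minutes
clearance = dose_umol / auc            # litres per minute, given these units
print(f"AUC = {auc:.1f} uM*min, clearance = {clearance:.3f} L/min")
```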

  2. Predictions of structural integrity of steam generator tubes under normal operating, accident, an severe accident conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Majumdar, S.

    1997-02-01

    Available models for predicting failure of flawed and unflawed steam generator tubes under normal operating, accident, and severe accident conditions are reviewed. Tests conducted in the past, though limited, tended to show that the earlier flow-stress model for part-through-wall axial cracks overestimated the damaging influence of deep cracks. This observation was confirmed by further tests at high temperatures, as well as by finite-element analysis. A modified correlation for deep cracks can correct this shortcoming of the model. Recent tests have shown that lateral restraint can significantly increase the failure pressure of tubes with unsymmetrical circumferential cracks. This observation was confirmed by finite-element analysis. The rate-independent flow stress models that are successful at low temperatures cannot predict the rate-sensitive failure behavior of steam generator tubes at high temperatures. Therefore, a creep rupture model for predicting failure was developed and validated by tests under various temperature and pressure loadings that can occur during postulated severe accidents.

  3. Elevated-temperature tensile and creep properties of several ferritic stainless steels

    NASA Technical Reports Server (NTRS)

    Whittenberger, J. D.

    1977-01-01

    The elevated-temperature mechanical properties of several ferritic stainless steels were determined. The alloys evaluated included Armco 18SR, GE 1541, and NASA-18T-A. Tensile and creep strength properties at 1073 and 1273 K and residual room temperature tensile properties after creep testing were measured. In addition, 1273 K tensile and creep tests and residual property testing were conducted with Armco 18SR and GE 1541 which were exposed for 200 hours to a severe oxidizing environment in automotive thermal reactors. Aside from the residual tensile properties for Armco 18SR, prior exposure did not affect the mechanical properties of either alloy. The 1273 K creep strength parallel to the sheet-rolling direction was similar for all three alloys. At 1073 K, NASA-18T-A had better creep strength than either Armco 18SR or GE 1541. NASA-18T-A possesses better residual properties after creep testing than either Armco 18SR or GE 1541.

  4. NASA Occupant Protection Standards Development

    NASA Technical Reports Server (NTRS)

    Somers, Jeffrey; Gernhardt, Michael; Lawrence, Charles

    2012-01-01

    Historically, spacecraft landing systems have been tested with human volunteers, because analytical methods for estimating injury risk were insufficient. These tests were conducted with flight-like suits and seats to verify the safety of the landing systems. Currently, NASA uses the Brinkley Dynamic Response Index to estimate injury risk, although applying it to the NASA environment has drawbacks: (1) it does not indicate the severity or anatomical location of injury, and (2) it is unclear whether the model applies to NASA applications. Because of these limitations, a new validated, analytical approach was desired. Leveraging the current state of the art in automotive safety and racing, a new approach was developed. The approach has several aspects: (1) define the acceptable level of injury risk by injury severity; (2) determine the appropriate human surrogate for testing and modeling; (3) mine existing human injury data to determine appropriate Injury Assessment Reference Values (IARVs); (4) rigorously validate the IARVs with sub-injurious human testing; and (5) use validated IARVs to update standards and vehicle requirements.

  5. Survivability characteristics of composite compression structure

    NASA Technical Reports Server (NTRS)

    Avery, John G.; Allen, M. R.; Sawdy, D.; Avery, S.

    1990-01-01

    Test and evaluation was performed to determine the compression residual capability of graphite reinforced composite panels following perforation by high-velocity fragments representative of combat threats. Assessments were made of the size of the ballistic damage, the effect of applied compression load at impact, damage growth during cyclic loading and residual static strength. Several fiber/matrix systems were investigated including high-strain fibers, tough epoxies, and APC-2 thermoplastic. Additionally, several laminate configurations were evaluated including hard and soft laminates and the incorporation of buffer strips and stitching for improved damage resistance and tolerance. Both panels (12 x 20 inches) and full-scale box-beam components were tested to assure scalability of results. The evaluation generally showed small differences in the responses of the material systems tested. The soft laminate configurations with concentrated reinforcement exhibited the highest residual strength. Ballistic damage did not grow or increase in severity as a result of cyclic loading, and the effects of applied load at impact were not significant under the conditions tested.

  6. Rapid sideline performance meets outpatient clinic: Results from a multidisciplinary concussion center registry.

    PubMed

    Kyle Harrold, G; Hasanaj, Lisena; Moehringer, Nicholas; Zhang, Isis; Nolan, Rachel; Serrano, Liliana; Raynowska, Jenelle; Rucker, Janet C; Flanagan, Steven R; Cardone, Dennis; Galetta, Steven L; Balcer, Laura J

    2017-08-15

    This study investigated the utility of sideline concussion tests, including components of the Sports Concussion Assessment Tool, 3rd Edition (SCAT3) and the King-Devick (K-D), a vision-based test of rapid number naming, in an outpatient, multidisciplinary concussion center treating patients with both sports-related and non-sports related concussions. The ability of these tests to predict clinical outcomes based on the scores at the initial visit was evaluated. Scores for components of the SCAT3 and the K-D were fit into regression models accounting for age, gender, and sport/non-sport etiology in order to predict clinical outcome measures including total number of visits to the concussion center, whether the patient reached a SCAT3 symptom severity score≤7, and the total types of referrals each patient received over their course. Patient characteristics, differences between those with sport and non-sport etiologies, and correlations between the tests were also analyzed. Among 426 patients with concussion, SCAT3 total symptom score and symptom severity score at the initial visit predicted each of the clinical outcome variables. K-D score at the initial visit predicted the total number of visits and the total number of referrals. Those with sports-related concussions were younger, had less severely-affected test scores, had fewer visits and types of referrals, and were more likely to have clinical resolution of their concussion and to reach a symptom severity score≤7. This large-scale study of concussion patients supports the use of sideline concussion tests as part of outpatient concussion assessment, especially the total symptom and symptom severity score portions of the SCAT3 and the K-D. Women in this cohort had higher total symptom and symptom severity scores compared to men. Our data also suggest that those with non-sports-related concussions have longer lasting symptoms than those with sports-related concussions, and that these two groups should perhaps be regarded separately when assessing outcomes and needs in a multidisciplinary setting. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. [A randomized controlled trial: acclimatization training on the prevention of motion sickness in hot-humid environment].

    PubMed

    Zhang, Lei; Mao, Jun-Feng; Wu, Xiao-Nong; Bao, Ying-Chun

    2014-05-01

    The incidence and severity of motion sickness (MS) in hot-humid environments are extremely high. We investigated the effect of two-stage training on reducing the incidence and severity of MS. Sixty male subjects were randomly divided into an experimental group and a control group. Subjects in the experimental group received: (1) adaptation training, including sitting, walking and running in a hot laboratory; after adaptation was confirmed on the basis of subjective feeling, rectal temperature, heart rate, blood pressure, sweat rate and sweat salt concentration, both groups were tested with the Coriolis acceleration revolving-chair test, and Graybiel's score and severity grading were recorded to evaluate whether the adaptation training was useful; and (2) anti-dizziness training, given 3 months after deacclimatization and consisting of ten revolving-chair training sessions, after which the same test was repeated to evaluate the effect of the anti-dizziness training. RESULTS: Graybiel's score and severity grading did not differ between the two groups after acclimatization training (P > 0.05), but did differ after anti-dizziness training (P < 0.01). Adaptation training appears to be of no benefit for reducing the incidence and severity of MS in a hot-humid environment, whereas anti-dizziness training is useful.

  8. Association between sensitization to Aureobasidium pullulans (Pullularia sp) and severity of asthma.

    PubMed

    Niedoszytko, Marek; Chełmińska, Marta; Jassem, Ewa; Czestochowska, Eugenia

    2007-02-01

    Recent data indicate that fungi may contribute to increased severity of asthma. To determine the prevalence of allergy to 15 mold allergens among patients hospitalized because of exacerbation of asthma and to evaluate the relationship between the severity of the disease and allergy to particular molds. Skin prick tests with standard airborne allergens, including grass, tree, Dermatophagoides pteronyssinus, Dermatophagoides farinae, feather, and cat and dog fur, and a panel of mold allergens, including Alternaria, Cladosporium, Aspergillus, Penicillium, Trichothecium, Chaetomium globosum, Epicoccum, Epidermophyton, Helminthosporium, Aureobasidium pullulans, Rhizopus nigricans, Fusarium, Mucor, Merulius lacrymans, and yeast mix, were performed in 105 asthmatic patients and 30 controls. Positive skin prick test results were found in 98% of asthmatic patients and 66% of controls. Sensitivity to A pullulans was significantly associated with more severe asthma (odds ratio, 1.4; 95% confidence interval, 1.09-1.75; P = .006). Sensitization to Helminthosporium was associated with an increased number of asthma exacerbations that required hospitalization (17% vs 38%; chi-square test, P = .03). Sensitization to A pullulans is a risk factor for severe asthma. Sensitization to Helminthosporium may be related to asthma exacerbation that requires hospitalization.

  9. Shelf life extension for the lot AAE nozzle severance LSCs

    NASA Technical Reports Server (NTRS)

    Cook, M.

    1990-01-01

    Shelf life extension tests for the remaining lot AAE linear shaped charges for redesigned solid rocket motor nozzle aft exit cone severance were completed in the small motor conditioning and firing bay, T-11. Five linear shaped charge test articles were thermally conditioned and detonated, demonstrating proper end-to-end charge propagation. Penetration depth requirements were exceeded. Results indicate that there was no degradation in performance due to aging or the linear shaped charge curving process. It is recommended that the shelf life of the lot AAE nozzle severance linear shaped charges be extended through January 1992.

  10. Impairments in Dark Adaptation Are Associated with Age-Related Macular Degeneration Severity and Reticular Pseudodrusen.

    PubMed

    Flamendorf, Jason; Agrón, Elvira; Wong, Wai T; Thompson, Darby; Wiley, Henry E; Doss, E Lauren; Al-Holou, Shaza; Ferris, Frederick L; Chew, Emily Y; Cukras, Catherine

    2015-10-01

    We investigate whether ocular and person-based characteristics were associated with dark adaptation (DA). Cross-sectional, single-center, observational study. One hundred sixteen participants older than 50 years of age with a range of age-related macular degeneration (AMD) severity. Participants underwent best-corrected visual acuity (BCVA) testing, ophthalmoscopic examination, and multimodal imaging. Presence of reticular pseudodrusen (RPD) was assessed by masked grading of fundus images and was confirmed with optical coherence tomography. Eyes also were graded for AMD features (drusen, pigmentary changes, late AMD) to generate person-based AMD severity groups. One eye was designated the study eye for DA testing. Nonparametric statistical testing was performed on all comparisons. The primary outcome of this study was the rod intercept time (RIT), which is defined as the time for a participant's visual sensitivity to recover to a stimulus intensity of 5×10⁻³ cd/m² (a decrease of 3 log units), or until a maximum test duration of 40 minutes was reached. A total of 116 study eyes from 116 participants (mean age, 75.4±9.4 years; 58% female) were analyzed. Increased RIT was associated significantly with increasing AMD severity, increasing age (r = 0.34; P = 0.0002), decreasing BCVA (r = -0.54; P < 0.0001), pseudophakia (P = 0.03), and decreasing subfoveal choroidal thickness (r = -0.27; P = 0.003). Study eyes with RPD (15/116 [13%]) had a significantly greater mean RIT compared with eyes without RPD in any AMD severity group (P < 0.02 for all comparisons), with 80% reaching the DA test ceiling. Impairments in DA increased with age, worse visual acuity, presence of RPD, AMD severity, and decreased subfoveal choroidal thickness. Analysis of covariance found the multivariate model that best fit the data included age, AMD group, and presence of RPD (R² = 0.56), with the presence of RPD conferring the largest parameter estimate. Copyright © 2015 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.

  11. Influence of poor effort on neuropsychological test performance in U.S. military personnel following mild traumatic brain injury.

    PubMed

    Lange, Rael T; Pancholi, Sonal; Bhagwat, Aditya; Anderson-Barnes, Victoria; French, Louis M

    2012-01-01

    The purpose of this study was to examine the influence of poor effort on neuropsychological test performance in military personnel following mild traumatic brain injury (MTBI). Participants were 143 U.S. service members who sustained a TBI, divided into three groups based on injury severity and performance on the Word Memory Test and four embedded markers of poor effort: MTBI-pass (n = 87), MTBI-fail (n = 21), and STBI-pass (n = 35; where STBI denotes severe TBI). Patients were evaluated at the Walter Reed Army Medical Center on average 3.9 months (SD = 3.4) post injury. The majority of the sample was Caucasian (84.6%), was male (93.0%), and had 12+ years of education (96.5%). Measures included the Personality Assessment Inventory (PAI) and 13 common neurocognitive measures. Patients in the MTBI-fail group performed worse on the majority of neurocognitive measures, followed by the STBI-pass group and the MTBI-pass group. Using a criterion of three or more low scores below the 10th percentile, the MTBI-fail group had the greatest rate of impairment (76.2%), followed by the STBI-pass group (34.3%) and the MTBI-pass group (16.1%). On the PAI, the MTBI-fail group had higher scores on the majority of clinical scales (p < .05). There was a greater number of elevated scales (e.g., 5 or more scales elevated to a mild or higher level) in the MTBI-fail group (71.4%) than in the MTBI-pass group (32.2%) and the STBI-pass group (17.1%). Effort testing is an important component of postacute neuropsychological evaluations following combat-related MTBI. Those who fail effort testing are likely to be misdiagnosed as having severe cognitive impairment, and their symptom reporting is likely to be inaccurate.

  12. Focused use of drug screening in overdose patients increases impact on management.

    PubMed

    Erdmann, Andreas; Werner, Dominique; Hugli, Olivier; Yersin, Bertrand

    2015-01-01

    Drug poisoning is a common cause of attendance in the emergency department. Several toxicology centres suggest performing urinary drug screens, even though these rarely influence patient management. The objective was to measure the impact on patient management, in a university emergency department with approximately 40,000 admissions annually, of a rapid urinary drug screening test used for specifically focused indications. Drug screening was restricted to patients having a first psychotic episode or cases demonstrating respiratory failure, coma, seizures, a sympathomimetic toxidrome, severe opiate overdose necessitating naloxone, hypotension, ventricular arrhythmia, acquired long QT or QRS >100 ms, and high-degree heart block. A retrospective analysis was performed of Triage® TOX drug screen tests carried out between September 2009 and November 2011, and between January 2013 and March 2014. A total of 262 patients were included (mean age 35 ± 14.6 [standard deviation] years; 63% men); 29% involved poisoning with alcohol, and 2.3% died. Indications for testing were as follows: 34% were first psychotic episodes; 20% had acute respiratory failure; 16% coma; 8% seizures; 8% sympathomimetic toxidromes; 7% severe opioid toxidromes; 4% hypotension; 3% ventricular arrhythmias or acquired long QT intervals on electrocardiogram. A total of 78% of the tests were positive (median two substances, maximum five). The test resulted in drug-specific therapy in 6.1% of cases, drug-specific diagnostic tests in 13.3%, prolonged monitoring in 10.7% of methadone-positive tests, and psychiatric admission in 4.2%. Overall, 34.3% of tests influenced patient management. In contrast to previous studies showing modest effects of toxicological testing, restricted use of rapid urinary drug testing increases the impact on management of suspected overdose patients in the ED.

  13. Limited electromagnetic interference testing of evidential breath testers

    DOT National Transportation Integrated Search

    1983-05-06

    This report summarizes a limited test program conducted to determine the susceptibility of evidential breath testers (EBTs) to radio frequency interference (RFI). Several comprehensive test protocols were prepared based on procedures developed by the...

  14. Methylene blue test

    MedlinePlus

    The methylene blue test is a test used to determine the type of, or to treat, methemoglobinemia, a blood disorder. ... higher, you can become sick because the protein is not carrying ... of red. Methemoglobinemia has several causes, many of which are ...

  15. The culprit insect but not severity of allergic reactions to bee and wasp venom can be determined by molecular diagnosis.

    PubMed

    Gattinger, Pia; Lupinek, Christian; Kalogiros, Lampros; Silar, Mira; Zidarn, Mihaela; Korosec, Peter; Koessler, Christine; Novak, Natalija; Valenta, Rudolf; Mittermann, Irene

    2018-01-01

    Allergy to bee and wasp venom can lead to life-threatening systemic reactions. The identification of the culprit species is important for allergen-specific immunotherapy. To determine a panel of recombinant bee and wasp allergens which is suitable for the identification of bee or wasp as culprit allergen sources and to search for molecular surrogates of clinical severity of sting reactions. Sera from eighty-seven patients with a detailed documentation of their severity of sting reaction (Mueller grade) and who had been subjected to titrated skin testing with bee and wasp venom were analyzed for bee- and wasp-specific IgE levels by ImmunoCAP™. IgE-reactivity testing was performed using a comprehensive panel of recombinant bee and wasp venom allergens (rApi m 1, 2, 3, 4, 5 and 10; rVes v 1 and 5) by ISAC chip technology, ImmunoCAP and ELISA. IgG4 antibodies to rApi m 1 and rVes v 5 were determined by ELISA and IgE/IgG4 ratios were calculated. Results from skin testing, IgE serology and IgE/IgG4 ratios were compared with severity of sting reactions. The panel of rApi m 1, rApi m 10, rVes v 1 and rVes v 5 allowed identification of the culprit venom in all but two of the 87 patients, with good agreement to skin testing. Severities of sting reactions were not associated with results obtained by skin testing, venom-specific IgE levels or molecular diagnosis. Severe sting reactions were observed in patients showing <1 ISU and <2 kUA/L of IgE to Api m 1 and/or Ves v 5. We identified a minimal panel of recombinant bee and wasp allergens for molecular diagnosis which may permit identification of bee and/or wasp as culprit insect in venom-sensitized subjects. The severity of sting reactions was not associated with parameters obtained by molecular diagnosis.

  16. Thermal Environmental Testing of NSTAR Engineering Model Ion Thrusters

    NASA Technical Reports Server (NTRS)

    Rawlin, Vincent K.; Patterson, Michael J.; Becker, Raymond A.

    1999-01-01

    NASA's New Millennium program will fly a xenon ion propulsion system on the Deep Space 1 Mission. Tests were conducted under NASA's Solar Electric Propulsion Technology Applications Readiness (NSTAR) Program with three different engineering model ion thrusters to determine thruster thermal characteristics over the NSTAR operating range in a variety of thermal environments. A liquid-nitrogen-cooled shroud was used to cold-soak the thruster to -120 C. Initial tests were performed prior to a mature spacecraft design. Those results and the final, severe requirements mandated by the spacecraft led to several changes to the basic thermal design. These changes were incorporated into a final design and tested over a wide range of environmental conditions.

  17. Evaluation Parameters for Computer-Adaptive Testing

    ERIC Educational Resources Information Center

    Georgiadou, Elisabeth; Triantafillou, Evangelos; Economides, Anastasios A.

    2006-01-01

    With the proliferation of computers in test delivery today, adaptive testing has become quite popular, especially when examinees must be classified into two categories (pass/fail, master/nonmaster). Several well-established organisations have provided standards and guidelines for the design and evaluation of educational and psychological testing.…

  18. Unilateral spatial neglect in the acute phase of ischemic stroke can predict long-term disability and functional capacity.

    PubMed

    Luvizutto, Gustavo José; Moliga, Augusta Fabiana; Rizzatti, Gabriela Rizzo Soares; Fogaroli, Marcelo Ortolani; Moura Neto, Eduardo de; Nunes, Hélio Rubens de Carvalho; Resende, Luiz Antônio de Lima; Bazan, Rodrigo

    2018-05-21

    The aim of this study was to assess the relationship between the degree of unilateral spatial neglect during the acute phase of stroke and long-term functional independence. This was a prospective study of right ischemic stroke patients in which the independent variable was the degree of spatial neglect and the outcome measured was functional independence. The potential confounding factors included sex, age, stroke severity, topography of the lesion, risk factors, glycemia and the treatment received. Unilateral spatial neglect was measured using the line cancellation test, the star cancellation test and the line bisection test within 48 hours of the onset of symptoms. Functional independence was measured using the modified Rankin and Barthel scales at 90 days after discharge. The relationship between unilateral spatial neglect and functional independence was analyzed using multiple logistic regression corrected for confounding factors. We studied 60 patients with a median age of 68 (34-89) years, 52% of whom were male and 74% of whom were Caucasian. The risk of moderate to severe disability increased with increasing star cancellation test scores (OR=1.14 [1.03-1.26], p=0.01) corrected for stroke severity, which was a confounding factor with a statistically positive association with disability (OR=1.63 [1.13-2.65], p=0.01). The chance of functional independence decreased with increasing star cancellation test scores (OR=0.86 [0.78-0.96], p=0.006) corrected for stroke severity, which was a confounding factor with a statistically negative association with independence (OR=0.66 [0.48-0.92], p=0.017). Greater severity of unilateral spatial neglect in acute stroke is associated with worse long-term disability and reduced functional independence.

  19. Performance and Stability Analyses of Rocket Combustion Devices Using Liquid Oxygen/Liquid Methane Propellants

    NASA Technical Reports Server (NTRS)

    Hulka, James R.; Jones, G. W.

    2010-01-01

    Liquid rocket engines using oxygen and methane propellants are being considered by the National Aeronautics and Space Administration (NASA) for in-space vehicles. This propellant combination has not been previously used in flight-qualified engine systems, so limited test data and analysis results are available at this stage of early development. NASA has funded several hardware-oriented programs with oxygen and methane propellants over the past several years with the Propulsion and Cryogenic Advanced Development (PCAD) project, under the Exploration Technology Development Program. As part of this effort, NASA Marshall Space Flight Center has conducted combustion, performance, and combustion stability analyses of several of the configurations on these programs. This paper summarizes these analyses. Test and analysis results of impinging and coaxial element injectors using liquid oxygen and liquid methane propellants are included. Several cases with gaseous methane are included for reference. Several different thrust chamber configurations have been modeled, including thrust chambers with multi-element like-on-like and swirl coax element injectors tested at NASA MSFC, and a unielement chamber with shear and swirl coax injectors tested at The Pennsylvania State University. Configurations were modeled with two one-dimensional liquid rocket combustion analysis codes, the Rocket Combustor Interaction Design and Analysis (ROCCID), and the Coaxial Injector Combustion Model (CICM). Significant effort was applied to show how these codes can be used to model combustion and performance with oxygen/methane propellants a priori, and what anchoring or calibrating features need to be applied or developed in the future. This paper describes the test hardware configurations, presents the results of all the analyses, and compares the results from the two analytical methods.

  20. Strength of SiCf-SiCm composite tube under uniaxial and multiaxial loading

    NASA Astrophysics Data System (ADS)

    Shapovalov, Kirill; Jacobsen, George M.; Alva, Luis; Truesdale, Nathaniel; Deck, Christian P.; Huang, Xinyu

    2018-03-01

    The authors report mechanical strength of nuclear grade silicon carbide fiber reinforced silicon carbide matrix composite (SiCf-SiCm) tubing under several different stress states. The composite tubing was fabricated via a Chemical Vapor Infiltration (CVI) process, and is being evaluated for accident tolerant nuclear fuel cladding. Several experimental techniques were applied including uniaxial tension, elastomer insert burst test, open and closed end hydraulic bladder burst test, and torsion test. These tests provided critical stress and strain values at proportional limit and at ultimate failure points. Full field strain measurements using digital image correlation (DIC) were obtained in order to acquire quantitative information on localized deformation during application of stress. Based on the test results, a failure map was constructed for the SiCf-SiCm composites.

  1. Heavy hydrocarbon main injector technology program

    NASA Technical Reports Server (NTRS)

    Arbit, H. A.; Tuegel, L. M.; Dodd, F. E.

    1991-01-01

    The Heavy Hydrocarbon Main Injector Program was an analytical, design, and test program to demonstrate an injection concept applicable to an Isolated Combustion Compartment of a full-scale, high pressure, LOX/RP-1 engine. Several injector patterns were tested in a 3.4-in. combustor. Based on these results, features of the most promising injector design were incorporated into a 5.7-in. injector which was then hot-fire tested. In turn, a preliminary design of a 5-compartment 2D combustor was based on this pattern. Additional subscale injector testing and analysis were also performed, with an emphasis on improving analytical techniques and acoustic cavity design methodology. Several of the existing 3.5-in. diameter injectors were hot-fire tested with and without acoustic cavities for spontaneous and dynamic stability characteristics.

  2. A study of graphite-epoxy laminate failures due to high transverse shear strains using the multi-span-beam shear test procedure

    NASA Technical Reports Server (NTRS)

    Jegley, Dawn C.

    1989-01-01

    The multi-span-beam shear test procedure is used to study failure mechanisms in graphite-epoxy laminates due to high transverse shear strains induced by severe local bending deformations in test specimens. Results of a series of tests on specimens with a variety of stacking sequences, including some with adhesive interleaving, are presented. These results indicate that laminates with stacking sequences that place several ±45 deg and 90 deg plies next to each other are more susceptible to failures due to high transverse shear strains than laminates with ±45 deg and 0 deg plies next to each other or with ±45 deg plies next to layers of adhesive interleaving. Results of these tests are compared with analytical results based on finite elements.

  3. Quantitative Differences in Retest Effects across Different Methods Used to Construct Alternate Test Forms

    ERIC Educational Resources Information Center

    Arendasy, Martin E.; Sommer, Markus

    2013-01-01

    Allowing respondents to retake a cognitive ability test has been shown to increase their test scores. Several theoretical models have been proposed to explain this effect, which make distinct assumptions regarding the measurement invariance of psychometric tests across test administration sessions with regard to narrower cognitive abilities and general…

  4. Spelling for the Office. Competency Test Package. Office Occupations. Instructor's Guide.

    ERIC Educational Resources Information Center

    Hines, Donna

    This competency test package, one of a series of test packages for office occupations education, contains a list of performance objectives; a pool of objective questions matched with these performance objectives; a sample, 50-point objective test; and several performance test activities. The package also includes complete directions for the…

  5. Basic Skills for Word Processing. Competency Test Package. Office Occupations. Instructor's Guide.

    ERIC Educational Resources Information Center

    Hines, Donna

    This competency test package, one of a series of test packages for office occupations education, contains a list of performance objectives; a pool of objective questions matched with these performance objectives; a sample, 50-point objective test; and several performance test activities. The package also includes complete directions for the…

  6. Resistance to Soybean Aphid Among Soybean Lines, Growth-chamber Tests, 2006 Through 2008

    USDA-ARS?s Scientific Manuscript database

    We tested for resistance to the soybean aphid (SBA, Aphis glycines) among several soybean lines, and rated lines as resistant or susceptible in seven tests. The ratings of plants with respect to SBA infestation differed among lines in all tests. Kosamame (PI 171451, test II), Bhart (PI 165989, tes...

  7. 46. Historic photo of Building 202 test cell interior, detail ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    46. Historic photo of Building 202 test cell interior, detail of test stand A with engine severely damaged during testing, September 7, 1961. On file at NASA Plumbrook Research Center, Sandusky, Ohio. NASA photo number C-57837. - Rocket Engine Testing Facility, GRC Building No. 202, NASA Glenn Research Center, Cleveland, Cuyahoga County, OH

  8. RSCABS: An R package for performing the Rao-Scott Adjusted Cochran-Armitage trend test By Slices

    EPA Science Inventory

    RSCABS[3] (Rao-Scott adjusted Cochran-Armitage trend test By Slices) is a modification to the Rao-Scott[5] adjusted Cochran-Armitage trend test[1, 2] that allows for testing at each individual severity score often seen in histopathological data. The test was originally developed ...
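
    RSCABS itself is an R package; purely as an illustration of the underlying idea, here is a minimal sketch of the unadjusted Cochran-Armitage trend test for a single binarized severity threshold ("slice"), without the Rao-Scott clustering adjustment the package applies. The example counts are hypothetical, and this is not the package's implementation.

    ```python
    import math
    from scipy.stats import norm

    def cochran_armitage_trend(responders, totals, scores=None):
        """Unadjusted Cochran-Armitage trend test for a 2 x k table.
        responders[i] = affected subjects in dose group i, totals[i] = group size,
        scores[i] = dose score (defaults to 0..k-1).
        Returns (z statistic, two-sided p-value)."""
        k = len(totals)
        scores = scores or list(range(k))
        n = sum(totals)
        p_bar = sum(responders) / n
        num = sum(s * (r - t * p_bar) for s, r, t in zip(scores, responders, totals))
        s_bar = sum(s * t for s, t in zip(scores, totals)) / n
        var = p_bar * (1 - p_bar) * sum(t * (s - s_bar) ** 2 for s, t in zip(scores, totals))
        z = num / math.sqrt(var)
        return z, 2 * norm.sf(abs(z))

    # Hypothetical histopathology counts: severity >= grade 2 in four dose groups
    print(cochran_armitage_trend(responders=[1, 2, 5, 9], totals=[20, 20, 20, 20]))
    ```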

  9. Business Telephone Etiquette. Comptetency Test Package. Office Occupations. Instructor's Guide.

    ERIC Educational Resources Information Center

    Hines, Donna

    This competency test package, one of a series of test packages for office occupations education, contains a list of performance objectives; a sample, 50-point objective test; and several performance test activities. The package also includes complete directions for the student and the instructor, plus answer keys and a guide for evaluating the…

  10. Women and Educational Testing: A Selective Review of the Research Literature and Testing Practices.

    ERIC Educational Resources Information Center

    Tittle, Carol Kehr; And Others

    This report provides an exploratory survey of several aspects of educational testing, with a view toward identifying discrimination against women. Two major ways in which discrimination can occur are examined in educational testing: reinforcement of sex-role stereotypes and restriction of individual choice. Major educational achievement tests are…

  11. Reliability demonstration test for load-sharing systems with exponential and Weibull components

    PubMed Central

    Hu, Qingpei; Yu, Dan; Xie, Min

    2017-01-01

    Conducting a Reliability Demonstration Test (RDT) is a crucial step in production. Products are tested under certain schemes to demonstrate whether their reliability indices reach pre-specified thresholds. Test schemes for RDT have been studied in different situations, e.g., lifetime testing, degradation testing and accelerated testing. Systems designed with several structures are also investigated in many RDT plans. Despite the availability of a range of test plans for different systems, RDT planning for load-sharing systems hasn’t yet received the attention it deserves. In this paper, we propose a demonstration method for two specific types of load-sharing systems with components subject to two distributions: exponential and Weibull. Based on the assumptions and interpretations made in several previous works on such load-sharing systems, we set the mean time to failure (MTTF) of the total system as the demonstration target. We represent the MTTF as a summation of mean time between successive component failures. Next, we introduce generalized test statistics for both the underlying distributions. Finally, RDT plans for the two types of systems are established on the basis of these test statistics. PMID:29284030

  12. Reliability demonstration test for load-sharing systems with exponential and Weibull components.

    PubMed

    Xu, Jianyu; Hu, Qingpei; Yu, Dan; Xie, Min

    2017-01-01

    Conducting a Reliability Demonstration Test (RDT) is a crucial step in production. Products are tested under certain schemes to demonstrate whether their reliability indices reach pre-specified thresholds. Test schemes for RDT have been studied in different situations, e.g., lifetime testing, degradation testing and accelerated testing. Systems designed with several structures are also investigated in many RDT plans. Despite the availability of a range of test plans for different systems, RDT planning for load-sharing systems hasn't yet received the attention it deserves. In this paper, we propose a demonstration method for two specific types of load-sharing systems with components subject to two distributions: exponential and Weibull. Based on the assumptions and interpretations made in several previous works on such load-sharing systems, we set the mean time to failure (MTTF) of the total system as the demonstration target. We represent the MTTF as a summation of mean time between successive component failures. Next, we introduce generalized test statistics for both the underlying distributions. Finally, RDT plans for the two types of systems are established on the basis of these test statistics.
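
    For intuition on the MTTF decomposition described in the two records above, consider the simple equal-load-sharing case with exponential components: while j components survive, the time to the next failure is exponential with rate j times the per-component hazard at that load level, so the system MTTF is a sum of expected times between successive failures. The sketch below is a minimal illustration under those assumptions; the load-dependent rates are hypothetical, and this is not the paper's test-plan construction.

    ```python
    def load_sharing_mttf_exponential(rates_per_stage):
        """MTTF of an n-component equal-load-sharing system with exponential
        lifetimes, where rates_per_stage[j] is the per-component failure rate
        while j components have already failed (the load rises as units fail).
        Assumes the system fails when the last component fails."""
        n = len(rates_per_stage)
        mttf = 0.0
        for j, lam in enumerate(rates_per_stage):
            survivors = n - j                # components still sharing the load
            mttf += 1.0 / (survivors * lam)  # expected time to the next failure
        return mttf

    # Hypothetical 3-component system: per-component rate grows as load shifts
    print(load_sharing_mttf_exponential([0.01, 0.015, 0.03]))
    ```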

  13. The consequences of high-risk behaviors: trauma during pregnancy.

    PubMed

    Patteson, Stephen K; Snider, Carolyn C; Meyer, David S; Enderson, Blaine L; Armstrong, Janice E; Whitaker, Gregory L; Carroll, Roger C

    2007-04-01

    Trauma during pregnancy places two lives at risk. Knowledge of risk factors for trauma during pregnancy may improve outcomes. We reviewed the charts of 188 such patients admitted to a Level I trauma center from 1996 to 2004. A comparison was made of injury severity and outcome from a cohort of nonpregnant female trauma patients selected with a similar temporal occurrence and age range. Motor vehicle collisions comprised 160 cases, 67 using a restraint device. Of 84 patients tested, 45 tested positive for intoxicants, 16 positive for 2 or more intoxicants. A significant trend toward less testing through the study period was observed (p = 0.0002). Injury severity was assessed by Revised Trauma Score (RTS). RTS <11 or admission to operating room or intensive care units (OR/ICU) classified patients as severely injured. The six maternal fatalities had an RTS <11 or OR/ICU disposition. Fetal outcomes included 155 live in utero, 18 live births, and 15 fatalities correlating with injury severity by either criteria (p < 0.0001). Of the fetal fatalities, 7 occurred with RTS = 12, but only 3 fatalities occurred in the 147 cases not admitted to OR/ICU. Gestational age correlated (p < 0.0001) with fetal outcomes. The 18 live births had mean gestational ages of 35 +/- 4 weeks as compared with fetal fatalities at 20 +/- 9 weeks, and fetuses alive in utero at 22 +/- 9 weeks gestation. Coagulation tests prothrombin time (PT), international normalized ratio (INR) (both p < 0.008), and partial thromboplastin time (PTT) (p < 0.0001) correlated with maternal outcome. A matched cohort of nonpregnancy trauma cases during the same time frame indicated that, despite a significantly higher percentage of severely injured patients, fewer fatalities occurred. This might reflect a greater risk for the pregnant trauma patient. This study of trauma in pregnancy cases revealed a high percentage with risk behaviors. There was a significant trend toward less intoxicant testing in recent years. Coagulation tests were the most predictive of outcomes. Lower gestational age correlated with fetal demise.

  14. Pancam Mast Assembly on Mars Rover

    NASA Technical Reports Server (NTRS)

    Warden, Robert M.; Cross, Mike; Harvison, Doug

    2004-01-01

    The Pancam Mast Assembly (PMA) for the 2003 Mars Rover is a deployable structure that provides an elevated platform for several cameras. The PMA consists of several mechanisms that enable it to raise the cameras as well as point the cameras in all directions. This paper describes the various mechanisms, their functions, and some of the test parameters. Designing these mechanisms to operate on the surface of Mars presented several challenges. Typical spacecraft mechanisms must operate in zero gravity and high vacuum; these mechanisms needed to be designed to operate in Martian gravity and atmosphere. Testing conditions were a little easier because the mechanisms are not required to operate in a vacuum. All of the materials are vacuum compatible, but the mechanisms were tested in a dry nitrogen atmosphere at various cold temperatures.

  15. The Relationship Between Ocular Itch, Ocular Pain, and Dry Eye Symptoms (An American Ophthalmological Society Thesis).

    PubMed

    Galor, Anat; Small, Leslie; Feuer, William; Levitt, Roy C; Sarantopoulos, Konstantinos D; Yosipovitch, Gil

    2017-08-01

    To evaluate associations between sensations of ocular itch and dry eye (DE) symptoms, including ocular pain, and DE signs. A cross-sectional study of 324 patients seen in the Miami Veterans Affairs eye clinic was performed. The evaluation consisted of questionnaires regarding ocular itch, DE symptoms, descriptors of neuropathic-like ocular pain (NOP), and evoked pain sensitivity testing on the forehead and forearm, followed by a comprehensive ocular surface examination including corneal mechanical sensitivity testing. Analyses were performed to examine for differences between those with and without subjective complaints of ocular itch. The mean age was 62 years with 92% being male. Symptoms of DE and NOP were more frequent in patients with moderate-severe ocular itch compared to those with no or mild ocular itch symptoms. With the exception of ocular surface inflammation (abnormal matrix metalloproteinase 9 testing) which was less common in those with moderate-severe ocular itch symptoms, DE signs were not related to ocular itch. Individuals with moderate-severe ocular itch also demonstrated greater sensitivity to evoked pain on the forearm and had higher non-ocular pain, depression, and post-traumatic stress disorders scores, compared to those with no or mild itch symptoms. Subjects with moderate-severe ocular itch symptoms have more severe symptoms of DE, NOP, non-ocular pain and demonstrate abnormal somatosensory testing in the form of increased sensitivity to evoked pain at a site remote from the eye, consistent with generalized hypersensitivity.

  16. The Relationship Between Ocular Itch, Ocular Pain, and Dry Eye Symptoms (An American Ophthalmological Society Thesis)

    PubMed Central

    Galor, Anat; Small, Leslie; Feuer, William; Levitt, Roy C.; Sarantopoulos, Konstantinos D.; Yosipovitch, Gil

    2017-01-01

    Purpose: To evaluate associations between sensations of ocular itch and dry eye (DE) symptoms, including ocular pain, and DE signs. Methods: A cross-sectional study of 324 patients seen in the Miami Veterans Affairs eye clinic was performed. The evaluation consisted of questionnaires regarding ocular itch, DE symptoms, descriptors of neuropathic-like ocular pain (NOP), and evoked pain sensitivity testing on the forehead and forearm, followed by a comprehensive ocular surface examination including corneal mechanical sensitivity testing. Analyses were performed to examine for differences between those with and without subjective complaints of ocular itch. Results: The mean age was 62 years with 92% being male. Symptoms of DE and NOP were more frequent in patients with moderate-severe ocular itch compared to those with no or mild ocular itch symptoms. With the exception of ocular surface inflammation (abnormal matrix metalloproteinase 9 testing) which was less common in those with moderate-severe ocular itch symptoms, DE signs were not related to ocular itch. Individuals with moderate-severe ocular itch also demonstrated greater sensitivity to evoked pain on the forearm and had higher non-ocular pain, depression, and post-traumatic stress disorders scores, compared to those with no or mild itch symptoms. Conclusions: Subjects with moderate-severe ocular itch symptoms have more severe symptoms of DE, NOP, non-ocular pain and demonstrate abnormal somatosensory testing in the form of increased sensitivity to evoked pain at a site remote from the eye, consistent with generalized hypersensitivity. PMID:29391860

  17. Immediate Wheal Reactivity to Autologous Sweat in Atopic Dermatitis Is Associated with Clinical Severity, Serum Total and Specific IgE and Sweat Tryptase Activity.

    PubMed

    Ilves, Tiina; Virolainen, Anu; Harvima, Ilkka Tapani

    2016-01-01

    Sweating can worsen atopic dermatitis (AD). The purpose of this work was to study the associations between reactivity to autologous sweat and the clinical severity of AD, as well as to investigate possible wheal-inducing factors in sweat. Intracutaneous skin tests with autologous sweat were performed on 50 AD patients and 24 control subjects. In skin biopsies, tryptase and PAR-2 were stained enzyme-histochemically and immunohistochemically. The associations between skin test reactivity and sweat histamine concentration, tryptase or chymase activity levels, tryptase or PAR-2 expression and AD clinical severity or IgE levels were investigated. The wheal reactions in the intracutaneous tests with autologous sweat were positive, weakly positive and negative in 38, 34 and 28% of the AD patients, respectively, and in 4, 46 and 50% of the healthy controls, respectively (p = 0.008). In AD, the wheal reaction was associated significantly with clinical severity, serum total and specific IgE levels and sweat tryptase activity, but not with sweat histamine and chymase. In nonlesional AD skin, the percentage of PAR-2+ mast cells (MCs) or the number of tryptase+ MCs did not differ significantly between the intracutaneous test reactivity groups. Reactivity to autologous sweat correlates with the clinical severity of AD, and tryptase may be one of the factors involved in the sweat-induced wheal. © 2016 S. Karger AG, Basel.

  18. Diagnosis from functional perspectives: usefulness of a manual tactile test for predicting precision pinch performance and disease severity in subjects with carpal tunnel syndrome.

    PubMed

    Hsu, Hsiu-Yun; Kuo, Yao-Lung; Jou, I-Ming; Su, Fong-Chin; Chiu, Haw-Yen; Kuo, Li-Chieh

    2014-04-01

    To investigate how the severity levels revealed in a nerve conduction study (NCS) affect the results of the Manual Tactile Test (MTT) for patients with carpal tunnel syndrome (CTS), and to examine the relationships between the results of the MTT and precision pinch performance. Case-control study. Hospital and local community. Patients with CTS (N=70) with 119 affected hands were studied. A control group matched by age, sex, and hand dominance was also recruited. Not applicable. CTS severity was determined based on NCS findings. The MTT, traditional sensory tests, and precision pinch performance were used to examine the functional sensory status of the hand from different perspectives. The patients with CTS exhibited deterioration in all of the sensibility tests (P<.001). The results showed that the MTT could classify subgroups of severity in CTS (P<.001). A moderate correlation was found between the results of the MTT and precision pinch performance (r=.526-.585, P<.001). Multiple linear regression analysis showed that the MTT results were useful indicators for predicting precision pinch performance and differentiating severity in subjects with CTS (r²=.376 and .323, respectively). The findings indicate that the MTT could be a valid and useful assessment for hand sensibility and prehensile pinch performance in patients with CTS. Copyright © 2014 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  19. A Computer Adaptive Testing Version of the Addiction Severity Index-Multimedia Version (ASI-MV): The Addiction Severity CAT

    PubMed Central

    Butler, Stephen F.; Black, Ryan A.; McCaffrey, Stacey A.; Ainscough, Jessica; Doucette, Ann M.

    2017-01-01

    The purpose of this study was to develop and validate a computer adaptive testing (CAT) version of the Addiction Severity Index-Multimedia Version (ASI-MV®), the Addiction Severity CAT. This goal was accomplished in four steps. First, new candidate items for Addiction Severity CAT domains were evaluated after brainstorming sessions with experts in substance abuse treatment. Next, this new item bank was psychometrically evaluated on a large non-clinical (n = 4419) and substance abuse treatment sample (n = 845). Based on these results, final items were selected and calibrated for the creation of the Addiction Severity CAT algorithms. Once the algorithms were developed for the entire assessment, a fully functioning prototype of an Addiction Severity CAT was created. CAT simulations were conducted and optimal termination criteria were selected for the Addiction Severity CAT algorithms. Finally, construct validity of the CAT algorithms was evaluated by examining convergent/discriminant validity and sensitivity to change. The Addiction Severity CAT was determined to be valid, sensitive to change, and reliable. Further, the Addiction Severity CAT's time of administration was found to be significantly less than the average time of administration for the ASI-MV composite scores. This study represents the initial validation of an IRT-based Addiction Severity CAT, and further exploration of the Addiction Severity CAT is needed. PMID:28230387

  20. A computer adaptive testing version of the Addiction Severity Index-Multimedia Version (ASI-MV): The Addiction Severity CAT.

    PubMed

    Butler, Stephen F; Black, Ryan A; McCaffrey, Stacey A; Ainscough, Jessica; Doucette, Ann M

    2017-05-01

    The purpose of this study was to develop and validate a computer adaptive testing (CAT) version of the Addiction Severity Index-Multimedia Version (ASI-MV), the Addiction Severity CAT. This goal was accomplished in 4 steps. First, new candidate items for Addiction Severity CAT domains were evaluated after brainstorming sessions with experts in substance abuse treatment. Next, this new item bank was psychometrically evaluated on a large nonclinical (n = 4,419) and substance abuse treatment (n = 845) sample. Based on these results, final items were selected and calibrated for the creation of the Addiction Severity CAT algorithms. Once the algorithms were developed for the entire assessment, a fully functioning prototype of an Addiction Severity CAT was created. CAT simulations were conducted, and optimal termination criteria were selected for the Addiction Severity CAT algorithms. Finally, construct validity of the CAT algorithms was evaluated by examining convergent and discriminant validity and sensitivity to change. The Addiction Severity CAT was determined to be valid, sensitive to change, and reliable. Further, the Addiction Severity CAT's time of completion was found to be significantly less than the average time of completion for the ASI-MV composite scores. This study represents the initial validation of an Addiction Severity CAT based on item response theory, and further exploration of the Addiction Severity CAT is needed. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
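
    The two records above describe a CAT built on item response theory but do not give the algorithm itself. As a generic illustration only (not the ASI-MV's actual item bank, model, or selection rule), here is a minimal sketch of maximum-information item selection under a two-parameter logistic (2PL) model, one common way a CAT chooses its next item.

    ```python
    import math

    def p_endorse_2pl(theta, a, b):
        """2PL probability of endorsing an item with discrimination a, difficulty b."""
        return 1.0 / (1.0 + math.exp(-a * (theta - b)))

    def item_information(theta, a, b):
        """Fisher information of a 2PL item at trait level theta."""
        p = p_endorse_2pl(theta, a, b)
        return a * a * p * (1.0 - p)

    def select_next_item(theta, item_bank, administered):
        """Pick the unadministered item with maximum information at the
        current trait estimate (a common CAT item-selection rule)."""
        candidates = [i for i in range(len(item_bank)) if i not in administered]
        return max(candidates, key=lambda i: item_information(theta, *item_bank[i]))

    # Hypothetical item bank of (a, b) parameters and a current estimate theta = 0.4
    bank = [(1.2, -1.0), (0.8, 0.0), (1.5, 0.5), (1.0, 1.5)]
    print(select_next_item(0.4, bank, administered={0}))
    ```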

  1. Cognitive Deficits Underlying Error Behavior on a Naturalistic Task after Severe Traumatic Brain Injury

    PubMed Central

    Hendry, Kathryn; Ownsworth, Tamara; Beadle, Elizabeth; Chevignard, Mathilde P.; Fleming, Jennifer; Griffin, Janelle; Shum, David H. K.

    2016-01-01

    People with severe traumatic brain injury (TBI) often make errors on everyday tasks that compromise their safety and independence. Such errors potentially arise from the breakdown or failure of multiple cognitive processes. This study aimed to investigate cognitive deficits underlying error behavior on a home-based version of the Cooking Task (HBCT) following TBI. Participants included 45 adults (9 females, 36 males) with severe TBI aged 18–64 years (M = 37.91, SD = 13.43). Participants were administered the HBCT in their home kitchens, with audiovisual recordings taken to enable scoring of total errors and error subtypes (Omissions, Additions, Estimations, Substitutions, Commentary/Questions, Dangerous Behavior, Goal Achievement). Participants also completed a battery of neuropsychological tests, including the Trail Making Test, Hopkins Verbal Learning Test-Revised, Digit Span, Zoo Map test, Modified Stroop Test, and Hayling Sentence Completion Test. After controlling for cooking experience, greater Omissions and Estimation errors, lack of goal achievement, and longer completion time were significantly associated with poorer attention, memory, and executive functioning. These findings indicate that errors on naturalistic tasks arise from deficits in multiple cognitive domains. Assessment of error behavior in a real life setting provides insight into individuals' functional abilities which can guide rehabilitation planning and lifestyle support. PMID:27790099

  2. Testing and Analysis of a Composite Non-Cylindrical Aircraft Fuselage Structure . Part II; Severe Damage

    NASA Technical Reports Server (NTRS)

    Przekop, Adam; Jegley, Dawn C.; Lovejoy, Andrew E.; Rouse, Marshall; Wu, Hsi-Yung T.

    2016-01-01

    The Environmentally Responsible Aviation Project aimed to develop aircraft technologies enabling significant fuel burn and community noise reductions. Small incremental changes to the conventional metallic alloy-based 'tube and wing' configuration were not sufficient to achieve the desired metrics. One airframe concept identified by the project as having the potential to dramatically improve aircraft performance was a composite-based hybrid wing body configuration. Such a concept, however, presented inherent challenges stemming from, among other factors, the necessity to transfer wing loads through the entire center fuselage section which accommodates a pressurized cabin confined by flat or nearly flat panels. This paper discusses a finite element analysis and the testing of a large-scale hybrid wing body center section structure developed and constructed to demonstrate that the Pultruded Rod Stitched Efficient Unitized Structure concept can meet these challenging demands of the next generation airframes. Part II of the paper considers the final test to failure of the test article in the presence of an intentionally inflicted severe discrete source damage under the wing up-bending loading condition. Finite element analysis results are compared with measurements acquired during the test and demonstrate that the hybrid wing body test article was able to redistribute and support the required design loads in a severely damaged condition.

  3. Acute toxicity of zinc to several aquatic species native to the Rocky Mountains.

    PubMed

    Brinkman, Stephen F; Johnston, Walter D

    2012-02-01

    National water-quality criteria for the protection of aquatic life are based on toxicity tests, often using organisms that are easy to culture in the laboratory. Species native to the Rocky Mountains are poorly represented in data sets used to derive national water-quality criteria. To provide additional data on the toxicity of zinc, several laboratory acute-toxicity tests were conducted with a diverse assortment of fish, benthic invertebrates, and an amphibian native to the Rocky Mountains. Tests with fish were conducted using three subspecies of cutthroat trout (Colorado River cutthroat trout Oncorhynchus clarkii pleuriticus, greenback cutthroat trout O. clarkii stomias, and Rio Grande cutthroat trout O. clarkii virginalis), mountain whitefish (Prosopium williamsoni), mottled sculpin (Cottus bairdi), longnose dace (Rhinichthys cataractae), and flathead chub (Platygobio gracilis). Aquatic invertebrate tests were conducted with mayflies (Baetis tricaudatus, Drunella doddsi, Cinygmula sp. and Ephemerella sp.), a stonefly (Chloroperlidae), and a caddis fly (Lepidostoma sp.). The amphibian test was conducted with tadpoles of the boreal toad (Bufo boreas). Median lethal concentrations (LC50s) ranged more than three orders of magnitude, from 166 μg/L for Rio Grande cutthroat trout to >67,000 μg/L for several benthic invertebrates. Of the organisms tested, vertebrates were the most sensitive, and benthic invertebrates were the most tolerant.
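
    The LC50 values quoted above come from standard dose-response analysis. As a generic illustration only (with made-up data, and not the authors' statistical method), the sketch below fits a two-parameter log-logistic mortality curve and reads off the concentration giving 50% mortality.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def log_logistic(conc, lc50, slope):
        """Two-parameter log-logistic dose-response: expected mortality fraction."""
        return 1.0 / (1.0 + (lc50 / conc) ** slope)

    # Hypothetical acute test: zinc concentration (ug/L) vs observed mortality fraction
    conc = np.array([50.0, 100.0, 200.0, 400.0, 800.0])
    mortality = np.array([0.0, 0.1, 0.55, 0.9, 1.0])

    params, _ = curve_fit(log_logistic, conc, mortality,
                          p0=[200.0, 2.0], bounds=([1.0, 0.1], [1e6, 20.0]))
    print(f"estimated LC50 ~ {params[0]:.0f} ug/L, slope ~ {params[1]:.1f}")
    ```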

  4. Experimental investigation and damage assessment in a post tensioned concrete beam

    NASA Astrophysics Data System (ADS)

    Limongelli, Maria; Siegert, Dominique; Merliot, Erick; Waeytens, Julien; Bourquin, Frederic; Vidal, Roland; Le Corvec, Veronique; Guegen, Ivan; Cottineau, Louis-Marie

    2017-04-01

    This paper presents the results of an experimental campaign carried out on a prestressed concrete beam within the project SIPRIS (Systèmes Intelligents pour la Prévention des Risques Structurels), which aimed to develop intelligent systems for the prevention of structural risk related to the aging of large infrastructures. The specimen was tested in several configurations intended to reproduce different phases of the 'life' of the beam: in the original undamaged state, under an increasing loss of tension in the cables, during and after cracking induced by a point load, after a strengthening intervention, and after new cracking of the 'repaired' beam. Damage was introduced in a controlled way by means of three-point static bending tests. The transverse point loads were applied at several different sections along the beam axis. Before and after each static test, the dynamic response of the beam was measured under sine-sweep and impact tests by an extensive set of accelerometers deployed along the beam axis. The availability of both static and dynamic tests makes it possible to investigate and compare their effectiveness in detecting damage in the tensioned beam and in reliably identifying the evolution of damage. The paper discusses the test program and some results relevant to the dynamic characterization of the beam in the different phases.

  5. Expanded experience with an intradermal skin test to predict for the presence or absence of carboplatin hypersensitivity.

    PubMed

    Markman, Maurie; Zanotti, Kristine; Peterson, Gertrude; Kulp, Barbara; Webster, Kenneth; Belinson, Jerome

    2003-12-15

    Carboplatin-associated hypersensitivity is increasingly recognized as a potentially serious toxicity when this agent is administered for more than six total cycles. Our group has used a predictive skin test in women with gynecologic cancers who have previously received more than six cumulative cycles of platinum-based chemotherapy. Thirty minutes before all subsequent carboplatin courses, a 0.02-mL aliquot from the solution prepared for treatment is injected intradermally. A positive test is considered to be a ≥5-mm wheal with a surrounding flare. From October 1998 through March 2003, 126 patients received a total of 717 carboplatin skin tests (median per patient, four tests; range, one to 54 tests). Of the 668 negative tests (93% of the total performed), 10 were associated with evidence of carboplatin hypersensitivity (1.5% false-negative rate; 95% CI, 0.6% to 2.4%), none of which were severe (eg, dyspnea, hypotension, cardiac/respiratory compromise). Of the 41 positive tests, the decision was made not to deliver the drug to 32 patients, although seven women ultimately underwent a future attempt at re-treatment with a platinum agent using a desensitization program. In seven episodes where patients received the carboplatin despite the finding of a positive test, six were associated with the development of symptoms of anaphylaxis (none severe). A negative carboplatin skin test seems to predict, with reasonable reliability, the absence of a severe hypersensitivity reaction with the subsequent drug infusion. The implications of a positive test remain less certain, but limited experience with continued treatment suggests this approach must be undertaken with considerable caution.
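
    The 1.5% false-negative rate and its interval above follow from 10 events in 668 negative tests. As a check, a minimal sketch of a normal-approximation (Wald) confidence interval for that proportion reproduces roughly the 0.6%–2.4% range quoted in the abstract; the authors' exact interval method is not stated, so this is only one plausible calculation.

    ```python
    import math

    def wald_ci(events, n, z=1.96):
        """Normal-approximation (Wald) CI for a binomial proportion."""
        p = events / n
        half = z * math.sqrt(p * (1 - p) / n)
        return p, (p - half, p + half)

    p, (lo, hi) = wald_ci(10, 668)
    print(f"{p:.1%} ({lo:.1%} to {hi:.1%})")  # about 1.5% (0.6% to 2.4%)
    ```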

  6. Ability Tests? A Shot in the Dark

    ERIC Educational Resources Information Center

    Petrovsky, Arthur V.

    1973-01-01

    Several areas of controversy between Soviet psychologists and their Western colleagues concerning the usefulness of tests for measurement of mental ability are noted in this article. The author outlines test procedures suggested by the Russian psychologist Lev Vygotsky. (SM)

  7. Neonatal Screening Tests.

    ERIC Educational Resources Information Center

    Vigue, Charles L.

    1986-01-01

    Describes several laboratory experiments that are adaptations of clinical tests for certain genetic diseases in babies. Information and procedures are provided for tests for phenylketonuria (PKU), galactosemia, tyrosinemia, cystinuria, and mucopolysaccharidosis. Discusses the effects of each disease on the infants' development. (TW)

  8. Results of 1938 IUFRO Scotch pine provenance test in New York

    Treesearch

    Ernst J. Schreiner; E. W. Littlefield; E. J. Eliason

    1961-01-01

    The International Union of Forest Research Organizations (IUFRO) has sponsored several international provenance tests. This is a report on the practical aspects of an IUFRO provenance test with Scotch pine in New York.

  9. Factor Analysis of Various Anaerobic Power Tests.

    ERIC Educational Resources Information Center

    Manning, James M.; And Others

    A study investigated the relationship of selected anthropometric variables and numerous anaerobic power tests to measures obtained on an isokinetic dynamometer. Thirty-one male college students performed several anaerobic power tests, including: the vertical jump using the Lewis formula; the Margaria-Kalamen stair climb test; the Wingate…

  10. Perspective on Intelligence Testing.

    ERIC Educational Resources Information Center

    Lennon, Roger T.

    1978-01-01

    We should seek an updated perspective on intelligence testing because it is useful to reevaluate any practice that has long been institutionalized, especially one that is subject to severe criticism. Cultural bias is the most prominent criticism. Testing proponents contend that intelligence tests reflect skills and knowledge emphasized in school…

  11. Thermal Response of UHMWPE Materials in a Flash Flame Test Environment

    DTIC Science & Technology

    2014-11-13

    Evaluation of Flame Resistant Clothing for Protection Against Fire Simulations Using an Instrumented Manikin. Several UHMWPE fabrics were tested underneath... Descriptors: protective clothing; cotton; flash flames; undergarments; test and evaluation; fabrics; flame testing; fire protection; fire resistant textiles; UHMWPE (ultra high molecular weight polyethylene).

  12. Model-Based Development of Automotive Electronic Climate Control Software

    NASA Astrophysics Data System (ADS)

    Kakade, Rupesh; Murugesan, Mohan; Perugu, Bhupal; Nair, Mohanan

    With the increasing complexity of software in today's products, writing and maintaining thousands of lines of code is a tedious task, and an alternative methodology must be employed. Model-based development is one candidate that offers several benefits and allows engineers to focus on their domain of expertise rather than on writing large amounts of code. In this paper, we discuss the application of model-based development to the electronic climate control software of vehicles. A back-to-back testing approach is presented that ensures a smooth, error-free transition from legacy designs to model-based development. The Simulink report generator, used to create design documents from the models, is presented along with its use to run the simulation model and capture the results in the test report. Test automation using a model-based development tool that supports a single set of test cases across several testing levels, with a test procedure independent of the software and hardware platform, is also presented.
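
    As an illustration of the back-to-back testing idea described above (not the authors' actual toolchain), the sketch below runs the same stimulus through a reference model and a production implementation and flags any sample where the outputs diverge beyond a tolerance. Both controller functions are hypothetical stand-ins for a climate-control component.

    ```python
    def reference_blower_level(cabin_temp_c, setpoint_c):
        """Hypothetical reference model: proportional blower command, clipped to 0..100."""
        return max(0.0, min(100.0, 10.0 * (setpoint_c - cabin_temp_c) + 30.0))

    def production_blower_level(cabin_temp_c, setpoint_c):
        """Hypothetical generated/production code under test (same intended behavior)."""
        return max(0.0, min(100.0, 10.0 * (setpoint_c - cabin_temp_c) + 30.0))

    def back_to_back_test(stimuli, tolerance=1e-6):
        """Drive both implementations with identical inputs and report mismatches."""
        failures = []
        for temp, setpoint in stimuli:
            ref = reference_blower_level(temp, setpoint)
            out = production_blower_level(temp, setpoint)
            if abs(ref - out) > tolerance:
                failures.append((temp, setpoint, ref, out))
        return failures

    stimuli = [(t, 22.0) for t in range(10, 35)]   # sweep cabin temperature
    print("mismatches:", back_to_back_test(stimuli))
    ```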

  13. The Design, Fabrication, and Testing of Composite Heat Exchange Coupons

    NASA Technical Reports Server (NTRS)

    Quade, Derek J.; Meador, Michael A.; Shin, Euy-Sik; Johnston, James C.; Kuczmarski, Maria A.

    2011-01-01

    Several heat exchanger (HX) test panels were designed, fabricated and tested at the NASA Glenn Research Center to explore the fabrication and performance of several designs for composite heat exchangers. The development of these lightweight, high-efficiency air-liquid test panels was attempted using polymer composites and carbon foam materials. The fundamental goal of this effort was to demonstrate the feasibility of the composite HX for various space exploration and thermal management applications, including Orion CEV and Altair. The specific objectives of this work were to select optimum materials and designs and to optimize fabrication procedures. After fabrication, the individual design concept prototypes were tested to determine their thermal performance and to guide the future development of full-size engineering development units (EDU). The overall test results suggested that the panel bonded with pre-cured composite laminates to KFOAM Grade L1 scored above the other designs in terms of ease of manufacture and performance.

  14. Comprehensive Carrier Screening and Molecular Diagnostic Testing for Recessive Childhood Diseases

    PubMed Central

    Kingsmore, Stephen

    2012-01-01

    Of 7,028 disorders with suspected Mendelian inheritance, 1,139 are recessive and have an established molecular basis. Although individually uncommon, Mendelian diseases collectively account for ~20% of infant mortality and ~18% of pediatric hospitalizations. Molecular diagnostic testing is currently available for only ~300 recessive disorders. Preconception screening, together with genetic counseling of carriers, has resulted in remarkable declines in the incidence of several severe recessive diseases, including Tay-Sachs disease and cystic fibrosis. However, extension of preconception screening and molecular diagnostic testing to most recessive disease genes has hitherto been impractical. Recently, we reported a preconception carrier screen/molecular diagnostic test for 448 recessive childhood diseases. The current status of this test is reviewed here. At present, this article reports the analytical validity of the comprehensive carrier test. As the clinical validity and clinical utility in the contexts described are ascertained, this article will be updated. PMID:22872815

  15. Berthing mechanism final test report and program assessment

    NASA Technical Reports Server (NTRS)

    1988-01-01

    The purpose is to document the testing performed on both hardware and software developed under the Space Station Berthing Mechanisms Program. Testing of the mechanism occurred at three locations. Several system components, e.g., actuators and computer systems, were functionally tested before assembly. A series of post assembly tests were performed. The post assembly tests, as well as the dynamic testing of the mechanism, are presented.

  16. Numerical considerations in the development and implementation of constitutive models

    NASA Technical Reports Server (NTRS)

    Haisler, W. E.; Imbrie, P. K.

    1985-01-01

    Several unified constitutive models were tested in uniaxial form by specifying input strain histories and comparing output stress histories. The purpose of the tests was to evaluate several time integration methods with regard to accuracy, stability, and computational economy. The sensitivity of the models to slight changes in input constants was also investigated. Results are presented for In100 at 1350 F and Hastelloy-X at 1800 F.
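
    The abstract above concerns unified viscoplastic models, which are beyond a short sketch. Purely as an illustration of the exercise (prescribing a strain history and comparing time-integration schemes), the minimal example below drives a much simpler one-dimensional Maxwell (spring-dashpot) model at constant strain rate and compares explicit versus implicit Euler stress updates against the exact solution; all material constants are hypothetical.

    ```python
    import math

    E, eta = 200e3, 5e6          # hypothetical modulus (MPa) and viscosity (MPa*s)
    dt, t_end = 0.1, 60.0        # time step (s) and duration
    strain_rate = 1e-4           # prescribed constant strain rate (1/s)

    def run(implicit):
        """Integrate the 1-D Maxwell model sigma_dot = E*eps_dot - (E/eta)*sigma."""
        sigma, t = 0.0, 0.0
        while t < t_end:
            d_eps = strain_rate * dt
            if implicit:   # backward Euler: unconditionally stable
                sigma = (sigma + E * d_eps) / (1.0 + dt * E / eta)
            else:          # forward Euler: stable only for sufficiently small dt
                sigma = sigma + E * d_eps - dt * (E / eta) * sigma
            t += dt
        return sigma

    # Exact stress at t_end for constant strain rate starting from zero stress
    exact = eta * strain_rate * (1.0 - math.exp(-E * t_end / eta))
    print(run(implicit=False), run(implicit=True), exact)
    ```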

  17. Proteomic Approach for Diagnostic Applications in Head and Neck Cancer — EDRN Public Portal

    Cancer.gov

    To evaluate the test characteristics of a panel of biomarkers for identifying patients with early stage head and neck squamous cell carcinoma (HNSCC). The primary endpoints are sensitivity, specificity and accuracy of the marker panel. This study of the test characteristics of a modeling strategy for diagnosing HNSCC uses a case-control design, with several types of cases and several types of controls.

  18. Measuring Electrostatic Discharge

    NASA Technical Reports Server (NTRS)

    Smith, William C.

    1987-01-01

    Apparatus measures electrostatic-discharge properties of several materials at once. Allows samples to be charged either by friction or by exposure to a corona. By testing several samples simultaneously, the apparatus eliminates errors introduced by variations among test conditions. Samples are spaced so that they pass at intervals under either of two retractable arms and are 2 inches wide along the circular path. Arm tips and voltmeter probe are 6 inches from the turntable center. Servocontrolled turntable speed is constant within 0.1 percent.

  19. An Overview of High Temperature Seal Development and Testing Capabilities at the NASA Glenn Research Center

    NASA Technical Reports Server (NTRS)

    Demange, Jeffrey J.; Taylor, Shawn C.; Dunlap, Patrick H.; Steinetz, Bruce M.; Finkbeiner, Joshua R.; Proctor, Margaret P.

    2014-01-01

    The NASA Glenn Research Center (GRC), partnering with the University of Toledo, has a long history of developing and testing seal technologies for high-temperature applications. The GRC Seals Team has conducted research and development on high-temperature seal technologies for applications including advanced propulsion systems, thermal protection systems (airframe and control surface thermal seals), high-temperature preloading technologies, and other extreme-environment seal applications. The team has supported several high-profile projects over the past 30 years and has partnered with numerous organizations, including other government entities, academic institutions, and private organizations. Some of these projects have included the National Aero-Space Plane (NASP), the Space Shuttle Space Transportation System (STS), the Multi-Purpose Crew Vehicle (MPCV), and the Dream Chaser Space Transportation System, as well as several high-speed vehicle programs for other government organizations. As part of the support for these programs, NASA GRC has developed unique seal-specific test facilities that permit evaluations and screening exercises in relevant environments. The team has also embarked on developing high-temperature preloaders to help maintain seal functionality in extreme environments. This paper highlights several propulsion-related projects that the NASA GRC Seals Team has supported over the past several years and provides an overview of existing testing capabilities.

  20. A 12-Day Course of Imiquimod 5% for the Treatment of Actinic Keratosis: Effectiveness and Local Reactions.

    PubMed

    Serra-Guillén, C; Nagore, E; Llombart, B; Sanmartín, O; Requena, C; Calomarde, L; Guillén, C

    2018-04-01

    Imiquimod is an excellent option for patients with actinic keratosis, although its use may be limited by the long course of treatment required (4 weeks) and the likelihood of local skin reactions. The objectives of the present study were to demonstrate the effectiveness of a 12-day course of imiquimod 5% for the treatment of actinic keratosis and to examine the association between treatment effectiveness and severity of local reactions. We included patients with at least 8 actinic keratoses treated with imiquimod 5% cream for 12 consecutive days. Local reactions were classified as mild, moderate, or severe. The statistical analysis of the association between local reactions and clinical response was based on the Pearson χ² test and the Spearman rank correlation test. Sixty-five patients completed the study. Complete response was recorded in 52.3% and partial response in 75.4%. We found a statistically significant association between severity of the local reaction and response to treatment in both the Pearson χ² test and the Spearman rank correlation test. A 12-day course of imiquimod 5% proved effective for the treatment of actinic keratosis. Severity of local reactions during treatment was correlated with clinical response. Copyright © 2017 AEDV. Published by Elsevier España, S.L.U. All rights reserved.
